Sample records for formal extraction system

  1. Ion Channel ElectroPhysiology Ontology (ICEPO) - a case study of text mining assisted ontology development.

    PubMed

    Elayavilli, Ravikumar Komandur; Liu, Hongfang

    2016-01-01

    Computational modeling of biological cascades is of great interest to quantitative biologists. Biomedical text has been a rich source of quantitative information. Gathering quantitative parameters and values from biomedical text is one significant challenge in the early steps of computational modeling, as it involves huge manual effort. While automatically extracting such quantitative information from biomedical text may offer some relief, the lack of an ontological representation for a subdomain is an impediment to normalizing textual extractions to a standard representation. This may render textual extractions less meaningful to domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. We further translated the quantitative assertions extracted through text mining into a formal representation that may help in constructing an ontology of ion channel events. We have developed the Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies, such as the Cell Physiology Ontology (CPO) and the Cardiac Electro Physiology Ontology (CPEO), with the knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting quantitative data assertions on an independently annotated blind data set. We further made an initial attempt at formalizing the quantitative data assertions extracted from the biomedical text into a representation that offers the potential to integrate text mining into ontology development workflows, a novel aspect of this study. This work is a case study in which we created a platform that provides formal interaction between ontology development and text mining. We have achieved partial success in extracting quantitative assertions from biomedical text and formalizing them in an ontological framework. The ICEPO ontology is available for download at http://openbionlp.org/mutd/supplementarydata/ICEPO/ICEPO.owl.
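
    A minimal sketch of what a single rule in this kind of rule-based quantitative extractor might look like, assuming a simple regular-expression formulation; the property names, units, and example sentence are illustrative and are not taken from the ICEPO rule set:

```python
import re

# Illustrative rule only; the actual system uses a richer rule set and
# normalizes its extractions against the ICEPO ontology.
QUANT_RULE = re.compile(
    r"(?P<property>half-activation voltage|time constant|single-channel conductance)"
    r"\s+(?:of|was|is)\s+"
    r"(?P<value>-?\d+(?:\.\d+)?)\s*(?P<unit>mV|ms|pS)",
    re.IGNORECASE,
)

def extract_quantitative_assertions(sentence):
    """Return (property, value, unit) triples found in one sentence."""
    return [(m.group("property"), float(m.group("value")), m.group("unit"))
            for m in QUANT_RULE.finditer(sentence)]

print(extract_quantitative_assertions(
    "The mutant channel showed a half-activation voltage of -35.2 mV."))
# [('half-activation voltage', -35.2, 'mV')]
```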

  2. Design criteria for extraction with chemical reaction and liquid membrane permeation

    NASA Technical Reports Server (NTRS)

    Bart, H. J.; Bauer, A.; Lorbach, D.; Marr, R.

    1988-01-01

    The design criteria for heterogeneous chemical reactions in liquid/liquid systems formally correspond to those of classical physical extraction. More complex models are presented which describe the material exchange at the individual droplets in an extraction with chemical reaction and in liquid membrane permeation.

  3. On the Formal Verification of Conflict Detection Algorithms

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Butler, Ricky W.; Carreno, Victor A.; Dowek, Gilles

    2001-01-01

    Safety assessment of new air traffic management systems is a main issue for civil aviation authorities. Standard techniques such as testing and simulation have serious limitations in new systems that are significantly more autonomous than the older ones. In this paper, we present an innovative approach, based on formal verification, for establishing the correctness of conflict detection systems. Fundamental to our approach is the concept of trajectory, which is a continuous path in the x-y plane constrained by physical laws and operational requirements. From the model of trajectories, we extract, and formally prove, high-level properties that can serve as a framework to analyze conflict scenarios. We use the Airborne Information for Lateral Spacing (AILS) alerting algorithm as a case study of our approach.
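
    The trajectory-based conflict check described above can be illustrated with a small geometric sketch, assuming straight-line, constant-velocity 2D trajectories and a purely horizontal separation minimum; this illustrates the trajectory concept only and is not the AILS algorithm:

```python
import numpy as np

def minimum_separation(p1, v1, p2, v2, horizon):
    """Minimum horizontal distance between two constant-velocity trajectories
    p_i(t) = p_i + v_i * t over the look-ahead window t in [0, horizon]."""
    dp = np.asarray(p2, float) - np.asarray(p1, float)
    dv = np.asarray(v2, float) - np.asarray(v1, float)
    # Time of closest approach, clamped to the look-ahead window.
    t_star = 0.0 if np.dot(dv, dv) == 0 else -np.dot(dp, dv) / np.dot(dv, dv)
    t_star = min(max(t_star, 0.0), horizon)
    return float(np.linalg.norm(dp + dv * t_star))

# Hypothetical numbers: positions in nautical miles, speeds in nmi/s,
# conflict declared if separation drops below 5 nmi within 300 s.
conflict = minimum_separation((0, 0), (0.12, 0), (20, 3), (-0.12, 0), 300) < 5.0
print(conflict)  # True
```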

  4. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  5. Formal Foundations for Hierarchical Safety Cases

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh; Whiteside, Iain

    2015-01-01

    Safety cases are increasingly being required in many safety-critical domains to assure, using structured argumentation and evidence, that a system is acceptably safe. However, comprehensive system-wide safety arguments present appreciable challenges to develop, understand, evaluate, and manage, partly due to the volume of information that they aggregate, such as the results of hazard analysis, requirements analysis, testing, formal verification, and other engineering activities. Previously, we have proposed hierarchical safety cases, hicases, to aid the comprehension of safety case argument structures. In this paper, we build on a formal notion of safety case to formalise the use of hierarchy as a structuring technique, and show that hicases satisfy several desirable properties. Our aim is to provide a formal, theoretical foundation for safety cases. In particular, we believe that tools for high assurance systems should be granted similar assurance to the systems to which they are applied. To this end, we formally specify and prove the correctness of key operations for constructing and managing hicases, which gives the specification for implementing hicases in AdvoCATE, our toolset for safety case automation. We motivate and explain the theory with the help of a simple running example, extracted from a real safety case and developed using AdvoCATE.

  6. Toward a Formal Evaluation of Refactorings

    NASA Technical Reports Server (NTRS)

    Paul, John; Kuzmina, Nadya; Gamboa, Ruben; Caldwell, James

    2008-01-01

    Refactoring is a software development strategy that characteristically alters the syntactic structure of a program without changing its external behavior [2]. In this talk we present a methodology for extracting formal models from programs in order to evaluate how incremental refactorings affect the verifiability of their structural specifications. We envision that this same technique may be applicable to other types of properties such as those that concern the design and maintenance of safety-critical systems.

  7. The VATES-Diamond as a Verifier's Best Friend

    NASA Astrophysics Data System (ADS)

    Glesner, Sabine; Bartels, Björn; Göthel, Thomas; Kleine, Moritz

    Within a model-based software engineering process it needs to be ensured that properties of abstract specifications are preserved by transformations down to executable code. This is even more important in the area of safety-critical real-time systems where additionally non-functional properties are crucial. In the VATES project, we develop formal methods for the construction and verification of embedded systems. We follow a novel approach that allows us to formally relate abstract process algebraic specifications to their implementation in a compiler intermediate representation. The idea is to extract a low-level process algebraic description from the intermediate code and to formally relate it to previously developed abstract specifications. We apply this approach to a case study from the area of real-time operating systems and show that this approach has the potential to seamlessly integrate modeling, implementation, transformation and verification stages of embedded system development.

  8. KAM (Knowledge Acquisition Module): A tool to simplify the knowledge acquisition process

    NASA Technical Reports Server (NTRS)

    Gettig, Gary A.

    1988-01-01

    Analysts, knowledge engineers and information specialists are faced with increasing volumes of time-sensitive data in text form, either as free text or highly structured text records. Rapid access to the relevant data in these sources is essential. However, due to the volume and organization of the contents, and limitations of human memory and association, frequently: (1) important information is not located in time; (2) reams of irrelevant data are searched; and (3) interesting or critical associations are missed due to physical or temporal gaps involved in working with large files. The Knowledge Acquisition Module (KAM) is a microcomputer-based expert system designed to assist knowledge engineers, analysts, and other specialists in extracting useful knowledge from large volumes of digitized text and text-based files. KAM formulates non-explicit, ambiguous, or vague relations, rules, and facts into a manageable and consistent formal code. A library of system rules or heuristics is maintained to control the extraction of rules, relations, assertions, and other patterns from the text. These heuristics can be added, deleted or customized by the user. The user can further control the extraction process with optional topic specifications. This allows the user to cluster extracts based on specific topics. Because KAM formalizes diverse knowledge, it can be used by a variety of expert systems and automated reasoning applications. KAM can also perform important roles in computer-assisted training and skill development. Current research efforts include the applicability of neural networks to aid in the extraction process and the conversion of these extracts into standard formats.
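
    One of the lexical heuristics such a system might maintain can be sketched as follows; the pattern, the topic filter, and the example sentence are hypothetical illustrations, not KAM's actual rule library:

```python
import re

# Hypothetical heuristic: turn explicit if/then sentences into (condition, action) rules.
IF_THEN = re.compile(
    r"\bif\s+(?P<cond>[^,]+),\s*(?:then\s+)?(?P<action>[^.]+)\.",
    re.IGNORECASE,
)

def extract_rules(text, topic=None):
    """Extract (condition, action) pairs, optionally keeping only those
    that mention a user-supplied topic term (cf. KAM's topic specifications)."""
    rules = [(m.group("cond").strip(), m.group("action").strip())
             for m in IF_THEN.finditer(text)]
    if topic:
        rules = [r for r in rules if topic.lower() in (r[0] + " " + r[1]).lower()]
    return rules

sample = "If the sensor reading exceeds 80 C, then shut down the pump."
print(extract_rules(sample, topic="pump"))
# [('the sensor reading exceeds 80 C', 'shut down the pump')]
```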

  9. Compressive Information Extraction: A Dynamical Systems Approach

    DTIC Science & Technology

    2016-01-24

    sparsely encoded in very large data streams. (a) Target tracking in an urban canyon; (b) and (c) sample frames showing contextually abnormal events: onset...extraction to identify contextually abnormal sequences (see section 2.2.3). Formally, the problem of interest can be stated as establishing whether a noisy...relaxations with optimality guarantees can be obtained using tools from semi-algebraic geometry. 2.2 Application: Detecting Contextually Abnormal Events

  10. Systematic errors in transport calculations of shear viscosity using the Green-Kubo formalism

    NASA Astrophysics Data System (ADS)

    Rose, J. B.; Torres-Rincon, J. M.; Oliinychenko, D.; Schäfer, A.; Petersen, H.

    2018-05-01

    The purpose of this study is to provide a reproducible framework for the use of the Green-Kubo formalism to extract transport coefficients. More specifically, in the case of shear viscosity, we investigate the limitations and technical details of fitting the auto-correlation function to a decaying exponential. This fitting procedure is found to be applicable to systems interacting through both constant and energy-dependent cross-sections, although in the latter case only for sufficiently dilute systems. We find that the optimal fit technique consists in simultaneously fixing the intercept of the correlation function and using a fitting interval constrained by the relative error on the correlation function. The formalism is then applied to the full hadron gas, for which we obtain the shear viscosity to entropy ratio.
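
    A sketch of the fitting strategy described above, assuming a correlator sampled on a uniform time grid and natural units with k_B = 1; the fit window used here (points above a fixed fraction of the intercept) is a crude stand-in for the paper's relative-error criterion:

```python
import numpy as np
from scipy.optimize import curve_fit

def green_kubo_viscosity(t, corr, volume, temperature, cutoff=0.1):
    """Fit C(t) = C(0) * exp(-t/tau) with the intercept fixed to the measured
    C(0), then return eta = V/T * integral(C) = V/T * C(0) * tau."""
    c0 = corr[0]
    window = corr > cutoff * c0          # simplistic stand-in for an error-based window
    decay = lambda time, tau: c0 * np.exp(-time / tau)
    (tau,), _ = curve_fit(decay, t[window], corr[window], p0=[t[len(t) // 4]])
    return volume / temperature * c0 * tau

# Synthetic check: a noiseless exponential correlator is recovered exactly,
# giving eta = 1000/0.15 * 2.0 * 5.0 ~ 6.7e4 (arbitrary units).
t = np.linspace(0.0, 50.0, 500)
print(green_kubo_viscosity(t, 2.0 * np.exp(-t / 5.0), volume=1000.0, temperature=0.15))
```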

  11. Multi-Hadron Observables from Lattice Quantum Chromodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Maxwell

    2014-01-01

    We describe formal work that relates the finite-volume spectrum in a quantum field theory to scattering and decay amplitudes. This is of particular relevance to numerical calculations performed using Lattice Quantum Chromodynamics (LQCD). Correlators calculated using LQCD can only be determined on the Euclidean time axis. For this reason the standard method of determining scattering amplitudes via the Lehmann-Symanzik-Zimmermann reduction formula cannot be employed. By contrast, the finite-volume spectrum is directly accessible in LQCD calculations. Formalism for relating the spectrum to physical scattering observables is thus highly desirable. In this thesis we develop tools for extracting physical information from LQCD for four types of observables. First we analyze systems with multiple, strongly-coupled two-scalar channels. Here we accommodate both identical and nonidentical scalars, and in the latter case allow for degenerate as well as nondegenerate particle masses. Using relativistic field theory, and summing to all orders in perturbation theory, we derive a result relating the finite-volume spectrum to the two-to-two scattering amplitudes of the coupled-channel theory. This generalizes the formalism of Martin Lüscher for the case of single-channel scattering. Second we consider the weak decay of a single particle into multiple, coupled two-scalar channels. We show how the finite-volume matrix element extracted in LQCD is related to matrix elements of asymptotic two-particle states, and thus to decay amplitudes. This generalizes work by Laurent Lellouch and Martin Lüscher. Third we extend the method for extracting matrix elements by considering currents which insert energy, momentum and angular momentum. This allows one to extract transition matrix elements and form factors from LQCD. Finally we look beyond two-particle systems to those with three particles in asymptotic states. Working again to all orders in relativistic field theory, we derive a relation between the spectrum and an infinite-volume three-to-three scattering quantity. This final analysis is the most complicated of the four, because the all-orders summation is more difficult for this system, and also because a number of new technical issues arise in analyzing the contributing diagrams.

  12. Potential-of-mean-force description of ionic interactions and structural hydration in biomolecular systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummer, G.; Garcia, A.E.; Soumpasis, D.M.

    1994-10-01

    To understand the functioning of living organisms on a molecular level, it is crucial to dissect the intricate interplay of the immense number of biological molecules. Most of the biochemical processes in cells occur in a liquid environment formed mainly by water and ions. This solvent environment plays an important role in biological systems. The potential-of-mean-force (PMF) formalism attempts to describe quantitatively the interactions of the solvent with biological macromolecules on the basis of an approximate statistical-mechanical representation. At its current status of development, it deals with ionic effects on the biomolecular structure and with the structural hydration of biomolecules. The underlying idea of the PMF formalism is to identify the dominant sources of interactions and incorporate these interactions into the theoretical formalism using PMFs (or particle correlation functions) extracted from bulk-liquid systems. In the following, the authors shall briefly outline the statistical-mechanical foundation of the PMF formalism and introduce the PMF expansion formalism, which is intimately linked to superposition approximations for higher-order particle correlation functions. The authors shall then sketch applications, which describe the effects of the ionic environment on nucleic-acid structure. Finally, the authors shall present the more recent extension of the PMF idea to describe quantitatively the structural hydration of biomolecules. Results for the interface of ice and water and for the hydration of deoxyribonucleic acid (DNA) will be discussed.

  13. Electrochemical evaluation of manganese reducers - Recovery of Mn from Zn-Mn and Zn-C battery waste

    NASA Astrophysics Data System (ADS)

    Sobianowska-Turek, Agnieszka; Szczepaniak, Włodzimierz; Zabłocka-Malicka, Monika

    2014-12-01

    Extraction of manganese from ores or battery waste involves the use of reductive reagents to transform MnO2 into Mn2+ ions. Many reducers, both organic and inorganic, are described in the literature. A series of 18 reducers is discussed in the paper, classified according to standard redox potential (pE = -log10 a(e-), where pE expresses the formal electron activity and a(e-) is the activity of electrons). Laboratory-scale experiments on manganese extraction from the paramagnetic fraction of Zn-C and Zn-Mn battery waste are described for three reducers of different origin. The best result (75%) was achieved with oxalic acid, the reducer with the lowest redox potential, while urea, with a typical redox potential, appeared inactive. Extraction supported by hydrogen peroxide resulted in a moderate yield (50%). This shows that the formal thermodynamic scale provides only preliminary guidance for selecting possible reducers for manganese extraction.
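
    For reference, the pE scale used to rank the reducers relates to the standard electrode potential through the standard textbook relation below (written for 25 °C; this is general electrochemistry, not a formula quoted from the paper):

```latex
\[
  \mathrm{pE} = -\log_{10} a_{e^-},
  \qquad
  \mathrm{pE}^{\circ} = \frac{F\,E^{\circ}}{RT\,\ln 10} \approx \frac{E^{\circ}}{0.0592\ \mathrm{V}}
\]
```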

  14. A linguistic geometry for 3D strategic planning

    NASA Technical Reports Server (NTRS)

    Stilman, Boris

    1995-01-01

    This paper is a new step in the development and application of the Linguistic Geometry. This formal theory is intended to discover the inner properties of human expert heuristics, which have been successful in a certain class of complex control systems, and apply them to different systems. In this paper we investigate heuristics extracted in the form of hierarchical networks of planning paths of autonomous agents. Employing Linguistic Geometry tools, the dynamic hierarchy of networks is represented as a hierarchy of formal attribute languages. The main ideas of this methodology are shown in this paper on a new pilot example: the solution of an extremely complex 3D optimization problem of strategic planning for the space combat of autonomous vehicles. This example demonstrates deep and highly selective search in comparison with conventional search algorithms.

  15. Creating an ontology driven rules base for an expert system for medical diagnosis.

    PubMed

    Bertaud Gounot, Valérie; Donfack, Valéry; Lasbleiz, Jérémy; Bourde, Annabel; Duvauferrier, Régis

    2011-01-01

    Expert systems of the 1980s failed because of the difficulty of maintaining large rule bases. The current work proposes a method to build and maintain rule bases grounded in ontologies (such as the NCIT). The process described here for an expert system on plasma cell disorders encompasses extraction of a sub-ontology and automatic, comprehensive generation of production rules. The creation of rules is not based directly on classes, but on individuals (instances). Instances can be considered as prototypes of diseases formally defined by "restrictions" in the ontology. Thus, it is possible to use this process to make diagnoses of diseases. The perspectives of this work are also considered: the process described with an ontology formalized in OWL1 could be extended by using an ontology in OWL2, allowing reasoning about numerical data in addition to symbolic data.

  16. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification.

    PubMed

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for modeling and verification of time properties. Combining the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. According to the defined transformation rules, the MARTE model could be transformed into a formal integrated model, and the different parts of the model could also be verified by using existing formal tools. For the real-time specifications of the software system, we also proposed a generating algorithm for temporal logic formulas, which could automatically extract real-time properties from the time-sensitive live sequence chart (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework could be used to create the system model, as well as precisely analyze and verify the real-time reliability of the UAV flight control system.
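
    A bounded-response property of the kind such a generator might emit can be written in MTL-style notation as below; the predicate names and the 20 ms bound are hypothetical and are not taken from the paper:

```latex
% Read: globally, every attitude command is followed by an actuator response within 20 ms.
\[
  \mathbf{G}\left(\mathit{attitude\_cmd} \rightarrow \mathbf{F}_{\le 20\,\mathrm{ms}}\;\mathit{actuator\_response}\right)
\]
```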

  17. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification

    PubMed Central

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for modeling and verification of time properties. Combining the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. According to the defined transformation rules, the MARTE model could be transformed into a formal integrated model, and the different parts of the model could also be verified by using existing formal tools. For the real-time specifications of the software system, we also proposed a generating algorithm for temporal logic formulas, which could automatically extract real-time properties from the time-sensitive live sequence chart (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework could be used to create the system model, as well as precisely analyze and verify the real-time reliability of the UAV flight control system. PMID:27918594

  18. Application of growing nested Petri nets for modeling robotic systems operating under risk

    NASA Astrophysics Data System (ADS)

    Sorokin, E. V.; Senkov, A. V.

    2017-10-01

    The paper studies the peculiarities of modeling robotic systems engaged in mining. Existing modeling mechanisms based on nested Petri nets are considered, and a new formalism of growing Petri nets is presented that allows modeling of robotic systems operating under risk. Modeling is provided both for the regular operation mode and for non-standard modes in which individual elements of the system can perform uncharacteristic functions. An example shows growing Petri nets used to model the extraction of flat coal seams by a robotic system consisting of several autonomous robots of different types.

  19. Cost of Quality Evaluation Methodologies Handbook

    DTIC Science & Technology

    1988-07-28

    policy. 2. Use multiple vendors for major procurements. 3. Establish a formal vendor qualification process. 4. Conduct joint quality planning; agree...and from which extrapolations and interpolations may be extracted for estimating purposes. COST OF QUALITY - The costs of all efforts expended to...PRODUCIBILITY - The relative ease of producing an item or system which is governed by the characteristics and features of a design that enable

  20. Higher-harmonic collective modes in a trapped gas from second-order hydrodynamics

    DOE PAGES

    Lewis, William E.; Romatschke, P.

    2017-02-21

    Utilizing a second-order hydrodynamics formalism, the dispersion relations for the frequencies and damping rates of collective oscillations, as well as the spatial structure of these modes up to the decapole oscillation in both two- and three-dimensional gas geometries, are calculated. In addition to higher-order modes, the formalism also gives rise to purely damped "non-hydrodynamic" modes. We calculate the amplitude of the various modes for both symmetric and asymmetric trap quenches, finding excellent agreement with an exact quantum mechanical calculation. Furthermore, we find that higher-order hydrodynamic modes are more sensitive to the value of shear viscosity, which may be of interest for the precision extraction of transport coefficients in Fermi gas systems.

  1. Higher-harmonic collective modes in a trapped gas from second-order hydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, William E.; Romatschke, P.

    Utilizing a second-order hydrodynamics formalism, the dispersion relations for the frequencies and damping rates of collective oscillations, as well as the spatial structure of these modes up to the decapole oscillation in both two- and three-dimensional gas geometries, are calculated. In addition to higher-order modes, the formalism also gives rise to purely damped "non-hydrodynamic" modes. We calculate the amplitude of the various modes for both symmetric and asymmetric trap quenches, finding excellent agreement with an exact quantum mechanical calculation. Furthermore, we find that higher-order hydrodynamic modes are more sensitive to the value of shear viscosity, which may be of interest for the precision extraction of transport coefficients in Fermi gas systems.

  2. Multi-level and hybrid modelling approaches for systems biology.

    PubMed

    Bardini, R; Politano, G; Benso, A; Di Carlo, S

    2017-01-01

    During the last decades, high-throughput techniques have allowed the extraction of a huge amount of data from biological systems, unveiling more of their underlying complexity. Biological systems encompass a wide range of space and time scales, functioning according to flexible hierarchies of mechanisms that form an intertwined and dynamic interplay of regulations. This becomes particularly evident in processes such as ontogenesis, where regulative assets change according to process context and timing, making structural phenotype and architectural complexities emerge from a single cell through local interactions. The information collected from biological systems is naturally organized according to the functional levels composing the system itself. In systems biology, biological information often comes from overlapping but different scientific domains, each one having its own way of representing the phenomena under study. That is, the different parts of the system to be modelled may be described with different formalisms. For a model to be more accurate and to serve as a good knowledge base, it should comprise different system levels and suitably handle the corresponding formalisms. Models which are both multi-level and hybrid satisfy both of these requirements, making them a very useful tool in computational systems biology. This paper reviews some of the main contributions in this field.

  3. Using the EC decision on case definitions for communicable diseases as a terminology source--lessons learned.

    PubMed

    Balkanyi, Laszlo; Heja, Gergely; Nagy, Attlia

    2014-01-01

    Extracting scientifically accurate terminology from an EU public health regulation is part of the knowledge engineering work at the European Centre for Disease Prevention and Control (ECDC). ECDC operates information systems at the crossroads of many areas - posing a challenge for transparency and consistency. Semantic interoperability is based on the Terminology Server (TS). TS value sets (structured vocabularies) describe shared domains such as "diseases", "organisms", "public health terms", "geo-entities", "organizations", "administrative terms" and others. We extracted information from the relevant EC Implementing Decision on case definitions for reporting communicable diseases, which lists 53 notifiable infectious diseases and contains clinical, diagnostic, laboratory and epidemiological criteria. We performed a consistency check and a simplification/abstraction; we represented laboratory criteria as triplets ('y' procedural result / of 'x' organism-substance / on 'z' specimen) and identified negations. The resulting new case definition value set represents the various formalized criteria, while the existing disease value set has been extended and new signs and symptoms were added. New organisms enriched the organism value set. Other new categories have been added to the public health value set, such as transmission modes, substances, specimens and procedures. We identified problem areas, such as (a) some classification errors; (b) inconsistent granularity of conditions; (c) seemingly nonsensical criteria and medical trivialities; (d) possible logical errors; and (e) seemingly factual errors that might be phrasing errors. We think our hypothesis regarding room for possible improvements is valid: there are some open issues, and a further improved legal text might lead to more precise epidemiologic data collection. It has to be noted that formal representation for automatic classification of cases was out of scope; such a task would require other formalisms, e.g. those used by rule-based decision support systems.

  4. Improved formalism for precision Higgs coupling fits

    NASA Astrophysics Data System (ADS)

    Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon; Karl, Robert; List, Jenny; Ogawa, Tomohisa; Peskin, Michael E.; Tian, Junping

    2018-03-01

    Future e+e- colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e+e- data, based on the effective field theory description of corrections to the Standard Model. We apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e+e- colliders.

  5. Semantically-Rigorous Systems Engineering Modeling Using Sysml and OWL

    NASA Technical Reports Server (NTRS)

    Jenkins, J. Steven; Rouquette, Nicolas F.

    2012-01-01

    The Systems Modeling Language (SysML) has found wide acceptance as a standard graphical notation for the domain of systems engineering. SysML subsets and extends the Unified Modeling Language (UML) to define conventions for expressing structural, behavioral, and analytical elements, and relationships among them. SysML-enabled modeling tools are available from multiple providers, and have been used for diverse projects in military aerospace, scientific exploration, and civil engineering. The Web Ontology Language (OWL) has found wide acceptance as a standard notation for knowledge representation. OWL-enabled modeling tools are available from multiple providers, as well as auxiliary assets such as reasoners and application programming interface libraries, etc. OWL has been applied to diverse projects in a wide array of fields. While the emphasis in SysML is on notation, SysML inherits (from UML) a semantic foundation that provides for limited reasoning and analysis. UML's partial formalization (FUML), however, does not cover the full semantics of SysML, which is a substantial impediment to developing high confidence in the soundness of any conclusions drawn therefrom. OWL, by contrast, was developed from the beginning on formal logical principles, and consequently provides strong support for verification of consistency and satisfiability, extraction of entailments, conjunctive query answering, etc. This emphasis on formal logic is counterbalanced by the absence of any graphical notation conventions in the OWL standards. Consequently, OWL has had only limited adoption in systems engineering. The complementary strengths and weaknesses of SysML and OWL motivate an interest in combining them in such a way that we can benefit from the attractive graphical notation of SysML and the formal reasoning of OWL. This paper describes an approach to achieving that combination.

  6. KneeTex: an ontology-driven system for information extraction from MRI reports.

    PubMed

    Spasić, Irena; Zhao, Bo; Jones, Christopher B; Button, Kate

    2015-01-01

    In the realm of knee pathology, magnetic resonance imaging (MRI) has the advantage of visualising all structures within the knee joint, which makes it a valuable tool for increasing diagnostic accuracy and planning surgical treatments. Therefore, clinical narratives found in MRI reports convey valuable diagnostic information. A range of studies have proven the feasibility of natural language processing for information extraction from clinical narratives. However, no study focused specifically on MRI reports in relation to knee pathology, possibly due to the complexity of knee anatomy and a wide range of conditions that may be associated with different anatomical entities. In this paper we describe KneeTex, an information extraction system that operates in this domain. As an ontology-driven information extraction system, KneeTex makes active use of an ontology to strongly guide and constrain text analysis. We used automatic term recognition to facilitate the development of a domain-specific ontology with sufficient detail and coverage for text mining applications. In combination with the ontology, high regularity of the sublanguage used in knee MRI reports allowed us to model its processing by a set of sophisticated lexico-semantic rules with minimal syntactic analysis. The main processing steps involve named entity recognition combined with coordination, enumeration, ambiguity and co-reference resolution, followed by text segmentation. Ontology-based semantic typing is then used to drive the template filling process. We adopted an existing ontology, TRAK (Taxonomy for RehAbilitation of Knee conditions), for use within KneeTex. The original TRAK ontology expanded from 1,292 concepts, 1,720 synonyms and 518 relationship instances to 1,621 concepts, 2,550 synonyms and 560 relationship instances. This provided KneeTex with a very fine-grained lexico-semantic knowledge base, which is highly attuned to the given sublanguage. Information extraction results were evaluated on a test set of 100 MRI reports. A gold standard consisted of 1,259 filled template records with the following slots: finding, finding qualifier, negation, certainty, anatomy and anatomy qualifier. KneeTex extracted information with precision of 98.00 %, recall of 97.63 % and F-measure of 97.81 %, the values of which are in line with human-like performance. KneeTex is an open-source, stand-alone application for information extraction from narrative reports that describe an MRI scan of the knee. Given an MRI report as input, the system outputs the corresponding clinical findings in the form of JavaScript Object Notation objects. The extracted information is mapped onto TRAK, an ontology that formally models knowledge relevant for the rehabilitation of knee conditions. As a result, formally structured and coded information allows for complex searches to be conducted efficiently over the original MRI reports, thereby effectively supporting epidemiologic studies of knee conditions.
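
    A filled template of the kind described above, serialized to JSON, might look like the following; the slot values are invented for illustration and the exact field spellings in KneeTex's output may differ:

```python
import json

# Hypothetical record following the slot structure listed in the abstract.
record = {
    "finding": "tear",
    "finding_qualifier": "complex",
    "negation": False,
    "certainty": "definite",
    "anatomy": "medial meniscus",
    "anatomy_qualifier": "posterior horn",
}
print(json.dumps(record, indent=2))
```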

  7. Natural language processing and visualization in the molecular imaging domain.

    PubMed

    Tulipano, P Karina; Tao, Ying; Millar, William S; Zanzonico, Pat; Kolbert, Katherine; Xu, Hua; Yu, Hong; Chen, Lifeng; Lussier, Yves A; Friedman, Carol

    2007-06-01

    Molecular imaging is at the crossroads of genomic sciences and medical imaging. Information within the molecular imaging literature could be used to link to genomic and imaging information resources and to organize and index images in a way that is potentially useful to researchers. A number of natural language processing (NLP) systems are available to automatically extract information from genomic literature. One existing NLP system, known as BioMedLEE, automatically extracts biological information consisting of biomolecular substances and phenotypic data. This paper focuses on the adaptation, evaluation, and application of BioMedLEE to the molecular imaging domain. In order to adapt BioMedLEE for this domain, we extend an existing molecular imaging terminology and incorporate it into BioMedLEE. BioMedLEE's performance is assessed with a formal evaluation study. The system's performance, measured as recall and precision, is 0.74 (95% CI: [.70-.76]) and 0.70 (95% CI [.63-.76]), respectively. We adapt a JAVA viewer known as PGviewer for the simultaneous visualization of images with NLP extracted information.

  8. The methodology of semantic analysis for extracting physical effects

    NASA Astrophysics Data System (ADS)

    Fomenkova, M. A.; Kamaev, V. A.; Korobkin, D. M.; Fomenkov, S. A.

    2017-01-01

    The paper presents a new methodology of semantic analysis for extracting physical effects. This methodology is based on the Tuzov ontology, which formally describes the Russian language. Semantic patterns are described for extracting structural physical information in the form of physical effects. A new text analysis algorithm is also described.

  9. Improved formalism for precision Higgs coupling fits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon

    Future e+e– colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e+e– data, based on the effective field theory description of corrections to the Standard Model. Lastly, we apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e+e– colliders.

  10. Improved formalism for precision Higgs coupling fits

    DOE PAGES

    Barklow, Tim; Fujii, Keisuke; Jung, Sunghoon; ...

    2018-03-20

    Future e+e– colliders give the promise of model-independent determinations of the couplings of the Higgs boson. In this paper, we present an improved formalism for extracting Higgs boson couplings from e+e– data, based on the effective field theory description of corrections to the Standard Model. Lastly, we apply this formalism to give projections of Higgs coupling accuracies for stages of the International Linear Collider and for other proposed e+e– colliders.

  11. Computing generalized Langevin equations and generalized Fokker-Planck equations.

    PubMed

    Darve, Eric; Solomon, Jose; Kia, Amirali

    2009-07-07

    The Mori-Zwanzig formalism is an effective tool to derive differential equations describing the evolution of a small number of resolved variables. In this paper we present its application to the derivation of generalized Langevin equations and generalized non-Markovian Fokker-Planck equations. We show how long time scales, rates, and metastable basins can be extracted from these equations. Numerical algorithms are proposed to discretize these equations. An important aspect is the numerical solution of the orthogonal dynamics equation, which is a partial differential equation in a high-dimensional space. We propose efficient numerical methods to solve this orthogonal dynamics equation. In addition, we present a projection formalism of the Mori-Zwanzig type that is applicable to discrete maps. Numerical applications are presented from the field of Hamiltonian systems.
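
    For orientation, the generalized Langevin equation produced by a Mori-Zwanzig projection has the generic form below (schematic notation, not copied from the paper), where K is the memory kernel and F^{\perp} is the fluctuating force evolved by the orthogonal dynamics mentioned in the abstract:

```latex
\[
  \frac{\mathrm{d}A(t)}{\mathrm{d}t}
  = \Omega\,A(t) \;-\; \int_{0}^{t} K(t-s)\,A(s)\,\mathrm{d}s \;+\; F^{\perp}(t)
\]
```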

  12. Planning Non-Formal Education Curricula: The Case of Israel.

    ERIC Educational Resources Information Center

    Keller, Diana; Dror, Ilana

    This paper compares the formal and non-formal education systems currently operating in Israel, describing the special features of curriculum planning in non-formal education. The central argument is that the non-formal education system fulfills functions that constitute a critique of the formal education system. The non-formal system offers the…

  13. [Implementation of ontology-based clinical decision support system for management of interactions between antihypertensive drugs and diet].

    PubMed

    Park, Jeong Eun; Kim, Hwa Sun; Chang, Min Jung; Hong, Hae Sook

    2014-06-01

    The influence of dietary composition on blood pressure is an important subject in healthcare. Interactions between antihypertensive drugs and diet (IBADD) are the most important factor in the management of hypertension. It is therefore essential to support healthcare providers' decision-making role in active and continuous interaction control in hypertension management. The aim of this study was to implement an ontology-based clinical decision support system (CDSS) for IBADD management (IBADDM). We considered the concepts of antihypertensive drugs and foods, and focused on the interchangeability between the database and the CDSS when providing tailored information. An ontology-based CDSS for IBADDM was implemented in eight phases: (1) determining the domain and scope of the ontology, (2) reviewing existing ontologies, (3) extracting and defining the concepts, (4) assigning relationships between concepts, (5) creating a conceptual map with CmapTools, (6) selecting an upper ontology, (7) formally representing the ontology with Protégé (ver. 4.3), and (8) implementing an ontology-based CDSS as a JAVA prototype application. We extracted 5,926 concepts and 15 properties, and formally represented them using Protégé. An ontology-based CDSS for IBADDM was implemented, and the evaluation score was 4.60 out of 5. We endeavored to map the functions of a CDSS and implement an ontology-based CDSS for IBADDM.

  14. Matching biomedical ontologies based on formal concept analysis.

    PubMed

    Zhao, Mengyi; Zhang, Songmao; Li, Weizhuo; Chen, Guowei

    2018-03-19

    The goal of ontology matching is to identify correspondences between entities from different yet overlapping ontologies so as to facilitate semantic integration, reuse and interoperability. As a well-developed mathematical model for analyzing individuals and structuring concepts, Formal Concept Analysis (FCA) has been applied to ontology matching (OM) tasks since the beginning of OM research, although the ontological knowledge exploited in FCA-based methods has been limited. This motivates the study in this paper, i.e., to empower FCA with as much ontological knowledge as possible for identifying mappings across ontologies. We propose a method based on Formal Concept Analysis to identify and validate mappings across ontologies, including one-to-one mappings, complex mappings and correspondences between object properties. Our method, called FCA-Map, incrementally generates a total of five types of formal contexts and extracts mappings from the lattices derived. First, the token-based formal context describes how class names, labels and synonyms share lexical tokens, leading to lexical mappings (anchors) across ontologies. Second, the relation-based formal context describes how classes are in taxonomic, partonomic and disjoint relationships with the anchors, leading to positive and negative structural evidence for validating the lexical matching. Third, the positive relation-based context can be used to discover structural mappings. Afterwards, the property-based formal context describes how object properties are used in axioms to connect anchor classes across ontologies, leading to property mappings. Last, the restriction-based formal context describes co-occurrence of classes across ontologies in anonymous ancestors of anchors, from which extended structural mappings and complex mappings can be identified. Evaluation on the Anatomy, the Large Biomedical Ontologies, and the Disease and Phenotype track of the 2016 Ontology Alignment Evaluation Initiative campaign demonstrates the effectiveness of FCA-Map and its competitiveness with the top-ranked systems. FCA-Map can achieve a better balance between precision and recall for large-scale domain ontologies through constructing multiple FCA structures, whereas it performs unsatisfactorily for smaller ontologies with fewer lexical and semantic expressions. Compared with other FCA-based OM systems, the study in this paper is more comprehensive as an attempt to push the envelope of the Formal Concept Analysis formalism in ontology matching tasks. Five types of formal contexts are constructed incrementally, and their derived concept lattices are used to cluster the commonalities among classes at the lexical and structural levels, respectively. Experiments on large, real-world domain ontologies show promising results and reveal the power of FCA.
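
    The first of the five contexts, the token-based formal context, can be sketched as follows; the class labels and the anchor criterion (sharing at least one token) are simplified stand-ins for what FCA-Map does with names, labels and synonyms:

```python
import re

def token_context(classes):
    """Objects are (ontology, label) pairs, attributes are lexical tokens,
    and the incidence relation records which label contains which token."""
    return {(onto, label): set(re.findall(r"[a-z]+", label.lower()))
            for onto, label in classes}

ctx = token_context([("MA", "cardiac valve"),
                     ("NCI", "heart valve"),
                     ("NCI", "cardiac muscle tissue")])

# Cross-ontology class pairs sharing at least one token become candidate lexical anchors.
anchors = [(a, b) for a in ctx for b in ctx
           if a[0] == "MA" and b[0] == "NCI" and ctx[a] & ctx[b]]
print(anchors)
```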

  15. LexValueSets: An Approach for Context-Driven Value Sets Extraction

    PubMed Central

    Pathak, Jyotishman; Jiang, Guoqian; Dwarkanath, Sridhar O.; Buntrock, James D.; Chute, Christopher G.

    2008-01-01

    The ability to model, share and re-use value sets across multiple medical information systems is an important requirement. However, generating value sets semi-automatically from a terminology service is still an unresolved issue, in part due to the lack of linkage to clinical context patterns that provide the constraints in defining a concept domain and invocation of value sets extraction. Towards this goal, we develop and evaluate an approach for context-driven automatic value sets extraction based on a formal terminology model. The crux of the technique is to identify and define the context patterns from various domains of discourse and leverage them for value set extraction using two complementary ideas based on (i) local terms provided by the Subject Matter Experts (extensional) and (ii) semantic definition of the concepts in coding schemes (intensional). A prototype was implemented based on SNOMED CT rendered in the LexGrid terminology model and a preliminary evaluation is presented. PMID:18998955

  16. Observer properties for understanding dynamical displays: Capacities, limitations, and defaults

    NASA Technical Reports Server (NTRS)

    Proffitt, Dennis R.; Kaiser, Mary K.

    1991-01-01

    People's ability to extract relevant information while viewing ongoing events is discussed in terms of human capabilities, limitations, and defaults. A taxonomy of event complexity is developed which predicts which dynamical events people can and cannot construe. This taxonomy is related to the distinction drawn in classical mechanics between particle and extended-body motions. People's commonsense understandings of simple mechanical systems are impacted little by formal training, but rather reflect heuristic simplifications that focus on a single dimension of perceived dynamical relevance.

  17. System Architecture for Temporal Information Extraction, Representation and Reasoning in Clinical Narrative Reports

    PubMed Central

    Zhou, Li; Friedman, Carol; Parsons, Simon; Hripcsak, George

    2005-01-01

    Exploring temporal information in narrative Electronic Medical Records (EMRs) is essential and challenging. We propose an architecture for an integrated approach to process temporal information in clinical narrative reports. The goal is to initiate and build a foundation that supports applications which assist healthcare practice and research by including the ability to determine the time of clinical events (e.g., past vs. present). Key components include: (1) a temporal constraint structure for temporal expressions and the development of an associated tagger; (2) a Natural Language Processing (NLP) system for encoding and extracting medical events and associating them with formalized temporal data; (3) a post-processor, with a knowledge-based subsystem to help discover implicit information, that resolves temporal expressions and deals with issues such as granularity and vagueness; and (4) a reasoning mechanism which models clinical reports as Simple Temporal Problems (STPs). PMID:16779164

  18. Non-Formal Education in Ethiopia: The Modern Sector. Program of Studies in Non-Formal Education. Discussion Papers. No. 6.

    ERIC Educational Resources Information Center

    Niehoff, Richard O.; Wilder, Bernard

    Nonformal education programs operating in the modern sector in Ethiopia are described in a perspective relevant to the Ethiopian context. The modern sector is defined as those activities concerned with the manufacture of goods, extraction of raw materials, the processing of raw materials, the provision of services, and the creation and maintenance…

  19. Applications of artificial intelligence to digital photogrammetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kretsch, J.L.

    1988-01-01

    The aim of this research was to explore the application of expert systems to digital photogrammetry, specifically to photogrammetric triangulation, feature extraction, and photogrammetric problem solving. In 1987, prototype expert systems were developed for performing system startup, interior orientation, and relative orientation in the mensuration stage. The system explored means of performing diagnostics during the process. In the area of feature extraction, the relationship of metric uncertainty to symbolic uncertainty was the topic of research. Error propagation through the Dempster-Shafer formalism for representing evidence was performed in order to find the variance in the calculated belief values due to errors in measurements made, together with the initial evidence needed to begin labeling observed image features with features in an object model. In photogrammetric problem solving, an expert system is under continuous development which seeks to solve photogrammetric problems using mathematical reasoning. The key to the approach used is the representation of knowledge directly in the form of equations, rather than in the form of if-then rules. Then each variable in the equations is treated as a goal to be solved.

  20. Resonances from lattice QCD

    DOE PAGES

    Briceno, Raul A.

    2018-03-26

    The spectrum of hadrons is mainly composed of short-lived states (resonances) that decay into two or more hadrons. These resonances play an important role in a variety of phenomenologically significant processes. In this talk, I give an overview of the present status of a rigorous program for studying resonances and their properties using lattice QCD. I explain the formalism needed for extracting resonant amplitudes from the finite-volume spectra. From these one can extract the masses and widths of resonances. I present some recent examples that illustrate the power of these ideas. I then explain a similar formalism that allows for the determination of resonant electroweak amplitudes from finite-volume matrix elements. I use the recent calculation of the πγ* → ππ amplitude as an example illustrating the power of this formalism. From such amplitudes one can determine transition form factors of resonances. I close by reviewing ongoing efforts to generalize these ideas to increasingly complex reactions, and I then give an outlook for the field.

  1. Resonances from lattice QCD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briceno, Raul A.

    The spectrum of hadrons is mainly composed of short-lived states (resonances) that decay into two or more hadrons. These resonances play an important role in a variety of phenomenologically significant processes. In this talk, I give an overview of the present status of a rigorous program for studying resonances and their properties using lattice QCD. I explain the formalism needed for extracting resonant amplitudes from the finite-volume spectra. From these one can extract the masses and widths of resonances. I present some recent examples that illustrate the power of these ideas. I then explain a similar formalism that allows for the determination of resonant electroweak amplitudes from finite-volume matrix elements. I use the recent calculation of the πγ* → ππ amplitude as an example illustrating the power of this formalism. From such amplitudes one can determine transition form factors of resonances. I close by reviewing ongoing efforts to generalize these ideas to increasingly complex reactions, and I then give an outlook for the field.

  2. Modeling Cyber Conflicts Using an Extended Petri Net Formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zakrzewska, Anita N; Ferragut, Erik M

    2011-01-01

    When threatened by automated attacks, critical systems that require human-controlled responses have difficulty making optimal responses and adapting protections in real-time and may therefore be overwhelmed. Consequently, experts have called for the development of automatic real-time reaction capabilities. However, a technical gap exists in the modeling and analysis of cyber conflicts to automatically understand the repercussions of responses. There is a need for modeling cyber assets that accounts for concurrent behavior, incomplete information, and payoff functions. We address this need by extending the Petri net formalism to allow real-time cyber conflicts to be modeled in a way that is expressive and concise. This formalism includes transitions controlled by players as well as firing rates attached to transitions. This allows us to model both player actions and factors that are beyond the control of players in real-time. We show that our formalism is able to represent situational awareness, concurrent actions, incomplete information and objective functions. These factors make it well-suited to modeling cyber conflicts in a way that allows for useful analysis. MITRE has compiled the Common Attack Pattern Enumeration and Classification (CAPEC), an extensive list of cyber attacks at various levels of abstraction. CAPEC includes factors such as attack prerequisites, possible countermeasures, and attack goals. These elements are vital to understanding cyber attacks and to generating the corresponding real-time responses. We demonstrate that the formalism can be used to extract precise models of cyber attacks from CAPEC. Several case studies show that our Petri net formalism is more expressive than other models, such as attack graphs, for modeling cyber conflicts and that it is amenable to exploring cyber strategies.
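
    A minimal sketch of the formalism's two ingredients, using hypothetical place and transition names: a transition is either assigned to a player (and fired by that player's decision) or carries a firing rate for events outside the players' control:

```python
class Transition:
    def __init__(self, name, pre, post, player=None, rate=None):
        self.name, self.pre, self.post = name, pre, post
        self.player, self.rate = player, rate   # rate is used only when player is None

    def enabled(self, marking):
        return all(marking.get(p, 0) >= n for p, n in self.pre.items())

    def fire(self, marking):
        for p, n in self.pre.items():
            marking[p] -= n
        for p, n in self.post.items():
            marking[p] = marking.get(p, 0) + n

marking = {"host_reachable": 1}
scan = Transition("scan", {"host_reachable": 1},
                  {"host_reachable": 1, "recon_done": 1}, player="attacker")
detect = Transition("detect", {"recon_done": 1}, {}, rate=0.2)  # timed, not player-controlled
if scan.enabled(marking):
    scan.fire(marking)
print(marking)  # {'host_reachable': 1, 'recon_done': 1}
```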

  3. Dynamic Gate Product and Artifact Generation from System Models

    NASA Technical Reports Server (NTRS)

    Jackson, Maddalena; Delp, Christopher; Bindschadler, Duane; Sarrel, Marc; Wollaeger, Ryan; Lam, Doris

    2011-01-01

    Model Based Systems Engineering (MBSE) is gaining acceptance as a way to formalize systems engineering practice through the use of models. The traditional method of producing and managing a plethora of disjointed documents and presentations ("PowerPoint engineering") has proven both costly and limiting as a means to manage the complex and sophisticated specifications of modern space systems. We have developed a tool and method to produce sophisticated artifacts as views and by-products of integrated models, allowing us to minimize the practice of "PowerPoint engineering" on model-based projects and demonstrate the ability of MBSE to work within and supersede traditional engineering practices. This paper describes how we have created and successfully used model-based document generation techniques to extract paper artifacts from complex SysML and UML models in support of successful project reviews. Use of formal SysML and UML models for architecture and system design enables production of review documents, textual artifacts, and analyses that are consistent with one another and require virtually no labor-intensive maintenance across small-scale design changes and multiple authors. This effort thus enables approaches that focus more on rigorous engineering work and less on "PowerPoint engineering" and the production of paper-based documents or their "office-productivity" file equivalents.

  4. Model-Based Testability Assessment and Directed Troubleshooting of Shuttle Wiring Systems

    NASA Technical Reports Server (NTRS)

    Deb, Somnath; Domagala, Chuck; Shrestha, Roshan; Malepati, Venkatesh; Cavanaugh, Kevin; Patterson-Hine, Ann; Sanderfer, Dwight; Cockrell, Jim; Norvig, Peter (Technical Monitor)

    2000-01-01

    We have recently completed a pilot study on the Space Shuttle wiring system commissioned by the Wiring Integrity Research (WIRe) team at NASA Ames Research Center. As the space shuttle ages, it is experiencing wiring degradation problems including arcing, chafing, insulation breakdown and broken conductors. A systematic and comprehensive test process is required to thoroughly test and quality assure (QA) the wiring systems. The NASA WIRe team recognized the value of a formal model-based analysis for risk assessment and fault coverage analysis. However, wiring systems are complex and involve over 50,000 wire segments. Therefore, NASA commissioned this pilot study with Qualtech Systems, Inc. (QSI) to explore means of automatically extracting high-fidelity multi-signal models from the wiring information database for use with QSI's Testability Engineering and Maintenance System (TEAMS) tool.

  5. TEMPTING system: a hybrid method of rule and machine learning for temporal relation extraction in patient discharge summaries.

    PubMed

    Chang, Yung-Chun; Dai, Hong-Jie; Wu, Johnny Chi-Yang; Chen, Jian-Ming; Tsai, Richard Tzong-Han; Hsu, Wen-Lian

    2013-12-01

    Patient discharge summaries provide detailed medical information about individuals who have been hospitalized. To make a precise and legitimate assessment of the abundant data, a proper time layout of the sequence of relevant events should be compiled and used to drive a patient-specific timeline, which could further assist medical personnel in making clinical decisions. The process of identifying the chronological order of entities is called temporal relation extraction. In this paper, we propose a hybrid method to identify appropriate temporal links between a pair of entities. The method combines two approaches: one is rule-based and the other is based on the maximum entropy model. We develop an integration algorithm to fuse the results of the two approaches. All rules and the integration algorithm are formally stated so that one can easily reproduce the system and results. To optimize the system's configuration, we used the 2012 i2b2 challenge TLINK track dataset and applied threefold cross validation to the training set. Then, we evaluated its performance on the training and test datasets. The experiment results show that the proposed TEMPTING (TEMPoral relaTion extractING) system (ranked seventh) achieved an F-score of 0.563, which was at least 30% better than that of the baseline system, which randomly selects TLINK candidates from all pairs and assigns the TLINK types. The TEMPTING system using the hybrid method also outperformed the stage-based TEMPTING system. Its F-scores were 3.51% and 0.97% better than those of the stage-based system on the training set and test set, respectively. Copyright © 2013 Elsevier Inc. All rights reserved.
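
    As an illustration of the hybrid rule-plus-statistical idea (not the authors' implementation), the following Python sketch applies hand-written rules first and falls back to a maximum-entropy classifier (logistic regression) when no rule fires; the feature names, rules and training pairs are invented, and scikit-learn is assumed to be available.

        # Hybrid TLINK-style classifier: rules take precedence, a maximum-entropy
        # model handles the remaining pairs. All features and labels are made up.
        from sklearn.feature_extraction import DictVectorizer
        from sklearn.linear_model import LogisticRegression

        def rule_based(pair):
            if pair.get("cue") == "after":
                return "AFTER"
            if pair.get("cue") == "during":
                return "OVERLAP"
            return None   # no rule applies

        train_pairs = [
            {"cue": "none", "tense_shift": "past->past"},
            {"cue": "none", "tense_shift": "past->present"},
        ]
        train_labels = ["BEFORE", "OVERLAP"]

        vec = DictVectorizer()
        maxent = LogisticRegression(max_iter=1000).fit(vec.fit_transform(train_pairs), train_labels)

        def classify(pair):
            label = rule_based(pair)          # integration step: rule output wins if present
            if label is None:
                label = maxent.predict(vec.transform([pair]))[0]
            return label

        print(classify({"cue": "after"}))                              # rule fires
        print(classify({"cue": "none", "tense_shift": "past->past"}))  # model fallback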

  6. Knowledge Acquisition of Generic Queries for Information Retrieval

    PubMed Central

    Seol, Yoon-Ho; Johnson, Stephen B.; Cimino, James J.

    2002-01-01

    Several studies have identified clinical questions posed by health care professionals to understand the nature of information needs during clinical practice. To support access to digital information sources, it is necessary to integrate the information needs with a computer system. We have developed a conceptual guidance approach in information retrieval, based on a knowledge base that contains the patterns of information needs. The knowledge base uses a formal representation of clinical questions based on the UMLS knowledge sources, called the Generic Query model. To improve the coverage of the knowledge base, we investigated a method for extracting plausible clinical questions from the medical literature. This poster presents the Generic Query model, shows how it is used to represent the patterns of clinical questions, and describes the framework used to extract knowledge from the medical literature.

  7. Studies on behaviour of information to extract the meaning behind the behaviour

    NASA Astrophysics Data System (ADS)

    Nasution, M. K. M.; Syah, R.; Elveny, M.

    2017-01-01

    The Web, as social media, can be used as a reference for determining social behaviour. However, information extraction that involves a search engine does not easily give that picture. Several properties of the search engine must be formally disclosed to provide assurance that the information is feasible. Although a good deal of research has revealed the interest of the Web as social media, only a few studies have revealed the behaviour of information related to social behaviour. Formal steps are therefore needed to present the possible related properties. We identify 12 interconnected properties that constitute the behaviour of information, and from them we derive several meanings based on the simulation results of a search engine.

  8. Semantic and syntactic interoperability in online processing of big Earth observation data.

    PubMed

    Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea

    2018-01-01

    The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements to efficiently extract valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler for (1) technical interoperability using a standardised interface to be used by all types of clients and (2) allowing experts from different domains to develop complex analyses together as a collaborative effort. Users connect the world models online to the data, which are maintained in centralised storage as 3D spatio-temporal data cubes. The system also allows non-experts to extract valuable information from EO data because data management, low-level interactions or specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example and describe three use cases for extracting changes from EO images, and we demonstrate the usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover).
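
    The data-cube idea can be illustrated with a toy Python sketch (not the authors' system): a 3-D spatio-temporal cube of land-cover codes is queried with a simple semantic rule ("was water at the first time step, is no longer water at the last"); the codes and grid size are invented, and numpy is assumed.

        # Toy semantic change query over a (time, y, x) data cube.
        import numpy as np

        WATER = 1
        cube = np.random.randint(0, 3, size=(4, 100, 100))   # 4 time steps, 100x100 grid

        was_water = cube[0] == WATER
        now_not_water = cube[-1] != WATER
        change_mask = was_water & now_not_water               # pixels that lost water

        print("changed pixels:", int(change_mask.sum()))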

  9. Semantic and syntactic interoperability in online processing of big Earth observation data

    PubMed Central

    Sudmanns, Martin; Tiede, Dirk; Lang, Stefan; Baraldi, Andrea

    2018-01-01

    The challenge of enabling syntactic and semantic interoperability for comprehensive and reproducible online processing of big Earth observation (EO) data is still unsolved. Supporting both types of interoperability is one of the requirements to efficiently extract valuable information from the large amount of available multi-temporal gridded data sets. The proposed system wraps world models (semantic interoperability) into OGC Web Processing Services (syntactic interoperability) for semantic online analyses. World models describe spatio-temporal entities and their relationships in a formal way. The proposed system serves as an enabler for (1) technical interoperability using a standardised interface to be used by all types of clients and (2) allowing experts from different domains to develop complex analyses together as a collaborative effort. Users connect the world models online to the data, which are maintained in centralised storage as 3D spatio-temporal data cubes. The system also allows non-experts to extract valuable information from EO data because data management, low-level interactions or specific software issues can be ignored. We discuss the concept of the proposed system, provide a technical implementation example and describe three use cases for extracting changes from EO images, and we demonstrate the usability also for non-EO, gridded, multi-temporal data sets (CORINE land cover). PMID:29387171

  10. Bias effects on the electronic spectrum of a molecular bridge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, Heidi; Prociuk, Alexander; Dunietz, Barry D

    2011-01-01

    In this paper the effect of bias and geometric symmetry breaking on the electronic spectrum of a model molecular system is studied. Geometric symmetry breaking can either enhance the dissipative effect of the bias, where spectral peaks are disabled, or enable new excitations that are absent under zero bias conditions. The spectral analysis is performed on a simple model system by solving for the electronic response to an instantaneously impulsive perturbation in the dipole approximation. The dynamical response is extracted from the electronic equations of motion as expressed by the Keldysh formalism. This expression provides for the accurate treatment of the electronic structure of a bulk-coupled system at the chosen model Hamiltonian electronic structure level.

  11. Computing diffusivities from particle models out of equilibrium

    NASA Astrophysics Data System (ADS)

    Embacher, Peter; Dirr, Nicolas; Zimmer, Johannes; Reina, Celia

    2018-04-01

    A new method is proposed to numerically extract the diffusivity of a (typically nonlinear) diffusion equation from underlying stochastic particle systems. The proposed strategy requires the system to be in local equilibrium and have Gaussian fluctuations but it is otherwise allowed to undergo arbitrary out-of-equilibrium evolutions. This could be potentially relevant for particle data obtained from experimental applications. The key idea underlying the method is that finite, yet large, particle systems formally obey stochastic partial differential equations of gradient flow type satisfying a fluctuation-dissipation relation. The strategy is here applied to three classic particle models, namely independent random walkers, a zero-range process and a symmetric simple exclusion process in one space dimension, to allow the comparison with analytic solutions.
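
    For orientation only, the sketch below estimates the diffusivity of independent one-dimensional random walkers (one of the three test models) from their mean-squared displacement, MSD(t) ≈ 2Dt; this is a textbook baseline, not the fluctuation-based method proposed in the paper, and numpy is assumed.

        # Baseline diffusivity estimate from independent random walkers via MSD.
        import numpy as np

        rng = np.random.default_rng(0)
        n_walkers, n_steps, dt, step = 10_000, 1_000, 1.0, 1.0

        displacements = rng.choice([-step, step], size=(n_walkers, n_steps))
        positions = displacements.cumsum(axis=1)

        t = dt * np.arange(1, n_steps + 1)
        msd = (positions ** 2).mean(axis=0)
        D_est = np.polyfit(t, msd, 1)[0] / 2.0   # MSD slope / 2 gives D

        print(f"estimated D = {D_est:.3f} (expected {step**2 / (2 * dt):.3f})")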

  12. Two-Nucleon Systems in a Finite Volume

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briceno, Raul

    2014-11-01

    I present the formalism and methodology for determining the nucleon-nucleon scattering parameters from the finite volume spectra obtained from lattice quantum chromodynamics calculations. Using the recently derived energy quantization conditions and the experimentally determined scattering parameters, the bound state spectra for finite volume systems with overlap with the 3S1-3D3 channel are predicted for a range of volumes. It is shown that the extractions of the infinite-volume deuteron binding energy and the low-energy scattering parameters, including the S-D mixing angle, are possible from Lattice QCD calculations of two-nucleon systems with boosts of |P| <= 2pi sqrt{3}/L in volumes with spatial extents L satisfying fm <~ L <~ 14 fm.

  13. Spin coefficients and gauge fixing in the Newman-Penrose formalism

    NASA Astrophysics Data System (ADS)

    Nerozzi, Andrea

    2017-03-01

    Since its introduction in 1962, the Newman-Penrose formalism has been widely used in analytical and numerical studies of Einstein's equations, like for example for the Teukolsky master equation, or as a powerful wave extraction tool in numerical relativity. Despite the many applications, Einstein's equations in the Newman-Penrose formalism appear complicated and not easily applicable to general studies of spacetimes, mainly because physical and gauge degrees of freedom are mixed in a nontrivial way. In this paper we approach the whole formalism with the goal of expressing the spin coefficients as functions of tetrad invariants once a particular tetrad is chosen. We show that it is possible to do so, and give for the first time a general recipe for the task, as well as an indication of the quantities and identities that are required.

  14. Formal verification of medical monitoring software using Z language: a representative sample.

    PubMed

    Babamir, Seyed Morteza; Borhani, Mehdi

    2012-08-01

    Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, whether the systems take sound decisions is a concern for physicians. As a result, verification of the systems' behavior in monitoring patients is a matter of significance. In modern medical systems patient monitoring is undertaken by software, so software verification of such systems has received attention. Such verification can be achieved with formal languages, which have mathematical foundations. Among others, the Z language is a suitable formal language that has been used for the formal verification of systems. This study aims to present a constructive method to verify a representative sample of a medical system in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we take the CIIP system as a representative sample of medical systems in the present study. The system is responsible for monitoring a diabetic's blood sugar.

  15. "I Treat Him as a Normal Patient": Unveiling the Normalization Coping Strategy Among Formal Caregivers of Persons With Dementia and Its Implications for Person-Centered Care.

    PubMed

    Bentwich, Miriam Ethel; Dickman, Nomy; Oberman, Amitai; Bokek-Cohen, Ya'arit

    2017-11-01

    Currently, 47 million people worldwide have dementia, often requiring paid care by formal caregivers. Research regarding family caregivers suggests normalization as a model for coping with the negative emotional outcomes of caring for a person with dementia (PWD). The study aims to explore whether the normalization coping mechanism exists among formal caregivers, to reveal differences in its application among caregivers from different cultures, and to examine how this coping mechanism may be related to implementing person-centered care for PWDs. We performed content analysis of interviews with 20 formal caregivers from three cultural groups (Jews born in Israel [JI], Arabs born in Israel [AI], Russian immigrants [RI]) attending to PWDs. We extracted five normalization modes, revealing that AI caregivers had substantially more utterances of normalization expressions than their colleagues. The normalization modes most commonly expressed by AI caregivers relate to the personhood of PWDs. These normalization modes may enhance formal caregivers' ability to employ person-centered care.

  16. Formal Models of the Network Co-occurrence Underlying Mental Operations.

    PubMed

    Bzdok, Danilo; Varoquaux, Gaël; Grisel, Olivier; Eickenberg, Michael; Poupon, Cyril; Thirion, Bertrand

    2016-06-01

    Systems neuroscience has identified a set of canonical large-scale networks in humans. These have predominantly been characterized by resting-state analyses of the task-unconstrained, mind-wandering brain. Their explicit relationship to defined task performance is largely unknown and remains challenging. The present work contributes a multivariate statistical learning approach that can extract the major brain networks and quantify their configuration during various psychological tasks. The method is validated in two extensive datasets (n = 500 and n = 81) by model-based generation of synthetic activity maps from recombination of shared network topographies. To study a use case, we formally revisited the poorly understood difference between neural activity underlying idling versus goal-directed behavior. We demonstrate that task-specific neural activity patterns can be explained by plausible combinations of resting-state networks. The possibility of decomposing a mental task into the relative contributions of major brain networks, the "network co-occurrence architecture" of a given task, opens an alternative access to the neural substrates of human cognition.
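
    As a hedged stand-in for the kind of decomposition described (the paper's exact estimator is not reproduced here), the Python sketch below factors synthetic activity maps into a small number of spatial components with ICA and reads off per-map network loadings; the data are simulated and scikit-learn is assumed.

        # Decompose synthetic activity maps (maps x voxels) into spatial components.
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        n_maps, n_voxels, n_networks = 200, 500, 5

        networks = rng.normal(size=(n_networks, n_voxels))      # latent "networks"
        weights = rng.normal(size=(n_maps, n_networks))         # per-map loadings
        maps = weights @ networks + 0.1 * rng.normal(size=(n_maps, n_voxels))

        ica = FastICA(n_components=n_networks, random_state=0)
        loadings = ica.fit_transform(maps)     # network contribution per task map
        components = ica.components_           # recovered spatial networks

        print(loadings.shape, components.shape)   # (200, 5) (5, 500)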

  17. Formal Models of the Network Co-occurrence Underlying Mental Operations

    PubMed Central

    Bzdok, Danilo; Varoquaux, Gaël; Grisel, Olivier; Eickenberg, Michael; Poupon, Cyril; Thirion, Bertrand

    2016-01-01

    Systems neuroscience has identified a set of canonical large-scale networks in humans. These have predominantly been characterized by resting-state analyses of the task-unconstrained, mind-wandering brain. Their explicit relationship to defined task performance is largely unknown and remains challenging. The present work contributes a multivariate statistical learning approach that can extract the major brain networks and quantify their configuration during various psychological tasks. The method is validated in two extensive datasets (n = 500 and n = 81) by model-based generation of synthetic activity maps from recombination of shared network topographies. To study a use case, we formally revisited the poorly understood difference between neural activity underlying idling versus goal-directed behavior. We demonstrate that task-specific neural activity patterns can be explained by plausible combinations of resting-state networks. The possibility of decomposing a mental task into the relative contributions of major brain networks, the "network co-occurrence architecture" of a given task, opens an alternative access to the neural substrates of human cognition. PMID:27310288

  18. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    DTIC Science & Technology

    1996-09-01

    Technical report CMU-CS-96-199, "A Quantitative Approach to the Formal Verification of Real-Time Systems," Sergio Vale Aguiar Campos, September 1996. Only OCR fragments of the report's front matter survive in this record; the recoverable elements are a funding disclaimer naming NSF, the Semiconductor Research Corporation, ARPA and the U.S. government, and the keywords: real-time systems, formal verification, symbolic ...

  19. Ontology design patterns to disambiguate relations between genes and gene products in GENIA

    PubMed Central

    2011-01-01

    Motivation: Annotated reference corpora play an important role in biomedical information extraction. A semantic annotation of the natural language texts in these reference corpora using formal ontologies is challenging due to the inherent ambiguity of natural language. The provision of formal definitions and axioms for semantic annotations offers the means for ensuring consistency as well as enables the development of verifiable annotation guidelines. Consistent semantic annotations facilitate the automatic discovery of new information through deductive inferences. Results: We provide a formal characterization of the relations used in the recent GENIA corpus annotations. For this purpose, we both select existing axiom systems based on the desired properties of the relations within the domain and develop new axioms for several relations. To apply this ontology of relations to the semantic annotation of text corpora, we implement two ontology design patterns. In addition, we provide a software application to convert annotated GENIA abstracts into OWL ontologies by combining both the ontology of relations and the design patterns. As a result, the GENIA abstracts become available as OWL ontologies and are amenable to automated verification, deductive inferences and other knowledge-based applications. Availability: Documentation, implementation and examples are available from http://www-tsujii.is.s.u-tokyo.ac.jp/GENIA/. PMID:22166341

  20. Dissipative hydrodynamics for multi-component systems

    NASA Astrophysics Data System (ADS)

    El, Andrej; Bouras, Ioannis; Wesp, Christian; Xu, Zhe; Greiner, Carsten

    2012-11-01

    Second-order dissipative hydrodynamic equations for each component of a multi-component system are derived using the entropy principle. Comparison of the solutions with kinetic transport results demonstrates the validity of the obtained equations. We demonstrate how the shear viscosity of the total system can be calculated in terms of the involved cross-sections and partial densities. The presence of inter-species interactions leads to a characteristic time dependence of the shear viscosity of the mixture, which also means that the shear viscosity of a mixture cannot be calculated using the Green-Kubo formalism in the way it has been done recently. This finding is of interest for understanding the shear viscosity of a quark-gluon plasma extracted from comparisons of hydrodynamic simulations with experimental results from RHIC and LHC.

  1. Enhancing Credibility of Chemical Safety Studies: Emerging Consensus on Key Assessment Criteria

    PubMed Central

    Conrad, James W.; Becker, Richard A.

    2011-01-01

    Objectives: We examined the extent to which consensus exists on the criteria that should be used for assessing the credibility of a scientific work, regardless of its funding source, and explored how these criteria might be implemented. Data sources: Three publications, all presented at a session of the 2009 annual meeting of the Society for Risk Analysis, have proposed a range of criteria for evaluating the credibility of scientific studies. At least two other similar sets of criteria have recently been proposed elsewhere. Data extraction/synthesis: In this article we review these criteria, highlight the commonalities among them, and integrate them into a list of 10 criteria. We also discuss issues inherent in any attempt to implement the criteria systematically. Conclusions: Recommendations by many scientists and policy experts converge on a finite list of criteria for assessing the credibility of a scientific study without regard to funding source. These criteria should be formalized through a consensus process or a governmental initiative that includes discussion and pilot application of a system for reproducibly implementing them. Formal establishment of such a system should enable the debate regarding chemical studies to move beyond funding issues and focus on scientific merit. PMID:21163723

  2. Complexity Reduction in Large Quantum Systems: Fragment Identification and Population Analysis via a Local Optimized Minimal Basis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohr, Stephan; Masella, Michel; Ratcliff, Laura E.

    We present, within Kohn-Sham Density Functional Theory calculations, a quantitative method to identify and assess the partitioning of a large quantum mechanical system into fragments. We then introduce a simple and efficient formalism (which can be written as a generalization of other well-known population analyses) to extract, from first principles, electrostatic multipoles for these fragments. The corresponding fragment multipoles can in this way be seen as reliable (pseudo-) observables. By applying our formalism within the code BigDFT, we show that the usage of a minimal set of in-situ optimized basis functions is of utmost importance for having at the same time a proper fragment definition and an accurate description of the electronic structure. With this approach it becomes possible to simplify the modeling of environmental fragments by a set of multipoles, without notable loss of precision in the description of the active quantum mechanical region. Furthermore, this leads to a considerable reduction of the degrees of freedom by an effective coarse-graining approach, eventually also paving the way towards efficient QM/QM and QM/MM methods coupling together different levels of accuracy.

  3. Complexity Reduction in Large Quantum Systems: Fragment Identification and Population Analysis via a Local Optimized Minimal Basis

    DOE PAGES

    Mohr, Stephan; Masella, Michel; Ratcliff, Laura E.; ...

    2017-07-21

    We present, within Kohn-Sham Density Functional Theory calculations, a quantitative method to identify and assess the partitioning of a large quantum mechanical system into fragments. We then introduce a simple and efficient formalism (which can be written as a generalization of other well-known population analyses) to extract, from first principles, electrostatic multipoles for these fragments. The corresponding fragment multipoles can in this way be seen as reliable (pseudo-) observables. By applying our formalism within the code BigDFT, we show that the usage of a minimal set of in-situ optimized basis functions is of utmost importance for having at the same time a proper fragment definition and an accurate description of the electronic structure. With this approach it becomes possible to simplify the modeling of environmental fragments by a set of multipoles, without notable loss of precision in the description of the active quantum mechanical region. Furthermore, this leads to a considerable reduction of the degrees of freedom by an effective coarse-graining approach, eventually also paving the way towards efficient QM/QM and QM/MM methods coupling together different levels of accuracy.

  4. A Core Plug and Play Architecture for Reusable Flight Software Systems

    NASA Technical Reports Server (NTRS)

    Wilmot, Jonathan

    2006-01-01

    The Flight Software Branch, at Goddard Space Flight Center (GSFC), has been working on a run-time approach to facilitate a formal software reuse process. The reuse process is designed to enable rapid development and integration of high-quality software systems and to more accurately predict development costs and schedule. Previous reuse practices have been somewhat successful when the same teams are moved from project to project. But this typically requires taking the software system in an all-or-nothing approach where useful components cannot be easily extracted from the whole. As a result, the system is less flexible and scalable with limited applicability to new projects. This paper will focus on the rationale behind, and implementation of the run-time executive. This executive is the core for the component-based flight software commonality and reuse process adopted at Goddard.

  5. A Domain-Specific Terminology for Retinopathy of Prematurity and Its Applications in Clinical Settings.

    PubMed

    Zhang, Yinsheng; Zhang, Guoming

    2018-01-01

    A terminology (or coding system) is a formal set of controlled vocabulary in a specific domain. With a well-defined terminology, each concept in the target domain is assigned a unique code, which can be identified and processed across different medical systems in an unambiguous way. Although there are many well-known biomedical terminologies, there is currently no domain-specific terminology for ROP (retinopathy of prematurity). Based on a collection of historical ROP patients' data in an electronic medical record system, we extracted the most frequent terms in the domain and organized them into a hierarchical coding system, the ROP Minimal Standard Terminology, which contains 62 core concepts in 4 categories. This terminology has been successfully used to provide highly structured and semantically rich clinical data in several ROP-related applications.
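
    A hierarchical coding system of this kind can be represented very simply; the Python sketch below shows one possible structure, with invented codes and labels that are not the actual ROP Minimal Standard Terminology content.

        # Hypothetical hierarchical terminology: code -> label and parent code.
        TERMINOLOGY = {
            "ROP-100": {"label": "Diagnosis", "parent": None},
            "ROP-110": {"label": "Stage 1", "parent": "ROP-100"},
            "ROP-111": {"label": "Stage 1, zone II", "parent": "ROP-110"},
            "ROP-200": {"label": "Treatment", "parent": None},
            "ROP-210": {"label": "Laser photocoagulation", "parent": "ROP-200"},
        }

        def ancestors(code):
            """Walk up the hierarchy from a concept code to its root category."""
            chain = []
            while code is not None:
                chain.append(code)
                code = TERMINOLOGY[code]["parent"]
            return chain

        print(ancestors("ROP-111"))   # ['ROP-111', 'ROP-110', 'ROP-100']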

  6. Formal Techniques for Synchronized Fault-Tolerant Systems

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Butler, Ricky W.

    1992-01-01

    We present the formal verification of synchronizing aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization is based on an extended state machine model incorporating snapshots of local processors clocks.

  7. An Ontology-Based Approach to Incorporate User-Generated Geo-Content Into Sdi

    NASA Astrophysics Data System (ADS)

    Deng, D.-P.; Lemmens, R.

    2011-08-01

    The Web is changing the way people share and communicate information because of the emergence of various Web technologies that enable people to contribute information on the Web. User-Generated Geo-Content (UGGC) is a potential resource of geographic information. Because of its different production methods, UGGC often cannot fit into a geographic information model; there is a semantic gap between UGGC and formal geographic information. To integrate UGGC into geographic information, this study conducts an ontology-based process to bridge this semantic gap. The process includes five steps: Collection, Extraction, Formalization, Mapping, and Deployment. In addition, this study applies the process to Twitter messages relevant to the Japan earthquake disaster. Using this process, we extract disaster relief information from Twitter messages and develop a knowledge base for GeoSPARQL queries on disaster relief information.
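
    The five-step idea can be illustrated end to end with a toy Python sketch on invented tweet-like strings; the keyword vocabulary, gazetteer and relation name are hypothetical, and the extractor is deliberately trivial compared with the study's ontology-based treatment.

        # Toy pipeline: collect -> extract -> formalize -> map -> deploy (print triples).
        tweets = [
            "Need water and blankets near Sendai station",
            "Road blocked in Ishinomaki, send help",
        ]

        NEEDS = {"water", "blankets", "help"}                               # hypothetical vocabulary
        GAZETTEER = {"sendai": (38.26, 140.87), "ishinomaki": (38.43, 141.30)}

        def extract(text):
            words = {w.strip(",.").lower() for w in text.split()}
            return sorted(words & NEEDS), sorted(words & set(GAZETTEER))

        triples = []                                                        # formalization + mapping
        for t in tweets:
            needs, places = extract(t)
            for place in places:
                for need in needs:
                    triples.append((place, "hasReliefNeed", need, GAZETTEER[place]))

        for subj, pred, obj, coord in triples:                              # "deployment"
            print(subj, pred, obj, coord)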

  8. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  9. Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.

    PubMed

    Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter

    2016-04-01

    Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step contained a comparison of the APC to other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts to describe medical procedures using the qualitative typological analysis approach. The definition analysis as well as the typological analysis are well-known and effective methods used in social sciences, but not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definition, leading to implicitly available, but not directly accessible information (hidden data), or the poor procedural type classification. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of health interventions: (1) Procedural type, (2) Anatomical site, (3) Medical device, (4) Pathology, (5) Access, (6) Body system, (7) Population, (8) Aim, (9) Discipline, (10) Technique, and (11) Body Function. These main characteristics were taken as input of classes for the formalization of the APC. We were also able to identify relevant relations between classes. The proposed four-step approach for formalizing the APC provides a novel, systematically developed, strong framework to semantically enrich procedure classifications. Although this methodology was designed to address the particularities of the APC, the included methods are based on generic analysis tasks, and therefore can be re-used to provide a systematic representation of other procedure catalogs or classification systems and hence contribute towards a universal alignment of such representations, if desired. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  11. Extraction and Analysis of Display Data

    NASA Technical Reports Server (NTRS)

    Land, Chris; Moye, Kathryn

    2008-01-01

    The Display Audit Suite is an integrated package of software tools that partly automates the detection of Portable Computer System (PCS) display errors. [PCS is a laptop computer used onboard the International Space Station (ISS).] The need for automation stems from the large quantity of PCS displays (6,000+, with 1,000,000+ lines of command and telemetry data). The Display Audit Suite includes data-extraction tools, automatic error detection tools, and database tools for generating analysis spreadsheets. These spreadsheets allow engineers to more easily identify many different kinds of possible errors. The Suite supports over 40 independent analyses and complements formal testing by being comprehensive (all displays can be checked) and by revealing errors that are difficult to detect via test. In addition, the Suite can be run early in the development cycle to find and correct errors in advance of testing.
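
    The flavor of such automated checks can be conveyed with a small hypothetical Python sketch: it pulls telemetry identifiers referenced by a display definition and flags those missing from a master telemetry dictionary; the file format, identifiers and dictionary are invented, not the actual PCS artifacts (note the deliberately misspelled identifier in the last line of the sample).

        # Flag display references that do not appear in the telemetry dictionary.
        import re

        TELEMETRY_DICTIONARY = {"PCS_TEMP_01", "PCS_VOLT_02", "PCS_PRESS_03"}

        display_source = """
            label temp  value=PCS_TEMP_01
            label volt  value=PCS_VOLT_02
            label press value=PCS_PRES_03
        """

        referenced = set(re.findall(r"value=(\w+)", display_source))
        unknown = sorted(referenced - TELEMETRY_DICTIONARY)

        print("referenced:", sorted(referenced))
        print("possible errors (not in dictionary):", unknown)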

  12. Fluid extraction across pumping and permeable walls in the viscous limit

    NASA Astrophysics Data System (ADS)

    Herschlag, G.; Liu, J.-G.; Layton, A. T.

    2016-04-01

    In biological transport mechanisms such as insect respiration and renal filtration, fluid travels along a leaky channel allowing material exchange with systems exterior to the channel. The channels in these systems may undergo peristaltic pumping which is thought to enhance the material exchange. To date, little analytic work has been done to study the effect of pumping on material extraction across the channel walls. In this paper, we examine a fluid extraction model in which fluid flowing through a leaky channel is exchanged with fluid in a reservoir. The channel walls are allowed to contract and expand uniformly, simulating a pumping mechanism. In order to efficiently determine solutions of the model, we derive a formal power series solution for the Stokes equations in a finite channel with uniformly contracting/expanding permeable walls. This flow has been well studied in the case in which the normal velocity at the channel walls is proportional to the wall velocity. In contrast we do not assume flow that is proportional to the wall velocity, but flow that is driven by hydrostatic pressure, and we use Darcy's law to close our system for normal wall velocity. We incorporate our flow solution into a model that tracks the material pressure exterior to the channel. We use this model to examine flux across the channel-reservoir barrier and demonstrate that pumping can either enhance or impede fluid extraction across channel walls. We find that associated with each set of physical flow and pumping parameters, there are optimal reservoir conditions that maximize the amount of material flowing from the channel into the reservoir.

  13. Learning in later life: participation in formal, non-formal and informal activities in a nationally representative Spanish sample.

    PubMed

    Villar, Feliciano; Celdrán, Montserrat

    2013-06-01

    This article examines the participation of Spanish older people in formal, non-formal and informal learning activities and presents a profile of participants in each kind of learning activity. We used data from a nationally representative sample of Spanish people between 60 and 75 years old (n = 4,703). The data were extracted from the 2007 Encuesta sobre la Participación de la Población Adulta en Actividades de Aprendizaje (EADA, Survey on Adult Population Involvement in Learning Activities). Overall, only 22.8% of the sample participated in a learning activity. However, there was wide variation in the participation rates for the different types of activity. Informal activities were far more common than formal ones. Multivariate logistic regression indicated that education level and involvement in social and cultural activities were associated with likelihood of participating, regardless of the type of learning activity. When these variables were taken into account, age did not predict decreasing participation, at least in non-formal and informal activities. Implications for further research, future trends and policies to promote older adult education are discussed.

  14. ccML, a new mark-up language to improve ISO/EN 13606-based electronic health record extracts practical edition.

    PubMed

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Cáceres, Jesús; Somolinos, Roberto; Pascual, Mario; Martínez, Ignacio; Salvador, Carlos H; Monteagudo, José Luis

    2013-01-01

    The objective of this paper is to introduce a new language called ccML, designed to provide convenient pragmatic information to applications using the ISO/EN13606 reference model (RM), such as electronic health record (EHR) extracts editors. EHR extracts are presently built using the syntactic and semantic information provided in the RM and constrained by archetypes. The ccML extra information enables the automation of the medico-legal context information edition, which is over 70% of the total in an extract, without modifying the RM information. ccML is defined using a W3C XML schema file. Valid ccML files complement the RM with additional pragmatics information. The ccML language grammar is defined using formal language theory as a single-type tree grammar. The new language is tested using an EHR extracts editor application as proof-of-concept system. Seven ccML PVCodes (predefined value codes) are introduced in this grammar to cope with different realistic EHR edition situations. These seven PVCodes have different interpretation strategies, from direct look up in the ccML file itself, to more complex searches in archetypes or system precomputation. The possibility to declare generic types in ccML gives rise to ambiguity during interpretation. The criterion used to overcome ambiguity is that specificity should prevail over generality. The opposite would make the individual specific element declarations useless. A new mark-up language ccML is introduced that opens up the possibility of providing applications using the ISO/EN13606 RM with the necessary pragmatics information to be practical and realistic.
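
    Purely as a hypothetical illustration of the "specificity prevails over generality" rule for interpreting generic declarations, the Python sketch below resolves an element against a tiny ccML-like XML fragment; the tag names, attributes and PVCode values are invented and do not reflect the actual ccML schema.

        # Resolve a reference-model element against specific and generic declarations.
        import xml.etree.ElementTree as ET

        CCML = """
        <ccml>
          <element name="COMPOSITION" pvcode="SYSTEM_PRECOMPUTED"/>
          <element name="*"           pvcode="LOOKUP_IN_CCML"/>
        </ccml>
        """

        declarations = ET.fromstring(CCML).findall("element")

        def resolve(rm_element_name):
            specific = [d for d in declarations if d.get("name") == rm_element_name]
            generic = [d for d in declarations if d.get("name") == "*"]
            return (specific or generic)[0].get("pvcode")   # specific declarations win

        print(resolve("COMPOSITION"))   # SYSTEM_PRECOMPUTED
        print(resolve("ENTRY"))         # LOOKUP_IN_CCML (generic fallback)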

  15. Spectral Cauchy characteristic extraction of strain, news and gravitational radiation flux

    NASA Astrophysics Data System (ADS)

    Handmer, Casey J.; Szilágyi, Béla; Winicour, Jeffrey

    2016-11-01

    We present a new approach for the Cauchy-characteristic extraction (CCE) of gravitational radiation strain, news function, and the flux of the energy-momentum, supermomentum and angular momentum associated with the Bondi-Metzner-Sachs asymptotic symmetries. In CCE, a characteristic evolution code takes numerical data on an inner worldtube supplied by a Cauchy evolution code, and propagates it outwards to obtain the space-time metric in a neighborhood of null infinity. The metric is first determined in a scrambled form in terms of coordinates determined by the Cauchy formalism. In prior treatments, the waveform is first extracted from this metric and then transformed into an asymptotic inertial coordinate system. This procedure provides the physically proper description of the waveform and the radiated energy but it does not generalize to determine the flux of angular momentum or supermomentum. Here we formulate and implement a new approach which transforms the full metric into an asymptotic inertial frame and provides a uniform treatment of all the radiation fluxes associated with the asymptotic symmetries. Computations are performed and calibrated using the spectral Einstein code.

  16. Machine Learning-based Intelligent Formal Reasoning and Proving System

    NASA Astrophysics Data System (ADS)

    Chen, Shengqing; Huang, Xiaojian; Fang, Jiaze; Liang, Jia

    2018-03-01

    Reasoning systems can be used in many fields, and improving reasoning efficiency is central to their design. By combining a formal description of formal proofs with a regular matching algorithm and then introducing a machine learning algorithm, the proposed intelligent formal reasoning and verification system achieves high efficiency. The experimental results show that the system can verify the correctness of propositional logic reasoning and reuse propositional logic reasoning results, so as to obtain the implicit knowledge in the knowledge base and provide a basic reasoning model for the construction of intelligent systems.

  17. The optical potential on the lattice

    DOE PAGES

    Agadjanov, Dimitri; Doring, Michael; Mai, Maxim; ...

    2016-06-08

    The extraction of hadron-hadron scattering parameters from lattice data by using the Luscher approach becomes increasingly complicated in the presence of inelastic channels. We propose a method for the direct extraction of the complex hadron-hadron optical potential on the lattice, which does not require the use of the multi-channel Luscher formalism. Furthermore, this method is applicable without modifications if some inelastic channels contain three or more particles.

  18. A theoretical framework for improving education in geriatric medicine.

    PubMed

    Boreham, N C

    1983-01-01

    Alternative concepts of learning include a formal system, in which part of the medical curriculum is designated as that for geriatric medicine; a non-formal system, including conferences, lectures and broadcasts available to both medical students and physicians; and an informal system, in which doctors learn medicine through their experience of practising the profession. While most of the emphasis in medical schools appears to be on the formal system, it is essential that medical educators, if they wish their students in later life to maintain high levels of self-initiated learning, use all three strategies. The structure of a system of formal teaching for geriatric medicine is examined. An important objective is attitude change, and it is in achieving this that geriatricians must be particularly involved in the non-formal and informal systems.

  19. A Tool for Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort to either manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., the costs of gearing up to apply them (time, expensive staff) and limited scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems, so system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. For non-trivial systems, the testing alternative always leaves some system behaviors unconfirmed and therefore is not the answer. Ensuring that a formal model is equivalent to the customer's requirements would require the customer to fully understand the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, is briefly described in this paper, and a prototype tool to support it is described. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.

  20. Formal System Verification - Extension 2

    DTIC Science & Technology

    2012-08-08

    vision of truly trustworthy systems has been to provide a formally verified microkernel basis. We have previously developed the seL4 microkernel ... together with a formal proof (in the theorem prover Isabelle/HOL) of its functional correctness [6]. This means that all the behaviours of the seL4 C ... source code are included in the high-level, formal specification of the kernel. This work enabled us to provide further formal guarantees about seL4, in ...

  1. Systems, methods and apparatus for verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Rash, James L. (Inventor); Gracinin, Denis (Inventor); Erickson, John D. (Inventor); Rouff, Christopher A. (Inventor); Hinchey, Michael G. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.

  2. Bottom-up approaches to strengthening child protection systems: Placing children, families, and communities at the center.

    PubMed

    Wessells, Michael G

    2015-05-01

    Efforts to strengthen national child protection systems have frequently taken a top-down approach of imposing formal, government-managed services. Such expert-driven approaches are often characterized by low use of formal services and the misalignment of the nonformal and formal aspects of the child protection system. This article examines an alternative approach of community-driven, bottom-up work that enables nonformal-formal collaboration and alignment, greater use of formal services, internally driven social change, and high levels of community ownership. The dominant approach of reliance on expert-driven Child Welfare Committees produces low levels of community ownership. Using an approach developed and tested in rural Sierra Leone, community-driven action, including collaboration and linkages with the formal system, promoted the use of formal services and achieved increased ownership, effectiveness, and sustainability of the system. The field needs less reliance on expert-driven approaches and much wider use of slower, community-driven, bottom-up approaches to child protection. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.

  3. Understanding terminological systems. II: Experience with conceptual and formal representation of structure.

    PubMed

    de Keizer, N F; Abu-Hanna, A

    2000-03-01

    This article describes the application of two popular conceptual and formal representation formalisms, as part of a framework for understanding terminological systems. A precise understanding of the structure of a terminological system is essential to assess existing terminological systems, to recognize patterns in various systems and to build new terminological systems. Our experience with the application of this framework to five well-known terminological systems is described.

  4. Skinner-Rusk unified formalism for higher-order systems

    NASA Astrophysics Data System (ADS)

    Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso

    2012-07-01

    The Lagrangian-Hamiltonian unified formalism of R. Skinner and R. Rusk was originally stated for autonomous dynamical systems in classical mechanics. It has been generalized for non-autonomous first-order mechanical systems, first-order and higher-order field theories, and higher-order autonomous systems. In this work we present a generalization of this formalism for higher-order non-autonomous mechanical systems.

  5. A Semantic Approach for Geospatial Information Extraction from Unstructured Documents

    NASA Astrophysics Data System (ADS)

    Sallaberry, Christian; Gaio, Mauro; Lesbegueries, Julien; Loustau, Pierre

    Local cultural heritage document collections are characterized by their content, which is strongly attached to a territory and its land history (i.e., geographical references). Our contribution aims at making the content retrieval process more efficient whenever a query includes geographic criteria. We propose a core model for a formal representation of geographic information. It takes into account characteristics of different modes of expression, such as written language, captures of drawings, maps, photographs, etc. We have developed a prototype that fully implements geographic information extraction (IE) and geographic information retrieval (IR) processes. All PIV prototype processing resources are designed as Web Services. We propose a geographic IE process based on semantic treatment as a supplement to classical IE approaches. We implement geographic IR by using intersection computing algorithms that seek out any intersection between formal geocoded representations of geographic information in a user query and similar representations in document collection indexes.
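
    Intersection-based geographic retrieval reduces, in its simplest form, to testing whether a document's indexed footprint overlaps the query footprint; the Python sketch below shows this with invented bounding boxes and document names, and is much cruder than the prototype's formal geocoded representations.

        # Return documents whose bounding box intersects the query footprint.
        def intersects(a, b):
            # boxes as (min_lon, min_lat, max_lon, max_lat)
            return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

        doc_index = {
            "doc_pau":     (-0.45, 43.25, -0.30, 43.35),
            "doc_bayonne": (-1.55, 43.45, -1.40, 43.55),
        }
        query_footprint = (-0.50, 43.20, -0.25, 43.40)   # hypothetical query region

        hits = [d for d, box in doc_index.items() if intersects(box, query_footprint)]
        print(hits)   # ['doc_pau']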

  6. Quantum crystallography: A perspective.

    PubMed

    Massa, Lou; Matta, Chérif F

    2018-06-30

    Extraction of the complete quantum mechanics from X-ray scattering data is the ultimate goal of quantum crystallography. This article delivers a perspective for that possibility. It is desirable to have a method for the conversion of X-ray diffraction data into an electron density that reflects the antisymmetry of an N-electron wave function. A formalism for this was developed early on for the determination of a constrained idempotent one-body density matrix. The formalism ensures pure-state N-representability in the single determinant sense. Applications to crystals show that quantum mechanical density matrices of large molecules can be extracted from X-ray scattering data by implementing a fragmentation method termed the kernel energy method (KEM). It is shown how KEM can be used within the context of quantum crystallography to derive quantum mechanical properties of biological molecules (with low data-to-parameters ratio). © 2017 Wiley Periodicals, Inc.
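
    For readers unfamiliar with the constraint being referred to, the standard single-determinant conditions on a one-electron density matrix P expressed in an atomic-orbital basis with overlap matrix S can be written as follows (textbook form in our own notation, not quoted from the article):

        P = P^{\dagger}, \qquad P S P = P, \qquad \operatorname{tr}(P S) = N

    i.e., P is Hermitian, idempotent with respect to the metric S, and normalized to the electron count N.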

  7. Optimization of permanent breast seed implant dosimetry incorporating tissue heterogeneity

    NASA Astrophysics Data System (ADS)

    Mashouf, Shahram

    Seed brachytherapy is currently used for adjuvant radiotherapy of early stage prostate and breast cancer patients. The current standard for calculation of dose around brachytherapy sources is based on the AAPM TG43 formalism, which generates the dose in homogeneous water medium. Recently, AAPM task group no. 186 (TG186) emphasized the importance of accounting for heterogeneities. In this work we introduce an analytical dose calculation algorithm in heterogeneous media using CT images. The advantages over other methods are computational efficiency and the ease of integration into clinical use. An Inhomogeneity Correction Factor (ICF) is introduced as the ratio of absorbed dose in tissue to that in water medium. ICF is a function of tissue properties and independent of the source structure. The ICF is extracted using CT images and the absorbed dose in tissue can then be calculated by multiplying the dose as calculated by the TG43 formalism times ICF. To evaluate the methodology, we compared our results with Monte Carlo simulations as well as experiments in phantoms with known density and atomic compositions. The dose distributions obtained through applying ICF to TG43 protocol agreed very well with those of Monte Carlo simulations and experiments in all phantoms. In all cases, the mean relative error was reduced by at least a factor of two when ICF correction factor was applied to the TG43 protocol. In conclusion we have developed a new analytical dose calculation method, which enables personalized dose calculations in heterogeneous media using CT images. The methodology offers several advantages including the use of standard TG43 formalism, fast calculation time and extraction of the ICF parameters directly from Hounsfield Units. The methodology was implemented into our clinical treatment planning system where a cohort of 140 patients were processed to study the clinical benefits of a heterogeneity corrected dose.
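
    The correction stated in the abstract, dose in tissue = TG43 dose × ICF, can be sketched voxel-by-voxel in Python; the mapping from CT Hounsfield Units to ICF used below is a made-up placeholder for illustration, not the published parameterization, and numpy is assumed.

        # Apply a heterogeneity correction factor to a TG43 dose grid.
        import numpy as np

        def icf_from_hu(hu):
            # placeholder monotonic mapping from Hounsfield Units to ICF (illustrative only)
            return 1.0 + 5e-5 * hu

        hu_grid = np.array([[-100.0, 0.0], [50.0, 300.0]])      # toy CT values
        dose_tg43 = np.array([[2.0, 1.5], [1.2, 0.8]])           # Gy, homogeneous-water dose

        dose_tissue = dose_tg43 * icf_from_hu(hu_grid)            # heterogeneity-corrected dose
        print(dose_tissue)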

  8. Lagrangian-Hamiltonian unified formalism for autonomous higher order dynamical systems

    NASA Astrophysics Data System (ADS)

    Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso

    2011-09-01

    The Lagrangian-Hamiltonian unified formalism of Skinner and Rusk was originally stated for autonomous dynamical systems in classical mechanics. It has been generalized for non-autonomous first-order mechanical systems, as well as for first-order and higher order field theories. However, a complete generalization to higher order mechanical systems is yet to be described. In this work, after reviewing the natural geometrical setting and the Lagrangian and Hamiltonian formalisms for higher order autonomous mechanical systems, we develop a complete generalization of the Lagrangian-Hamiltonian unified formalism for these kinds of systems, and we use it to analyze some physical models from this new point of view.

  9. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.

    DTIC Science & Technology

    1997-09-30

    set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has

  10. Hyperboloidal evolution of test fields in three spatial dimensions

    NASA Astrophysics Data System (ADS)

    Zenginoǧlu, Anıl; Kidder, Lawrence E.

    2010-06-01

    We present the numerical implementation of a clean solution to the outer boundary and radiation extraction problems within the 3+1 formalism for hyperbolic partial differential equations on a given background. Our approach is based on compactification at null infinity in hyperboloidal scri fixing coordinates. We report numerical tests for the particular example of a scalar wave equation on Minkowski and Schwarzschild backgrounds. We address issues related to the implementation of the hyperboloidal approach for the Einstein equations, such as nonlinear source functions, matching, and evaluation of formally singular terms at null infinity.

  11. CORESAFE: A Formal Approach against Code Replacement Attacks on Cyber Physical Systems

    DTIC Science & Technology

    2018-04-19

    AFRL-AFOSR-JP-TR-2018-0035. CORESAFE: A Formal Approach against Code Replacement Attacks on Cyber Physical Systems. Sandeep Shukla, Indian Institute of Technology Kanpur, India. Grant number FA2386-16-1-4099. Final report for AOARD Grant “CORESAFE: A Formal Approach against Code Replacement Attacks on Cyber Physical Systems”.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, Damien P.; Mooij, Sander; Postma, Marieke, E-mail: dpg39@cam.ac.uk, E-mail: sander.mooij@ing.uchile.cl, E-mail: mpostma@nikhef.nl

    We compute the one-loop renormalization group equations for Standard Model Higgs inflation. The calculation is done in the Einstein frame, using a covariant formalism for the multi-field system. All counterterms, and thus the beta functions, can be extracted from the radiative corrections to the two-point functions; the calculation of higher n-point functions then serves as a consistency check of the approach. We find that the theory is renormalizable in the effective field theory sense in the small, mid and large field regime. In the large field regime our results differ slightly from those found in the literature, due to a different treatment of the Goldstone bosons.

  13. Multielectron, multisubstrate molecular catalysis of electrochemical reactions: Formal kinetic analysis in the total catalysis regime.

    PubMed

    Costentin, Cyrille; Nocera, Daniel G; Brodsky, Casey N

    2017-10-24

    Cyclic voltammetry responses are derived for two-electron, two-step homogeneous electrocatalytic reactions in the total catalysis regime. The models developed provide a framework for extracting kinetic information from cyclic voltammograms (CVs) obtained in conditions under which the substrate or cosubstrate is consumed in a multielectron redox process, as is particularly prevalent for very active catalysts that promote energy conversion reactions. Such determination of rate constants in the total catalysis regime is a prerequisite for the rational benchmarking of molecular electrocatalysts that promote multielectron conversions of small-molecule reactants. The present analysis is illustrated with experimental systems encompassing various limiting behaviors.

  14. Verifying Hybrid Systems Modeled as Timed Automata: A Case Study

    DTIC Science & Technology

    1997-03-01

    Introduction Researchers have proposed many innovative formal methods for developing real-time systems [9]. Such methods can give system developers and...customers greater confidence that real-time systems satisfy their requirements, especially their critical requirements. However, applying formal methods...specifying and reasoning about real-time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools

  15. Guidelines for the Treatment of Hypothyroidism: Prepared by the American Thyroid Association Task Force on Thyroid Hormone Replacement

    PubMed Central

    Bianco, Antonio C.; Bauer, Andrew J.; Burman, Kenneth D.; Cappola, Anne R.; Celi, Francesco S.; Cooper, David S.; Kim, Brian W.; Peeters, Robin P.; Rosenthal, M. Sara; Sawka, Anna M.

    2014-01-01

    Background: A number of recent advances in our understanding of thyroid physiology may shed light on why some patients feel unwell while taking levothyroxine monotherapy. The purpose of this task force was to review the goals of levothyroxine therapy, the optimal prescription of conventional levothyroxine therapy, the sources of dissatisfaction with levothyroxine therapy, the evidence on treatment alternatives, and the relevant knowledge gaps. We wished to determine whether there are sufficient new data generated by well-designed studies to provide reason to pursue such therapies and change the current standard of care. This document is intended to inform clinical decision-making on thyroid hormone replacement therapy; it is not a replacement for individualized clinical judgment. Methods: Task force members identified 24 questions relevant to the treatment of hypothyroidism. The clinical literature relating to each question was then reviewed. Clinical reviews were supplemented, when relevant, with related mechanistic and bench research literature reviews, performed by our team of translational scientists. Ethics reviews were provided, when relevant, by a bioethicist. The responses to questions were formatted, when possible, in the form of a formal clinical recommendation statement. When responses were not suitable for a formal clinical recommendation, a summary response statement without a formal clinical recommendation was developed. For clinical recommendations, the supporting evidence was appraised, and the strength of each clinical recommendation was assessed, using the American College of Physicians system. The final document was organized so that each topic is introduced with a question, followed by a formal clinical recommendation. Stakeholder input was received at a national meeting, with some subsequent refinement of the clinical questions addressed in the document. Consensus was achieved for all recommendations by the task force. Results: We reviewed the following therapeutic categories: (i) levothyroxine therapy, (ii) non–levothyroxine-based thyroid hormone therapies, and (iii) use of thyroid hormone analogs. The second category included thyroid extracts, synthetic combination therapy, triiodothyronine therapy, and compounded thyroid hormones. Conclusions: We concluded that levothyroxine should remain the standard of care for treating hypothyroidism. We found no consistently strong evidence for the superiority of alternative preparations (e.g., levothyroxine–liothyronine combination therapy, or thyroid extract therapy, or others) over monotherapy with levothyroxine, in improving health outcomes. Some examples of future research needs include the development of superior biomarkers of euthyroidism to supplement thyrotropin measurements, mechanistic research on serum triiodothyronine levels (including effects of age and disease status, relationship with tissue concentrations, as well as potential therapeutic targeting), and long-term outcome clinical trials testing combination therapy or thyroid extracts (including subgroup effects). Additional research is also needed to develop thyroid hormone analogs with a favorable benefit to risk profile. PMID:25266247

  16. Guidelines for the treatment of hypothyroidism: prepared by the american thyroid association task force on thyroid hormone replacement.

    PubMed

    Jonklaas, Jacqueline; Bianco, Antonio C; Bauer, Andrew J; Burman, Kenneth D; Cappola, Anne R; Celi, Francesco S; Cooper, David S; Kim, Brian W; Peeters, Robin P; Rosenthal, M Sara; Sawka, Anna M

    2014-12-01

    A number of recent advances in our understanding of thyroid physiology may shed light on why some patients feel unwell while taking levothyroxine monotherapy. The purpose of this task force was to review the goals of levothyroxine therapy, the optimal prescription of conventional levothyroxine therapy, the sources of dissatisfaction with levothyroxine therapy, the evidence on treatment alternatives, and the relevant knowledge gaps. We wished to determine whether there are sufficient new data generated by well-designed studies to provide reason to pursue such therapies and change the current standard of care. This document is intended to inform clinical decision-making on thyroid hormone replacement therapy; it is not a replacement for individualized clinical judgment. Task force members identified 24 questions relevant to the treatment of hypothyroidism. The clinical literature relating to each question was then reviewed. Clinical reviews were supplemented, when relevant, with related mechanistic and bench research literature reviews, performed by our team of translational scientists. Ethics reviews were provided, when relevant, by a bioethicist. The responses to questions were formatted, when possible, in the form of a formal clinical recommendation statement. When responses were not suitable for a formal clinical recommendation, a summary response statement without a formal clinical recommendation was developed. For clinical recommendations, the supporting evidence was appraised, and the strength of each clinical recommendation was assessed, using the American College of Physicians system. The final document was organized so that each topic is introduced with a question, followed by a formal clinical recommendation. Stakeholder input was received at a national meeting, with some subsequent refinement of the clinical questions addressed in the document. Consensus was achieved for all recommendations by the task force. We reviewed the following therapeutic categories: (i) levothyroxine therapy, (ii) non-levothyroxine-based thyroid hormone therapies, and (iii) use of thyroid hormone analogs. The second category included thyroid extracts, synthetic combination therapy, triiodothyronine therapy, and compounded thyroid hormones. We concluded that levothyroxine should remain the standard of care for treating hypothyroidism. We found no consistently strong evidence for the superiority of alternative preparations (e.g., levothyroxine-liothyronine combination therapy, or thyroid extract therapy, or others) over monotherapy with levothyroxine, in improving health outcomes. Some examples of future research needs include the development of superior biomarkers of euthyroidism to supplement thyrotropin measurements, mechanistic research on serum triiodothyronine levels (including effects of age and disease status, relationship with tissue concentrations, as well as potential therapeutic targeting), and long-term outcome clinical trials testing combination therapy or thyroid extracts (including subgroup effects). Additional research is also needed to develop thyroid hormone analogs with a favorable benefit to risk profile.

  17. Formalization of the classification pattern: survey of classification modeling in information systems engineering.

    PubMed

    Partridge, Chris; de Cesare, Sergio; Mitchell, Andrew; Odell, James

    2018-01-01

    Formalization is becoming more common in all stages of the development of information systems, as a better understanding of its benefits emerges. Classification systems are ubiquitous, nowhere more so than in domain modeling. The classification pattern that underlies these systems provides a good case study of the move toward formalization in part because it illustrates some of the barriers to formalization, including the formal complexity of the pattern and the ontological issues surrounding the "one and the many." Powersets are a way of characterizing the (complex) formal structure of the classification pattern, and their formalization has been extensively studied in mathematics since Cantor's work in the late nineteenth century. One can use this formalization to develop a useful benchmark. There are various communities within information systems engineering (ISE) that are gradually working toward a formalization of the classification pattern. However, for most of these communities, this work is incomplete, in that they have not yet arrived at a solution with the expressiveness of the powerset benchmark. This contrasts with the early smooth adoption of powersets by other information systems communities to, for example, formalize relations. One way of understanding the varying rates of adoption is recognizing that the different communities have different historical baggage. Many conceptual modeling communities emerged from work done on database design, and this creates hurdles to the adoption of the high level of expressiveness of powersets. Another relevant factor is that these communities also often feel, particularly in the case of domain modeling, a responsibility to explain the semantics of whatever formal structures they adopt. This paper aims to make sense of the formalization of the classification pattern in ISE and surveys its history through the literature, starting from the relevant theoretical works of the mathematical literature and gradually shifting focus to the ISE literature. The literature survey follows the evolution of ISE's understanding of how to formalize the classification pattern. The various proposals are assessed using the classical example of classification, the Linnaean taxonomy, formalized using powersets as a benchmark for formal expressiveness. The broad conclusion of the survey is that (1) the ISE community is currently in the early stages of the process of understanding how to formalize the classification pattern, particularly in the requirements for expressiveness exemplified by powersets, and (2) that there is an opportunity to intervene and speed up the process of adoption by clarifying this expressiveness. Given the central place that the classification pattern has in domain modeling, this intervention has the potential to lead to significant improvements.
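
    To make the powerset benchmark concrete, the following minimal sketch (Python, with a tiny invented fragment of the Linnaean taxonomy) treats each higher taxon as an element of the powerset of the taxa below it, that is, as a set of its members; it is an illustration of the idea only, not anything proposed in the survey.

        # Powerset view of classification: a genus is modelled as a set of species
        # (an element of the powerset of all species); subsumption is the subset relation.
        species = {"Panthera leo", "Panthera tigris", "Felis catus"}

        genera = {
            "Panthera": {"Panthera leo", "Panthera tigris"},
            "Felis": {"Felis catus"},
        }

        def classifies(taxon: set, a_species: str) -> bool:
            """A species falls under a taxon iff it is a member of that taxon."""
            return a_species in taxon

        def subsumes(higher: set, lower: set) -> bool:
            """A taxon subsumes another iff every member of the lower taxon is in the higher."""
            return lower <= higher

        # Every genus is an element of the powerset of the species set.
        assert all(g <= species for g in genera.values())

        felidae = genera["Panthera"] | genera["Felis"]   # the family as a union of genera
        assert classifies(genera["Panthera"], "Panthera leo")
        assert subsumes(felidae, genera["Felis"])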

  18. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  19. Field-antifield and BFV formalisms for quadratic systems with open gauge algebras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nirov, K.S.; Razumov, A.V.

    1992-09-20

    In this paper the Lagrangian field-antifield (BV) and Hamiltonian (BFV) BRST formalisms for the general quadratic systems with open gauge algebra are considered. The equivalence between the Lagrangian and Hamiltonian formalisms is proven.

  20. Working the College System: Six Strategies for Building a Personal Powerbase

    ERIC Educational Resources Information Center

    Simplicio, Joseph S. C.

    2008-01-01

    Within each college system there are prescribed formalized methods for accomplishing tasks and achieving established goals. To truly understand how a college, or any large organization, functions, it is vital to understand the basis of the formal structure. Those individuals who understand formal systems within a college can use this knowledge to…

  1. Managing search complexity in linguistic geometry.

    PubMed

    Stilman, B

    1997-01-01

    This paper is a new step in the development of linguistic geometry. This formal theory is intended to discover and generalize the inner properties of human expert heuristics, which have been successful in a certain class of complex control systems, and apply them to different systems. In this paper, we investigate heuristics extracted in the form of hierarchical networks of planning paths of autonomous agents. Employing linguistic geometry tools, the dynamic hierarchy of networks is represented as a hierarchy of formal attribute languages. The main ideas of this methodology are shown in the paper on two pilot examples of the solution of complex optimization problems. The first example is a problem of strategic planning for air combat, in which concurrent actions of four vehicles are simulated as serial interleaving moves. The second example is a problem of strategic planning for the space combat of eight autonomous vehicles (with interleaving moves) that requires generation of a search tree of depth 25 with a branching factor of 30. This is beyond the capabilities of modern and conceivable future computers (employing conventional approaches). In both examples the linguistic geometry tools showed deep and highly selective searches in comparison with conventional search algorithms. For the first example a sketch of the proof of optimality of the solution is considered.
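
    A rough back-of-the-envelope calculation, not taken from the paper, shows why exhaustive search is infeasible for the second example: a uniform tree with branching factor 30 and depth 25 already contains on the order of 10^36 nodes.

        # Illustrative arithmetic only: size of an exhaustive search tree with
        # branching factor 30 and depth 25. The paper's point is that conventional
        # search cannot enumerate a space of this size, whereas linguistic geometry
        # avoids doing so through highly selective search.
        branching_factor = 30
        depth = 25

        leaves = branching_factor ** depth
        total_nodes = (branching_factor ** (depth + 1) - 1) // (branching_factor - 1)

        print(f"leaf nodes:  {leaves:.3e}")        # ~8.5e+36
        print(f"total nodes: {total_nodes:.3e}")   # ~8.8e+36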

  2. Systems, methods and apparatus for implementation of formal specifications derived from informal requirements

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.

  3. A Comparitive Study of Subject Knowledge of B.Ed Graduates of Formal and Non-Formal Teacher Education Systems

    ERIC Educational Resources Information Center

    Saif, Perveen; Reba, Amjad; ud Din, Jalal

    2017-01-01

    This study was designed to compare the subject knowledge of B.Ed graduates of formal and non-formal teacher education systems. The population of the study included all teachers from Girls High and Higher Secondary Schools both from private and public sectors from the district of Peshawar. Out of the total population, twenty schools were randomly…

  4. Subsumption principles underlying medical concept systems and their formal reconstruction.

    PubMed Central

    Bernauer, J.

    1994-01-01

    Conventional medical concept systems represent generic concept relations by hierarchical coding principles. Often, these coding principles constrain the concept system and reduce the potential for automatic derivation of subsumption. Formal reconstruction of medical concept systems is an approach that is based on the conceptual representation of meanings and allows for the application of formal criteria for subsumption. Those criteria must reflect intuitive principles of subordination which underlie conventional medical concept systems. Particularly these are: The subordinate concept results (1) from adding a specializing criterion to the superordinate concept, (2) from refining the primary category, or a criterion of the superordinate concept, by a concept that is less general, (3) from adding a partitive criterion to a criterion of the superordinate, (4) from refining a criterion by a concept that is less comprehensive, and finally (5) from coordinating the superordinate concept, or one of its criteria. This paper introduces a formalism called BERNWARD that aims at the formal reconstruction of medical concept systems according to these intuitive principles. The automatic derivation of hierarchical relations is primarily supported by explicit generic and explicit partitive hierarchies of concepts, and secondly by two formal criteria based on the structure of concept descriptions and explicit hierarchical relations between their elements, namely: formal subsumption and part-sensitive subsumption. Formal subsumption takes only generic relations into account; part-sensitive subsumption additionally regards partitive relations between criteria. This approach seems to be flexible enough to cope with unforeseeable effects of partitive criteria on subsumption. PMID:7949907
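
    As an illustration of the two formal criteria described above, the following Python sketch checks subsumption over toy concept descriptions (a primary category plus a set of criteria) against explicit generic and partitive hierarchies; the concept names are invented and this is not a reconstruction of BERNWARD.

        # Toy concept descriptions: a primary category plus a set of criteria.
        # Explicit generic ("is-a") and partitive ("part-of") hierarchies drive subsumption.
        IS_A = {"ulcer": "lesion"}                                        # child -> parent
        PART_OF = {"duodenal bulb": "duodenum", "duodenum": "intestine"}  # part -> whole

        def is_a(child, ancestor):
            while child is not None:
                if child == ancestor:
                    return True
                child = IS_A.get(child)
            return False

        def is_part_of(part, whole):
            while part is not None:
                if part == whole:
                    return True
                part = PART_OF.get(part)
            return False

        def formally_subsumes(general, specific):
            """Formal subsumption: only generic relations are taken into account."""
            return (is_a(specific["category"], general["category"]) and
                    all(any(is_a(s, g) for s in specific["criteria"])
                        for g in general["criteria"]))

        def part_sensitively_subsumes(general, specific):
            """Part-sensitive subsumption: partitive relations between criteria also count."""
            return (is_a(specific["category"], general["category"]) and
                    all(any(is_a(s, g) or is_part_of(s, g) for s in specific["criteria"])
                        for g in general["criteria"]))

        intestinal_lesion = {"category": "lesion", "criteria": {"intestine"}}
        bulb_ulcer = {"category": "ulcer", "criteria": {"duodenal bulb"}}
        assert not formally_subsumes(intestinal_lesion, bulb_ulcer)
        assert part_sensitively_subsumes(intestinal_lesion, bulb_ulcer)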

  5. Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering

    NASA Technical Reports Server (NTRS)

    Bolton, Matthew L.; Bass, Ellen J.

    2009-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems with safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to use model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient controlled analgesia pump, the initial model proved to be difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.

  6. ccML, a new mark-up language to improve ISO/EN 13606-based electronic health record extracts practical edition

    PubMed Central

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Cáceres, Jesús; Somolinos, Roberto; Pascual, Mario; Martínez, Ignacio; Salvador, Carlos H; Monteagudo, José Luis

    2013-01-01

    Objective The objective of this paper is to introduce a new language called ccML, designed to provide convenient pragmatic information to applications using the ISO/EN13606 reference model (RM), such as electronic health record (EHR) extracts editors. EHR extracts are presently built using the syntactic and semantic information provided in the RM and constrained by archetypes. The ccML extra information enables the automation of the medico-legal context information edition, which is over 70% of the total in an extract, without modifying the RM information. Materials and Methods ccML is defined using a W3C XML schema file. Valid ccML files complement the RM with additional pragmatics information. The ccML language grammar is defined using formal language theory as a single-type tree grammar. The new language is tested using an EHR extracts editor application as proof-of-concept system. Results Seven ccML PVCodes (predefined value codes) are introduced in this grammar to cope with different realistic EHR edition situations. These seven PVCodes have different interpretation strategies, from direct look up in the ccML file itself, to more complex searches in archetypes or system precomputation. Discussion The possibility to declare generic types in ccML gives rise to ambiguity during interpretation. The criterion used to overcome ambiguity is that specificity should prevail over generality. The opposite would make the individual specific element declarations useless. Conclusion A new mark-up language ccML is introduced that opens up the possibility of providing applications using the ISO/EN13606 RM with the necessary pragmatics information to be practical and realistic. PMID:23019241
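
    Since ccML is defined by a W3C XML Schema file, a hedged sketch of how an application might validate a ccML document against that schema is shown below (Python with lxml; the file names are hypothetical and the real ccML vocabulary is not reproduced here):

        # Sketch: validate an XML document against a W3C XML Schema (XSD) with lxml.
        # "ccml.xsd" and "extract_context.xml" are placeholder file names; the actual
        # ccML schema and element names are defined by the authors and not shown here.
        from lxml import etree

        schema = etree.XMLSchema(etree.parse("ccml.xsd"))
        doc = etree.parse("extract_context.xml")

        if schema.validate(doc):
            print("document conforms to the schema")
        else:
            for error in schema.error_log:
                print(f"line {error.line}: {error.message}")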

  7. Photon scattering from a system of multilevel quantum emitters. I. Formalism

    NASA Astrophysics Data System (ADS)

    Das, Sumanta; Elfving, Vincent E.; Reiter, Florentin; Sørensen, Anders S.

    2018-04-01

    We introduce a formalism to solve the problem of photon scattering from a system of multilevel quantum emitters. Our approach provides a direct solution of the scattering dynamics. As such the formalism gives the scattered fields' amplitudes in the limit of a weak incident intensity. Our formalism is equipped to treat both multiemitter and multilevel emitter systems, and is applicable to a plethora of photon-scattering problems, including conditional state preparation by photodetection. In this paper, we develop the general formalism for an arbitrary geometry. In the following paper (part II) S. Das et al. [Phys. Rev. A 97, 043838 (2018), 10.1103/PhysRevA.97.043838], we reduce the general photon-scattering formalism to a form that is applicable to one-dimensional waveguides and show its applicability by considering explicit examples with various emitter configurations.

  8. A perspective on bridging scales and design of models using low-dimensional manifolds and data-driven model inference

    PubMed Central

    Zenil, Hector; Kiani, Narsis A.; Ball, Gordon; Gomez-Cabrero, David

    2016-01-01

    Systems in nature capable of collective behaviour are nonlinear, operating across several scales. Yet our ability to account for their collective dynamics differs in physics, chemistry and biology. Here, we briefly review the similarities and differences between mathematical modelling of adaptive living systems versus physico-chemical systems. We find that physics-based chemistry modelling and computational neuroscience have a shared interest in developing techniques for model reductions aiming at the identification of a reduced subsystem or slow manifold, capturing the effective dynamics. By contrast, as relations and kinetics between biological molecules are less characterized, current quantitative analysis under the umbrella of bioinformatics focuses on signal extraction, correlation, regression and machine-learning analysis. We argue that model reduction analysis and the ensuing identification of manifolds bridges physics and biology. Furthermore, modelling living systems presents deep challenges as how to reconcile rich molecular data with inherent modelling uncertainties (formalism, variables selection and model parameters). We anticipate a new generative data-driven modelling paradigm constrained by identified governing principles extracted from low-dimensional manifold analysis. The rise of a new generation of models will ultimately connect biology to quantitative mechanistic descriptions, thereby setting the stage for investigating the character of the model language and principles driving living systems. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698038
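
    As a generic illustration of the kind of model-reduction step discussed above, the sketch below projects simulated high-dimensional trajectories onto their leading principal components with a plain SVD (Python/NumPy; the data and the choice of two components are invented for the example and are not from the article):

        # Generic model-reduction sketch: find a low-dimensional linear subspace
        # (leading principal components) that captures most of the variance of
        # high-dimensional trajectory data. The data here are synthetic.
        import numpy as np

        rng = np.random.default_rng(0)
        t = np.linspace(0, 10, 500)
        slow = np.column_stack([np.sin(t), np.cos(t)])            # 2 slow "latent" modes
        mixing = rng.normal(size=(2, 50))                         # embed into 50 dimensions
        data = slow @ mixing + 0.01 * rng.normal(size=(500, 50))  # observed trajectories

        centered = data - data.mean(axis=0)
        U, s, Vt = np.linalg.svd(centered, full_matrices=False)
        explained = s**2 / np.sum(s**2)

        k = 2                                   # dimension of the reduced model
        reduced = centered @ Vt[:k].T           # coordinates on the low-dimensional subspace
        print(f"variance captured by {k} components: {explained[:k].sum():.3f}")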

  9. Recognition of disturbances with specified morphology in time series. Part 1: Spikes on magnetograms of the worldwide INTERMAGNET network

    NASA Astrophysics Data System (ADS)

    Bogoutdinov, Sh. R.; Gvishiani, A. D.; Agayan, S. M.; Solovyev, A. A.; Kin, E.

    2010-11-01

    The International Real-time Magnetic Observatory Network (INTERMAGNET) is the world's biggest international network of ground-based observatories, providing geomagnetic data almost in real time (within 72 hours of collection) [Kerridge, 2001]. The observation data are rapidly transferred by the observatories participating in the program to regional Geomagnetic Information Nodes (GINs), which carry out a global exchange of data and process the results. Observation and study of the Earth's main (core) magnetic field is one of the key problems of geophysics. The INTERMAGNET system is the basis of monitoring the state of the Earth's magnetic field; therefore, the information provided by the system is required to be very reliable. Despite the stringent quality standards of the recording devices, they are subject to external effects that affect the quality of the records. Therefore, objective and formalized recognition, with subsequent remedy, of the anomalies (artifacts) that occur on the records is an important task. Expanding on the ideas of Agayan [Agayan et al., 2005] and Gvishiani [Gvishiani et al., 2008a; 2008b], this paper suggests a new algorithm for the automatic recognition of anomalies with specified morphology, capable of identifying both physically- and anthropogenically-derived spikes on the magnetograms. The algorithm is constructed using fuzzy logic and, as such, is highly adaptive and universal. The developed algorithmic system formalizes the work of the expert-interpreter in terms of artificial intelligence. This ensures identical processing of large data arrays, almost unattainable manually. Besides the algorithm, the paper also reports on the application of the developed algorithmic system for identifying spikes at the INTERMAGNET observatories. The main achievement of the work is the creation of an algorithm permitting the almost unmanned extraction of spike-free (definitive) magnetograms from preliminary records. This automated system is the first to apply fuzzy logic to geomagnetic measurements. It is important to note that the recognition of time disturbances is formalized and identical. The algorithm presented here appreciably increases the reliability of spike-free INTERMAGNET magnetograms, thus increasing the objectivity of our knowledge of the Earth's magnetic field. At the same time, the created system can accomplish identical, formalized, and retrospective analysis of large archives of digital and digitized magnetograms accumulated in the system of Worldwide Data Centers. A relevant project has already been initiated as a collaboration between the Worldwide Data Center at the Geophysical Center (Russian Academy of Sciences) and the NOAA National Geophysical Data Center (United States). Thus, by improving and adding objectivity to both new and historical initial data, the developed algorithmic system may contribute appreciably to improving our understanding of the Earth's magnetic field.
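
    The algorithm itself is built on fuzzy logic and is not reproduced here; purely as an illustration of the task (flagging isolated spikes in a one-minute magnetogram time series), a naive threshold detector might look like the following sketch (Python, synthetic data, invented threshold):

        # Naive spike flagging on a 1-D time series, shown only to illustrate the task.
        # The algorithm described in the paper uses fuzzy logic and is far more robust.
        import numpy as np

        rng = np.random.default_rng(1)
        signal = np.cumsum(rng.normal(0, 0.1, 1440))   # synthetic 1-day, 1-min magnetogram
        signal[[200, 850]] += 25.0                     # inject two artificial spikes

        # Deviation from a running median; large, isolated deviations are flagged.
        window = 11
        pad = window // 2
        padded = np.pad(signal, pad, mode="edge")
        running_median = np.array([np.median(padded[i:i + window]) for i in range(len(signal))])
        deviation = np.abs(signal - running_median)

        threshold = 10 * np.median(deviation)          # arbitrary illustrative threshold
        spikes = np.flatnonzero(deviation > threshold)
        print("flagged sample indices:", spikes)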

  10. Understanding Rape Survivors' Decisions Not to Seek Help from Formal Social Systems

    ERIC Educational Resources Information Center

    Patterson, Debra; Greeson, Megan; Campbell, Rebecca

    2009-01-01

    Few rape survivors seek help from formal social systems after their assault. The purpose of this study was to examine factors that prevent survivors from seeking help from the legal, medical, and mental health systems and rape crisis centers. In this study, 29 female rape survivors who did not seek any postassault formal help were interviewed…

  11. Formal logic rewrite system bachelor in teaching mathematical informatics

    NASA Astrophysics Data System (ADS)

    Habiballa, Hashim; Jendryscik, Radek

    2017-07-01

    The article presents the capabilities of the formal rewrite logic system Bachelor for teaching theoretical computer science (mathematical informatics). The system Bachelor enables a constructivist approach to teaching and therefore may enhance the learning process in hard, essential informatics disciplines. It not only provides a detailed description of the formal rewrite process but can also demonstrate algorithmic principles for manipulating logic formulae.

  12. A Formalization of HIPAA for a Medical Messaging System

    NASA Astrophysics Data System (ADS)

    Lam, Peifung E.; Mitchell, John C.; Sundaram, Sharada

    The complexity of regulations in healthcare, financial services, and other industries makes it difficult for enterprises to design and deploy effective compliance systems. We believe that in some applications, it may be practical to support compliance by using formalized portions of applicable laws to regulate business processes that use information systems. In order to explore this possibility, we use a stratified fragment of Prolog with limited use of negation to formalize a portion of the US Health Insurance Portability and Accountability Act (HIPAA). As part of our study, we also explore the deployment of our formalization in a prototype hospital Web portal messaging system.

  13. Systems, methods and apparatus for pattern matching in procedure development and verification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset, which in turn can improve confidence that the system reflects the requirements, and in turn reduces system development time and reduces the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then, the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.

  14. Formalizing structured file services for the data storage and retrieval subsystem of the data management system for Spacestation Freedom

    NASA Technical Reports Server (NTRS)

    Jamsek, Damir A.

    1993-01-01

    A brief example of the use of formal methods techniques in the specification of a software system is presented. The report is part of a larger effort targeted at defining a formal methods pilot project for NASA. One possible application domain that may be used to demonstrate the effective use of formal methods techniques within the NASA environment is presented. It is not intended to provide a tutorial on either formal methods techniques or the application being addressed. It should, however, provide an indication that the application being considered is suitable for a formal methods approach by showing how such a task may be started. The particular system being addressed is the Structured File Services (SFS), which is a part of the Data Storage and Retrieval Subsystem (DSAR), which in turn is part of the Data Management System (DMS) onboard Spacestation Freedom. This is a software system that is currently under development for NASA. An informal mathematical development is presented. Section 3 contains the same development using Penelope (23), an Ada specification and verification system. The complete text of the English version Software Requirements Specification (SRS) is reproduced in Appendix A.

  15. Formal Methods Tool Qualification

    NASA Technical Reports Server (NTRS)

    Wagner, Lucas G.; Cofer, Darren; Slind, Konrad; Tinelli, Cesare; Mebsout, Alain

    2017-01-01

    Formal methods tools have been shown to be effective at finding defects in safety-critical digital systems including avionics systems. The publication of DO-178C and the accompanying formal methods supplement DO-333 allows applicants to obtain certification credit for the use of formal methods without providing justification for them as an alternative method. This project conducted an extensive study of existing formal methods tools, identifying obstacles to their qualification and proposing mitigations for those obstacles. Further, it interprets the qualification guidance for existing formal methods tools and provides case study examples for open source tools. This project also investigates the feasibility of verifying formal methods tools by generating proof certificates which capture proof of the formal methods tool's claim, which can be checked by an independent, proof certificate checking tool. Finally, the project investigates the feasibility of qualifying this proof certificate checker, in the DO-330 framework, in lieu of qualifying the model checker itself.

  16. What is the right formalism to search for resonances?

    NASA Astrophysics Data System (ADS)

    Mikhasenko, M.; Pilloni, A.; Nys, J.; Albaladejo, M.; Fernández-Ramírez, C.; Jackura, A.; Mathieu, V.; Sherrill, N.; Skwarnicki, T.; Szczepaniak, A. P.

    2018-03-01

    Hadron decay chains constitute one of the main sources of information on the QCD spectrum. We discuss the differences between several partial wave analysis formalisms used in the literature to build the amplitudes. We match the helicity amplitudes to the covariant tensor basis. Hereby, we pay attention to the analytical properties of the amplitudes and separate singularities of kinematical and dynamical nature. We study the analytical properties of the spin-orbit (LS) formalism, and some of the covariant tensor approaches. In particular, we explicitly build the amplitudes for the B→ ψ π K and B→ \\bar{D}π π decays, and show that the energy dependence of the covariant approach is model dependent. We also show that the usual recursive construction of covariant tensors explicitly violates crossing symmetry, which would lead to different resonance parameters extracted from scattering and decay processes.

  17. Generalized Bondi-Sachs equations for characteristic formalism of numerical relativity

    NASA Astrophysics Data System (ADS)

    Cao, Zhoujian; He, Xiaokai

    2013-11-01

    The Cauchy formalism of numerical relativity has been successfully applied to simulate various dynamical spacetimes without any symmetry assumption. But discovering how to set a mathematically consistent and physically realistic boundary condition is still an open problem for the Cauchy formalism. In addition, numerical truncation error and finite-region ambiguity affect the accuracy of gravitational waveform calculation. As to the finite-region ambiguity issue, the characteristic extraction method helps considerably, but it does not solve all of the above issues. Besides the above problems for the Cauchy formalism, computational efficiency is another concern. Although the characteristic formalism of numerical relativity suffers from the difficulty of caustics in the inner near zone, it has advantages with respect to all of the issues listed above. Cauchy-characteristic matching (CCM) is a possible way to take advantage of the characteristic formalism regarding these issues and treat the inner caustics at the same time. CCM has difficulty treating the gauge difference between the Cauchy part and the characteristic part. We propose generalized Bondi-Sachs equations for the characteristic formalism for the purpose of Cauchy-characteristic matching. Our proposal yields a possible common numerical evolution scheme for both the Cauchy part and the characteristic part. Moreover, our generalized Bondi-Sachs equations have one adjustable gauge freedom which can be matched to the gauge used in the Cauchy part, so that the Cauchy part and the characteristic part share a consistent gauge condition. Our proposal thus gives a possible new starting point for Cauchy-characteristic matching.

  18. Using formal specification in the Guidance and Control Software (GCS) experiment. Formal design and verification technology for life critical systems

    NASA Technical Reports Server (NTRS)

    Weber, Doug; Jamsek, Damir

    1994-01-01

    The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software to control the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real example spacecraft, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.

  19. Interaction of spatially separated oscillating solitons in biased two-photon photorefractive materials

    NASA Astrophysics Data System (ADS)

    Asif, Noushin; Biswas, Anjan; Jovanoski, Z.; Konar, S.

    2015-01-01

    This paper presents the dynamics of two spatially separated optical solitons in two-photon photorefractive materials. The variational formalism has been employed to derive evolution equations of different parameters which characterize the dynamics of two interacting solitons. This approach yields a system of coupled ordinary differential equations for evolution of different parameters characterizing solitons such as amplitude, spatial width, chirp, center of gravity, etc., which have been subsequently solved adopting numerical method to extract information on their dynamics. Depending on their initial separation and power, solitons are shown to either disperse or compresses individually and attract each other. Dragging and trapping of a probe soliton by another pump have been discussed.
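
    A hedged sketch of the kind of numerical step mentioned above, integrating a small coupled ODE system for soliton parameters in the propagation variable, is given below (Python/SciPy); the right-hand side is an invented toy system and is not the set of equations derived in the paper.

        # Generic sketch: integrate a small system of coupled ODEs of the kind produced
        # by a variational reduction (parameters such as width and separation evolving
        # in the propagation variable z). The right-hand side below is an invented toy
        # system, NOT the equations derived in the paper.
        import numpy as np
        from scipy.integrate import solve_ivp

        def rhs(z, y):
            width, separation = y
            d_width = 0.1 * (1.0 / width**3 - width)        # toy compression/dispersion balance
            d_separation = -0.05 * separation / width**2    # toy mutual attraction
            return [d_width, d_separation]

        solution = solve_ivp(rhs, (0.0, 50.0), y0=[1.2, 4.0], dense_output=True)
        z = np.linspace(0.0, 50.0, 6)
        for zi, (w, s) in zip(z, solution.sol(z).T):
            print(f"z = {zi:5.1f}   width = {w:6.3f}   separation = {s:6.3f}")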

  20. Evaluation of light extraction efficiency for the light-emitting diodes based on the transfer matrix formalism and ray-tracing method

    NASA Astrophysics Data System (ADS)

    Pingbo, An; Li, Wang; Hongxi, Lu; Zhiguo, Yu; Lei, Liu; Xin, Xi; Lixia, Zhao; Junxi, Wang; Jinmin, Li

    2016-06-01

    The internal quantum efficiency (IQE) of light-emitting diodes can be calculated as the ratio of the external quantum efficiency (EQE) to the light extraction efficiency (LEE). The EQE can be measured experimentally, but the LEE is difficult to calculate due to the complicated LED structures. In this work, a model was established to calculate the LEE by combining the transfer matrix formalism and an in-plane ray-tracing method. With the calculated LEE, the IQE was determined and was in good agreement with that obtained by the ABC model and the temperature-dependent photoluminescence method. The proposed method makes the determination of the IQE more practical and convenient. Project supported by the National Natural Science Foundation of China (Nos. 11574306, 61334009), the China International Science and Technology Cooperation Program (No. 2014DFG62280), and the National High Technology Program of China (No. 2015AA03A101).
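
    The relation underlying the method above is simply (symbols written here in LaTeX notation for clarity):

        \eta_{\mathrm{IQE}} \;=\; \frac{\eta_{\mathrm{EQE}}}{\eta_{\mathrm{LEE}}}

    so a measured EQE, combined with the LEE computed from the transfer-matrix and ray-tracing model, yields the IQE.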

  1. Provable Transient Recovery for Frame-Based, Fault-Tolerant Computing Systems

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Butler, Ricky W.

    1992-01-01

    We present a formal verification of the transient fault recovery aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system architecture for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization accommodates a wide variety of voting schemes for purging the effects of transients.

  2. Indigenous Knowledge and Education from the Quechua Community to School: Beyond the Formal/Non-Formal Dichotomy

    ERIC Educational Resources Information Center

    Sumida Huaman, Elizabeth; Valdiviezo, Laura Alicia

    2014-01-01

    In this article, we propose to approach Indigenous education beyond the formal/non-formal dichotomy. We argue that there is a critical need to conscientiously include Indigenous knowledge in education processes from the school to the community; particularly, when formal systems exclude Indigenous cultures and languages. Based on ethnographic…

  3. Connecting Architecture and Implementation

    NASA Astrophysics Data System (ADS)

    Buchgeher, Georg; Weinreich, Rainer

    Software architectures are still typically defined and described independently from implementation. To avoid architectural erosion and drift, architectural representation needs to be continuously updated and synchronized with system implementation. Existing approaches for architecture representation like informal architecture documentation, UML diagrams, and Architecture Description Languages (ADLs) provide only limited support for connecting architecture descriptions and implementations. Architecture management tools like Lattix, SonarJ, and Sotoarc and UML-tools tackle this problem by extracting architecture information directly from code. This approach works for low-level architectural abstractions like classes and interfaces in object-oriented systems but fails to support architectural abstractions not found in programming languages. In this paper we present an approach for linking and continuously synchronizing a formalized architecture representation to an implementation. The approach is a synthesis of functionality provided by code-centric architecture management and UML tools and higher-level architecture analysis approaches like ADLs.

  4. FORMAL MODELING, MONITORING, AND CONTROL OF EMERGENCE IN DISTRIBUTED CYBER PHYSICAL SYSTEMS

    DTIC Science & Technology

    2018-02-23

    Formal Modeling, Monitoring, and Control of Emergence in Distributed Cyber-Physical Systems. University of Texas at Arlington, February 2018, final report; period covered April 2015 – April 2017. Abstract: This project studied emergent behavior in distributed cyber-physical systems (DCPS). Emergent

  5. Properties of a Formal Method to Model Emergence in Swarm-Based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.

  6. Unified formalism for higher order non-autonomous dynamical systems

    NASA Astrophysics Data System (ADS)

    Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso

    2012-03-01

    This work is devoted to giving a geometric framework for describing higher order non-autonomous mechanical systems. The starting point is to extend the Lagrangian-Hamiltonian unified formalism of Skinner and Rusk for these kinds of systems, generalizing previous developments for higher order autonomous mechanical systems and first-order non-autonomous mechanical systems. Then, we use this unified formulation to derive the standard Lagrangian and Hamiltonian formalisms, including the Legendre-Ostrogradsky map and the Euler-Lagrange and the Hamilton equations, both for regular and singular systems. As applications of our model, two examples of regular and singular physical systems are studied.

  7. Platonic Symmetry and Geometric Thinking

    ERIC Educational Resources Information Center

    Zsombor-Murray, Paul

    2007-01-01

    Cubic symmetry is used to build the other four Platonic solids and some formalism from classical geometry is introduced. Initially, the approach is via geometric construction, e.g., the "golden ratio" is necessary to construct an icosahedron with pentagonal faces. Then conventional elementary vector algebra is used to extract quantitative…

  8. NASA Formal Methods Workshop, 1990

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Compiler)

    1990-01-01

    The workshop brought together researchers involved in the NASA formal methods research effort for detailed technical interchange and provided a mechanism for interaction with representatives from the FAA and the aerospace industry. The workshop also included speakers from industry to debrief the formal methods researchers on the current state of practice in flight critical system design, verification, and certification. The goals were: define and characterize the verification problem for ultra-reliable life critical flight control systems and the current state of practice in industry today; determine the proper role of formal methods in addressing these problems, and assess the state of the art and recent progress toward applying formal methods to this area.

  9. Formal mechanization of device interactions with a process algebra

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, Karl; Cohen, Gerald C.

    1992-01-01

    The principal emphasis is to develop a methodology to formally verify correct synchronization communication of devices in a composed hardware system. Previous system integration efforts have focused on vertical integration of one layer on top of another. This task examines 'horizontal' integration of peer devices. To formally reason about communication, we mechanize a process algebra in the Higher Order Logic (HOL) theorem proving system. Using this formalization we show how four types of device interactions can be represented and verified to behave as specified. The report also describes the specification of a system consisting of an AVM-1 microprocessor and a memory management unit which were verified in previous work. A proof of correct communication is presented, and the extensions to the system specification to add a direct memory device are discussed.

  10. Perceptions of the value of traditional ecological knowledge to formal school curricula: opportunities and challenges from Malekula Island, Vanuatu

    PubMed Central

    2011-01-01

    Background The integration of traditional ecological knowledge (TEK) into formal school curricula may be a key tool for the revitalisation of biocultural diversity, and has the potential to improve the delivery of educational objectives. This paper explores perceptions of the value of TEK to formal education curricula on Malekula Island, Vanuatu. We conducted 49 interviews with key stakeholders (local TEK experts, educators, and officials) regarding the use of the formal school system to transmit, maintain, and revitalise TEK. Interviews also gathered information on the areas where TEK might add value to school curricula and on the perceived barriers to maintaining and revitalising TEK via formal education programs. Results Participants reported that TEK had eroded on Malekula, and identified the formal school system as a principal driver. Most interviewees believed that if an appropriate format could be developed, TEK could be included in the formal education system. Such an approach has potential to maintain customary knowledge and practice in the focus communities. Participants identified several specific domains of TEK for inclusion in school curricula, including ethnomedical knowledge, agricultural knowledge and practice, and the reinforcement of respect for traditional authority and values. However, interviewees also noted a number of practical and epistemological barriers to teaching TEK in school. These included the cultural diversity of Malekula, tensions between public and private forms of knowledge, and multiple values of TEK within the community. Conclusions TEK has potential to add value to formal education systems in Vanuatu by contextualising the content and process of curricular delivery, and by facilitating character development and self-awareness in students. These benefits are congruent with UNESCO-mandated goals for curricular reform and provide a strong argument for the inclusion of TEK in formal school systems. Such approaches may also assist in the maintenance and revitalisation of at-risk systems of ethnobiological knowledge. However, we urge further research attention to the significant epistemological challenges inherent in including TEK in formal school, particularly as participants noted the potential for such approaches to have negative consequences. PMID:22112326

  11. Perceptions of the value of traditional ecological knowledge to formal school curricula: opportunities and challenges from Malekula Island, Vanuatu.

    PubMed

    McCarter, Joe; Gavin, Michael C

    2011-11-23

    The integration of traditional ecological knowledge (TEK) into formal school curricula may be a key tool for the revitalisation of biocultural diversity, and has the potential to improve the delivery of educational objectives. This paper explores perceptions of the value of TEK to formal education curricula on Malekula Island, Vanuatu. We conducted 49 interviews with key stakeholders (local TEK experts, educators, and officials) regarding the use of the formal school system to transmit, maintain, and revitalise TEK. Interviews also gathered information on the areas where TEK might add value to school curricula and on the perceived barriers to maintaining and revitalising TEK via formal education programs. Participants reported that TEK had eroded on Malekula, and identified the formal school system as a principal driver. Most interviewees believed that if an appropriate format could be developed, TEK could be included in the formal education system. Such an approach has potential to maintain customary knowledge and practice in the focus communities. Participants identified several specific domains of TEK for inclusion in school curricula, including ethnomedical knowledge, agricultural knowledge and practice, and the reinforcement of respect for traditional authority and values. However, interviewees also noted a number of practical and epistemological barriers to teaching TEK in school. These included the cultural diversity of Malekula, tensions between public and private forms of knowledge, and multiple values of TEK within the community. TEK has potential to add value to formal education systems in Vanuatu by contextualising the content and process of curricular delivery, and by facilitating character development and self-awareness in students. These benefits are congruent with UNESCO-mandated goals for curricular reform and provide a strong argument for the inclusion of TEK in formal school systems. Such approaches may also assist in the maintenance and revitalisation of at-risk systems of ethnobiological knowledge. However, we urge further research attention to the significant epistemological challenges inherent in including TEK in formal school, particularly as participants noted the potential for such approaches to have negative consequences.

  12. Formal Methods Case Studies for DO-333

    NASA Technical Reports Server (NTRS)

    Cofer, Darren; Miller, Steven P.

    2014-01-01

    RTCA DO-333, Formal Methods Supplement to DO-178C and DO-278A provides guidance for software developers wishing to use formal methods in the certification of airborne systems and air traffic management systems. The supplement identifies the modifications and additions to DO-178C and DO-278A objectives, activities, and software life cycle data that should be addressed when formal methods are used as part of the software development process. This report presents three case studies describing the use of different classes of formal methods to satisfy certification objectives for a common avionics example - a dual-channel Flight Guidance System. The three case studies illustrate the use of theorem proving, model checking, and abstract interpretation. The material presented is not intended to represent a complete certification effort. Rather, the purpose is to illustrate how formal methods can be used in a realistic avionics software development project, with a focus on the evidence produced that could be used to satisfy the verification objectives found in Section 6 of DO-178C.

  13. What is the right formalism to search for resonances?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mikhasenko, M.; Pilloni, A.; Nys, J.

    Hadron decay chains constitute one of the main sources of information on the QCD spectrum. We discuss the differences between several partial wave analysis formalisms used in the literature to build the amplitudes. We match the helicity amplitudes to the covariant tensor basis. Hereby, we pay attention to the analytical properties of the amplitudes and separate singularities of kinematical and dynamical nature. We study the analytical properties of the spin-orbit (LS) formalism, and some of the covariant tensor approaches. In particular, we explicitly build the amplitudes for the B → ψ π K and B → \bar{D} π π decays, and show that the energy dependence of the covariant approach is model dependent. We also show that the usual recursive construction of covariant tensors explicitly violates crossing symmetry, which would lead to different resonance parameters extracted from scattering and decay processes.

  14. What is the right formalism to search for resonances?

    DOE PAGES

    Mikhasenko, M.; Pilloni, A.; Nys, J.; ...

    2018-03-17

    Hadron decay chains constitute one of the main sources of information on the QCD spectrum. We discuss the differences between several partial wave analysis formalisms used in the literature to build the amplitudes. We match the helicity amplitudes to the covariant tensor basis. Thereby, we pay attention to the analytical properties of the amplitudes and separate singularities of kinematical and dynamical nature. We study the analytical properties of the spin-orbit (LS) formalism and some of the covariant tensor approaches. In particular, we explicitly build the amplitudes for the B → ψπK and B → D̄ππ decays, and show that the energy dependence of the covariant approach is model dependent. We also show that the usual recursive construction of covariant tensors explicitly violates crossing symmetry, which would lead to different resonance parameters extracted from scattering and decay processes.

  15. Need for Formal Specialization in Pharmacy in Canada: A Survey of Hospital Pharmacists

    PubMed Central

    Penm, Jonathan; MacKinnon, Neil J; Jorgenson, Derek; Ying, Jun; Smith, Jennifer

    2016-01-01

    Background The Blueprint for Pharmacy was a collaborative initiative involving all of the major pharmacy associations in Canada. It aimed to coordinate, facilitate, and be a catalyst for changes required to align pharmacy practice with the health care needs of Canadians. In partial fulfilment of this mandate, a needs assessment for specialist certification for pharmacists was conducted. Objective To conduct a secondary analysis of data from the needs assessment to determine the perceptions of hospital pharmacists regarding a formal certification process for pharmacist specialties in Canada. Methods A survey was developed in consultation with the Blueprint for Pharmacy Specialization Project Advisory Group and other key stakeholders. It was distributed electronically, in English and French, to Canadian pharmacists identified through national and provincial pharmacy organizations (survey period January 15 to February 12, 2015). Data for hospital pharmacists were extracted for this secondary analysis. Multivariable logistic regression analyses were conducted to characterize those respondents who supported the certification process and those intending to become certified if a Canadian process were introduced. Results A total of 640 responses were received from hospital pharmacists. Nearly 85% of the respondents (543/640 [84.8%]) supported a formal certification process for pharmacist specialization, and more than 70% (249/349 [71.3%]) indicated their intention to obtain specialty certification if a Canadian process were introduced. Respondents believed that the main barriers to developing such a system were lack of reimbursement models, the time required, and lack of public awareness of pharmacist specialties. They felt that the most important factors for an optimal certification process were a consistent definition of pharmacist specialty practice and consistent recognition of pharmacist specialty practice across Canada. Multiple regression analysis showed that female respondents were more likely to support a formal certification process (odds ratio [OR] 2.6, 95% confidence interval [CI] 1.2–5.7). Also, those who already specialized in pharmacotherapy were more likely to support mandatory certification (OR 2.6, 95% CI 1.1–6.1). Conclusions Hospital pharmacists who responded to this survey overwhelmingly supported certification for pharmacist specialization in Canada. Questions remain about the feasibility of establishing a pharmacist specialization system in Canada. PMID:27826153

  16. Many-Body Spectral Functions from Steady State Density Functional Theory.

    PubMed

    Jacob, David; Kurth, Stefan

    2018-03-14

    We propose a scheme to extract the many-body spectral function of an interacting many-electron system from an equilibrium density functional theory (DFT) calculation. To this end we devise an ideal scanning tunneling microscope (STM) setup and employ the recently proposed steady-state DFT formalism (i-DFT) which allows one to calculate the steady current through a nanoscopic region coupled to two biased electrodes. In our setup, one of the electrodes serves as a probe ("STM tip"), which is weakly coupled to the system we want to measure. In the ideal STM limit of vanishing coupling to the tip, the system is restored to quasi-equilibrium and the normalized differential conductance yields the exact equilibrium many-body spectral function. Calculating this quantity from i-DFT, we derive an exact relation expressing the interacting spectral function in terms of the Kohn-Sham one. As illustrative examples, we apply our scheme to calculate the spectral functions of two nontrivial model systems, namely the single Anderson impurity model and the Constant Interaction Model.
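
    For orientation only, the generic weak-coupling tunneling relation on which the ideal-STM limit rests (this is the standard textbook result, not the exact i-DFT expression derived in the paper) reads

      \frac{dI}{dV}\Big|_{V} \;\propto\; \gamma_{\mathrm{tip}}\, A(\omega = eV), \qquad \gamma_{\mathrm{tip}} \to 0,

    so that, once the tip coupling is normalized out, the differential conductance traces the equilibrium spectral function at the bias energy; the paper's contribution is the exact relation between this interacting A(ω) and the Kohn-Sham spectral function delivered by i-DFT.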

  17. Development of village doctors in China: financial compensation and health system support.

    PubMed

    Hu, Dan; Zhu, Weiming; Fu, Yaqun; Zhang, Minmin; Zhao, Yang; Hanson, Kara; Martinez-Alvarez, Melisa; Liu, Xiaoyun

    2017-07-01

    Since 1968, China trained about 1.5 million barefoot doctors within a few years to provide basic health services to a rural population of 0.8 billion. China's Ministry of Health stopped using the term barefoot doctor in 1985 and changed policy to develop village doctors. Since then, village doctors have continued to play an irreplaceable role in China's rural health, even though their number has fluctuated over the years and they face serious challenges. The United Nations declared the Sustainable Development Goals in 2015, which include achieving universal health coverage by 2030. In this context, the development of community health workers (CHWs) has become an emerging policy priority in many resource-poor developing countries, and China's experiences and lessons learnt in developing and maintaining village doctors may be useful for them. This paper aims to synthesize lessons learnt from the Chinese CHW experience. It summarizes China's experience in exploring and using strategic partnership between the community and the formal health system to develop CHWs in two stages: the barefoot doctor stage (1968-1985) and the village doctor stage (1985 to the present). Chinese and English literature was searched in PubMed, CNKI and Wanfang. The information extracted from the selected articles was synthesized according to four partnership strategies for communities and the health system to support CHW development, namely 1) joint ownership and design of CHW programmes; 2) collaborative supervision and constructive feedback; 3) a balanced package of incentives, both financial and non-financial; and 4) a practical monitoring system incorporating data from the health system and the community. The study found that the townships and villages provided an institutional basis for the barefoot doctor policy, while the formal health system, including urban hospitals, county health schools, township health centers, and mobile medical teams, provided training to the barefoot doctors. After 1985, however, the formal health system played a more dominant role in the CHW system, including both the selection and the training of village doctors. China applied various mechanisms to compensate village doctors at different stages. During the 1960s and 1970s, the main income source of barefoot doctors was their villages' collective economy. After 1985, when the rural collective economy collapsed and barefoot doctors were transformed into village doctors, they depended on user fees, especially drug sale revenues. In the new century, and especially after the new round of health system reform in 2009, government subsidy has become an increasing source of village doctors' income. The barefoot doctor policy played a significant role in providing basic human resources for health and basic health services to rural populations when rural areas had great shortages of health resources. The key factors behind this achievement were the intersection between the community and the formal health system, and sustained and stable financial compensation for the community health workers.

  18. Hamilton-Jacobi formalism to warm inflationary scenario

    NASA Astrophysics Data System (ADS)

    Sayar, K.; Mohammadi, A.; Akhtari, L.; Saaidi, Kh.

    2017-01-01

    The Hamilton-Jacobi formalism, as a powerful method, is utilized to reconsider the warm inflationary scenario, in which the scalar field, as the main component driving inflation, interacts with other fields. Separating the analysis into strong and weak dissipative regimes, the study is carried out for two popular functional forms of the dissipation coefficient Γ. Applying the slow-roll approximation, the required perturbation parameters are extracted and, by comparison with the latest Planck data, the free parameters are constrained. The possibility of producing an acceptable inflation is studied, and the results show that in all cases the model can successfully reproduce the amplitude of the scalar perturbation, the scalar spectral index, its running, and the tensor-to-scalar ratio.
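
    For reference, a commonly used form of the Hamilton-Jacobi relations in warm inflation, written with the dissipation ratio Q = Γ/3H and the reduced Planck mass M_p (a sketch under the usual quasi-stable radiation assumption ρ_r ≈ (3/4) Q φ̇²; normalization conventions may differ from those adopted in the paper):

      \dot{H} = -\frac{(1+Q)\,\dot{\phi}^{2}}{2M_p^{2}}
      \quad\Longrightarrow\quad
      \dot{\phi} = -\frac{2M_p^{2}\,H'(\phi)}{1+Q},
      \qquad Q \equiv \frac{\Gamma}{3H}.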

  19. Schwinger-Keldysh formalism. Part II: thermal equivariant cohomology

    NASA Astrophysics Data System (ADS)

    Haehl, Felix M.; Loganayagam, R.; Rangamani, Mukund

    2017-06-01

    Causally ordered correlation functions of local operators in near-thermal quantum systems computed using the Schwinger-Keldysh formalism obey a set of Ward identities. These can be understood rather simply as the consequence of a topological (BRST) algebra, called the universal Schwinger-Keldysh superalgebra, as explained in our companion paper [1]. In the present paper we provide a mathematical discussion of this topological algebra. In particular, we argue that the structures can be understood in the language of extended equivariant cohomology. To keep the discussion self-contained, we provide a basic review of the algebraic construction of equivariant cohomology and explain how it can be understood in familiar terms as a superspace gauge algebra. We demonstrate how the Schwinger-Keldysh construction can be succinctly encoded in terms of a thermal equivariant cohomology algebra which naturally acts on the operator (super)-algebra of the quantum system. The main rationale behind this exploration is to extract symmetry statements which are robust under renormalization group flow and can hence be used to understand low-energy effective field theory of near-thermal physics. To illustrate the general principles, we focus on Langevin dynamics of a Brownian particle, rephrasing some known results in terms of thermal equivariant cohomology. As described elsewhere, the general framework enables construction of effective actions for dissipative hydrodynamics and could potentially illumine our understanding of black holes.

  20. Nucleon resonances in exclusive reactions of photo- and electroproduction of mesons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skorodumina, Iu. A.; Burkert, V. D.; Golovach, E. N.

    2015-11-01

    Methods for extracting nucleon resonance parameters from experimental data are reviewed. The formalism for the description of exclusive reactions of meson photo- and electroproduction off nucleons is discussed. Recent experimental data on exclusive meson production in the scattering of electrons and photons off protons are analyzed.

  1. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2004-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  2. Associating Human-Centered Concepts with Social Networks Using Fuzzy Sets

    NASA Astrophysics Data System (ADS)

    Yager, Ronald R.

    The rapidly growing global interconnectivity, brought about to a large extent by the Internet, has dramatically increased the importance and diversity of social networks. Modern social networks cut across a spectrum from benign recreational websites such as Facebook, to occupationally oriented websites such as LinkedIn, to criminally focused groups such as drug cartels, to devastation and terror focused groups such as Al-Qaeda. Many organizations are interested in analyzing and extracting information related to these social networks. Among these are governmental police and security agencies as well as marketing and sales organizations. To aid these organizations there is a need for technologies to model social networks and intelligently extract information from these models. While established technologies exist for the modeling of relational networks [1-7], few technologies exist to extract information from them in a form compatible with human perception and understanding. Databases are an example of a technology for which we have tools both for representing our information and for querying and extracting the information contained. Our goal is in some sense analogous: we want to use the relational network model to represent information, in this case about relationships and interconnections, and then be able to query the social network using intelligent human-centered concepts. To extend our capabilities to interact with social relational networks we need to associate human concepts and ideas with these networks. Since human beings predominantly reason and understand in linguistic terms, we need to build bridges between human conceptualization and the formal mathematical representation of the social network. Consider for example a concept such as "leader". An analyst may be able to express, in linguistic terms, using a network-relevant vocabulary, the properties of a leader. Our task is to translate this linguistic description into a mathematical formalism that allows us to determine how true it is that a particular node is a leader. In this work we look at the use of fuzzy set methodologies [8-10] to provide a bridge between the human analyst and the formal model of the network.
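
    As a purely illustrative sketch of the kind of bridge described here (the membership functions, thresholds, and graph below are hypothetical and not taken from the paper), the truth of "node n is a leader" can be computed by combining fuzzy memberships over standard network measures:

      import networkx as nx

      def high(x, lo, hi):
          # Piecewise-linear fuzzy membership for "x is high" on the interval [lo, hi].
          if x <= lo:
              return 0.0
          if x >= hi:
              return 1.0
          return (x - lo) / (hi - lo)

      def leader_truth(graph, node):
          # Hypothetical linguistic definition: a leader has high degree centrality AND high betweenness.
          deg = nx.degree_centrality(graph)[node]
          btw = nx.betweenness_centrality(graph)[node]
          return min(high(deg, 0.2, 0.6), high(btw, 0.1, 0.4))  # fuzzy AND via min

      g = nx.karate_club_graph()
      scores = {n: leader_truth(g, n) for n in g.nodes}
      print(sorted(scores.items(), key=lambda kv: -kv[1])[:3])  # three most "leader-like" nodes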

  3. Reduced quantum dynamics with arbitrary bath spectral densities: hierarchical equations of motion based on several different bath decomposition schemes.

    PubMed

    Liu, Hao; Zhu, Lili; Bai, Shuming; Shi, Qiang

    2014-04-07

    We investigated applications of the hierarchical equation of motion (HEOM) method to perform high order perturbation calculations of reduced quantum dynamics for a harmonic bath with arbitrary spectral densities. Three different schemes are used to decompose the bath spectral density into analytical forms that are suitable to the HEOM treatment: (1) The multiple Lorentzian mode model that can be obtained by numerically fitting the model spectral density. (2) The combined Debye and oscillatory Debye modes model that can be constructed by fitting the corresponding classical bath correlation function. (3) A new method that uses undamped harmonic oscillator modes explicitly in the HEOM formalism. Methods to extract system-bath correlations were investigated for the above bath decomposition schemes. We also show that HEOM in the undamped harmonic oscillator modes can give detailed information on the partial Wigner transform of the total density operator. Theoretical analysis and numerical simulations of the spin-Boson dynamics and the absorption line shape of molecular dimers show that the HEOM formalism for high order perturbations can serve as an important tool in studying the quantum dissipative dynamics in the intermediate coupling regime.
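
    A minimal sketch of the first decomposition scheme mentioned above, numerically fitting a model spectral density to a sum of Lorentzian (Brownian-oscillator-like) modes; the target spectral density, number of modes, and initial guesses are hypothetical:

      import numpy as np
      from scipy.optimize import curve_fit

      def multi_lorentzian(w, *params):
          # Sum of Lorentzian modes; params are (p_k, Omega_k, Gamma_k) triples.
          J = np.zeros_like(w)
          for p, Om, Ga in zip(params[0::3], params[1::3], params[2::3]):
              J += p * Ga * w / ((w**2 - Om**2)**2 + Ga**2 * w**2)
          return J

      w = np.linspace(0.01, 10.0, 500)
      J_target = w * np.exp(-w / 2.0)           # hypothetical Ohmic density with exponential cutoff
      guess = [1.0, 1.0, 1.0, 1.0, 3.0, 1.0]    # two modes, (p, Omega, Gamma) each
      popt, _ = curve_fit(multi_lorentzian, w, J_target, p0=guess, maxfev=20000)
      print(popt)   # fitted mode parameters, ready to be mapped onto HEOM auxiliary operators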

  4. Reduced quantum dynamics with arbitrary bath spectral densities: Hierarchical equations of motion based on several different bath decomposition schemes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Hao; Zhu, Lili; Bai, Shuming

    2014-04-07

    We investigated applications of the hierarchical equation of motion (HEOM) method to perform high order perturbation calculations of reduced quantum dynamics for a harmonic bath with arbitrary spectral densities. Three different schemes are used to decompose the bath spectral density into analytical forms that are suitable to the HEOM treatment: (1) The multiple Lorentzian mode model that can be obtained by numerically fitting the model spectral density. (2) The combined Debye and oscillatory Debye modes model that can be constructed by fitting the corresponding classical bath correlation function. (3) A new method that uses undamped harmonic oscillator modes explicitly in the HEOM formalism. Methods to extract system-bath correlations were investigated for the above bath decomposition schemes. We also show that HEOM in the undamped harmonic oscillator modes can give detailed information on the partial Wigner transform of the total density operator. Theoretical analysis and numerical simulations of the spin-Boson dynamics and the absorption line shape of molecular dimers show that the HEOM formalism for high order perturbations can serve as an important tool in studying the quantum dissipative dynamics in the intermediate coupling regime.

  5. NASA Langley Research and Technology-Transfer Program in Formal Methods

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; Carreno, Victor A.; Holloway, C. Michael; Miner, Paul S.; DiVito, Ben L.

    1995-01-01

    This paper presents an overview of NASA Langley research program in formal methods. The major goals of this work are to make formal methods practical for use on life critical systems, and to orchestrate the transfer of this technology to U.S. industry through use of carefully designed demonstration projects. Several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of five NASA civil servants and contractors from Odyssey Research Associates, SRI International, and VIGYAN Inc.

  6. The Lifelong Learning Iceberg of Information Systems Academics--A Study of On-Going Formal and Informal Learning by Academics

    ERIC Educational Resources Information Center

    Davey, Bill; Tatnall, Arthur

    2007-01-01

    This article describes a study that examined the lifelong learning of information systems academics in relation to their normal work. It begins by considering the concept of lifelong learning, its relationship to real-life learning and that lifelong learning should encompass the whole spectrum of formal, non-formal and informal learning. Most…

  7. Formal design and verification of a reliable computing platform for real-time control. Phase 2: Results

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Divito, Ben L.

    1992-01-01

    The design and formal verification of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications, is presented. The RCP uses NMR (N-modular redundancy) style redundancy to mask faults and internal majority voting to flush the effects of transient faults. The system is formally specified and verified using the Ehdm verification system. A major goal of this work is to provide the system with significant capability to withstand the effects of High Intensity Radiated Fields (HIRF).

  8. Reconfigurable Hardware Adapts to Changing Mission Demands

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A new class of computing architectures and processing systems, which use reconfigurable hardware, is creating a revolutionary approach to implementing future spacecraft systems. With the increasing complexity of electronic components, engineers must design next-generation spacecraft systems with new technologies in both hardware and software. Derivation Systems, Inc., of Carlsbad, California, has been working through NASA's Small Business Innovation Research (SBIR) program to develop key technologies in reconfigurable computing and Intellectual Property (IP) soft cores. Founded in 1993, Derivation Systems has received several SBIR contracts from NASA's Langley Research Center and the U.S. Department of Defense Air Force Research Laboratories in support of its mission to develop hardware and software for high-assurance systems. Through these contracts, Derivation Systems began developing leading-edge technology in formal verification, embedded Java, and reconfigurable computing for its PF3100, Derivational Reasoning System (DRS), FormalCORE IP, FormalCORE PCI/32, FormalCORE DES, and LavaCORE Configurable Java Processor, which are designed for greater flexibility and security on all space missions.

  9. From Informal to Formal: Status and Challenges of Informal Water Infrastructures in Indonesia

    NASA Astrophysics Data System (ADS)

    Maryati, S.; Humaira, A. N. S.; Kipuw, D. M.

    2018-05-01

    Informal water infrastructures in Indonesia have emerged due to the government's inability or incapacity to guarantee the service of water provision to all communities. Communities have their own mechanisms to meet their water needs and arrange them as a self-supplying or self-governed form of water infrastructure provision. In general, infrastructure provision in Indonesia takes the form of public systems (centralized systems) that cover most of the urban communities; communal systems that serve some groups of households, limited only to a particular small-scale area; and individual systems. The communal and individual systems are provided by the communities themselves, sometimes with some intervention by the government. This kind of system is usually built according to lower standards compared to the systems built by the government. Informal systems in this study are not defined in terms of their legal aspect, but rather in technical terms. The aim of this study was to examine the existing status and challenges in transforming informal water infrastructures into formal infrastructures. Formalizing informal infrastructures is now becoming an issue because of the limitations the government faces in building new formal infrastructures. On the other hand, global and national targets call for 100% access to water supplies for the whole population in the near future, and formalizing informal infrastructures seems more realistic than building new ones. The scope of this study was limited to the technical aspects, and the methodology used was descriptive and comparative analysis. Generally, most of the informal systems do not apply progressive tariffs, do not have storage/reservoirs, do not have water treatment plants, and rarely conduct treatment in accordance with standards and procedures as formal systems do, which leads to dubious access to safe water, especially with regard to quality.

  10. Modeling formalisms in Systems Biology

    PubMed Central

    2011-01-01

    Systems Biology has taken advantage of computational tools and high-throughput experimental data to model several biological processes. These include signaling, gene regulatory, and metabolic networks. However, most of these models are specific to each kind of network. Their interconnection demands a whole-cell modeling framework for a complete understanding of cellular systems. We describe the features required by an integrated framework for modeling, analyzing and simulating biological processes, and review several modeling formalisms that have been used in Systems Biology including Boolean networks, Bayesian networks, Petri nets, process algebras, constraint-based models, differential equations, rule-based models, interacting state machines, cellular automata, and agent-based models. We compare the features provided by different formalisms, and discuss recent approaches in the integration of these formalisms, as well as possible directions for the future. PMID:22141422

  11. Non-formal Education in the Philippines: A Fundamental Step towards Lifelong Learning.

    ERIC Educational Resources Information Center

    Gonzales, Ma. Celeste T.; Pijano, Ma. Concepcion V.

    In order to significantly contribute to human resource development, the Philippines must develop an integrated educational system of lifelong learning, with a special emphasis on non-formal education. Despite the value that is placed on formal, or sequential academic schooling, it is non-formal schooling that makes accessible the acquisition of…

  12. From Regulation to Virtue: A Critique of Ethical Formalism in Research Organizations

    ERIC Educational Resources Information Center

    Atkinson, Timothy N.; Butler, Jesse W.

    2012-01-01

    The following article argues that the research compliance system has some flaws that should be addressed, particularly with regard to excessive emphasis of and reliance upon formal regulations in research administration. Ethical formalism, understood here as the use of formal rules for the determination of behavior, is not an optimal perspective…

  13. Formal verification of an avionics microprocessor

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam, K.; Miller, Steven P.

    1995-01-01

    Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.

  14. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  15. Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  16. Calculation of a solid/liquid surface tension: A methodological study

    NASA Astrophysics Data System (ADS)

    Dreher, T.; Lemarchand, C.; Soulard, L.; Bourasseau, E.; Malfreyt, P.; Pineau, N.

    2018-01-01

    The surface tension of a model solid/liquid interface constituted of a graphene sheet surrounded by liquid methane has been computed using molecular dynamics in the Kirkwood-Buff formalism. We show that, contrary to the fluid/fluid case, the solid/liquid case can lead to different structurations of the first fluid layer, leading to significantly different values of the surface tension. Therefore, we present a statistical approach that consists in running a series of molecular simulations of similar systems with different initial conditions, leading to a distribution of surface tensions from which an average value and uncertainty can be extracted. Our results suggest that these distributions converge as the system size increases. Besides, we show that the surface tension is not particularly sensitive to the choice of the potential energy cutoff and that long-range corrections can be neglected, contrary to what we observed for liquid/vapour interfaces. We have not observed the previously reported commensurability effect.
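
    For reference, the mechanical (Kirkwood-Buff) route to the surface tension of a planar interface with normal along z, omitting box-size and interface-count prefactors that depend on the simulation setup:

      \gamma = \int_{-\infty}^{+\infty} \left[\, p_N(z) - p_T(z) \,\right] dz,

    where p_N and p_T are the normal and tangential components of the pressure tensor profile.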

  17. Classification of parotidectomies: a proposal of the European Salivary Gland Society.

    PubMed

    Quer, M; Guntinas-Lichius, O; Marchal, F; Vander Poorten, V; Chevalier, D; León, X; Eisele, D; Dulguerov, P

    2016-10-01

    The objective of this study is to provide a comprehensive classification system for parotidectomy operations. Data sources include Medline publications, the authors' experience, and a consensus round table at the Third European Salivary Gland Society (ESGS) Meeting. The Medline database was searched with the terms "parotidectomy" and "definition". The various definitions of parotidectomy procedures and parotid gland subdivisions were extracted, previous classification systems were re-examined, and a new classification was proposed by consensus. The ESGS proposes to subdivide the parotid parenchyma into five levels: I (lateral superior), II (lateral inferior), III (deep inferior), IV (deep superior), V (accessory). A new classification is proposed in which the type of resection is divided into formal parotidectomy with facial nerve dissection and extracapsular dissection. Parotidectomies are further classified according to the levels removed, as well as the extra-parotid structures ablated. A new classification of parotidectomy procedures is proposed.

  18. Stereo Image Ranging For An Autonomous Robot Vision System

    NASA Astrophysics Data System (ADS)

    Holten, James R.; Rogers, Steven K.; Kabrisky, Matthew; Cross, Steven

    1985-12-01

    The principles of stereo vision for three-dimensional data acquisition are well known and can be applied to the problem of an autonomous robot vehicle. Corresponding points in the two images are located, and the location of each point in three-dimensional space can then be calculated using the disparity (offset) of the points and knowledge of the camera positions and geometry. This research investigates the application of artificial intelligence knowledge representation techniques as a means to apply heuristics that relieve the computational intensity of the low-level image processing tasks. Specifically, a new technique for image feature extraction is presented. This technique, the Queen Victoria Algorithm, uses formal language productions to process the image and characterize its features. These characterized features are then used for stereo image feature registration to obtain the required ranging information. The results can be used by an autonomous robot vision system for environmental modeling and path finding.
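
    For the standard rectified two-camera geometry, the triangulation step referred to above reduces to the textbook disparity relation (a simplification of the general camera-geometry computation used in practice):

      Z = \frac{f\,B}{d},

    where f is the focal length, B the baseline between the cameras, and d = x_L - x_R the disparity of a matched feature pair.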

  19. Software Validation via Model Animation

    NASA Technical Reports Server (NTRS)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
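
    A minimal sketch of the comparison step the approach is built on; the stand-in functions, generated test cases, and tolerance value are hypothetical (in the actual system the reference outputs come from PVSio-animated PVS models):

      import random

      def reference_model(inputs):
          # Stand-in for the formal model animated in PVSio (a trivial kinematic projection).
          x, y, vx, vy = inputs
          return [x + vx, y + vy]

      def implementation(inputs):
          # Stand-in for the actual software under validation; may differ by floating-point error.
          x, y, vx, vy = inputs
          return [x + vx, y + vy]

      def outputs_agree(model_out, software_out, tol=1e-9):
          # Element-wise agreement of the two output vectors up to the given tolerance.
          return all(abs(m - s) <= tol for m, s in zip(model_out, software_out))

      failures = []
      for _ in range(1000):
          case = [random.uniform(-1e3, 1e3) for _ in range(4)]
          if not outputs_agree(reference_model(case), implementation(case)):
              failures.append(case)
      print(f"{len(failures)} mismatching test cases out of 1000")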

  20. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  1. A new and trustworthy formalism to compute entropy in quantum systems

    NASA Astrophysics Data System (ADS)

    Ansari, Mohammad

    Entropy is nonlinear in the density matrix and, as such, its evaluation in open quantum systems has not been fully understood. Recently a quantum formalism was proposed by Ansari and Nazarov that evaluates entropy using parallel time evolutions of multiple worlds. We can use this formalism to evaluate entropy flow in photovoltaic cells coupled to thermal reservoirs and cavity modes. Recently we studied the full counting statistics of energy transfers in such systems. This rigorously proves a nontrivial correspondence between energy exchanges and entropy changes in quantum systems, which only in systems without entanglement can be simplified to the textbook second law of thermodynamics. We evaluate the flow of entropy using this formalism. In the presence of entanglement, however, interestingly much less information is exchanged than expected. This increases the upper limit on the capacity for information transfer and its conversion to energy for next-generation devices in mesoscopic physics.
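
    The nonlinearity referred to above is that of the von Neumann entropy in the density matrix ρ,

      S(\rho) = -\,\mathrm{Tr}\left(\rho \ln \rho\right),

    which cannot be obtained from a single linear evolution of ρ; as we understand the cited formalism, Rényi-type traces Tr(ρ^M) are instead evaluated with M parallel ("multiple-world") time evolutions and then continued towards the von Neumann limit.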

  2. Helping System Engineers Bridge the Peaks

    NASA Technical Reports Server (NTRS)

    Rungta, Neha; Tkachuk, Oksana; Person, Suzette; Biatek, Jason; Whalen, Michael W.; Castle, Joseph; Gundy-Burlet, Karen

    2014-01-01

    In our experience at NASA, system engineers generally follow the Twin Peaks approach when developing safety-critical systems. However, iterations between the peaks require considerable manual, and in some cases duplicate, effort. A significant part of the manual effort stems from the fact that requirements are written in English natural language rather than a formal notation. In this work, we propose an approach that enables system engineers to leverage formal requirements and automated test generation to streamline iterations, effectively "bridging the peaks". The key to the approach is a formal language notation that a) system engineers are comfortable with, b) is supported by a family of automated V&V tools, and c) is semantically rich enough to describe the requirements of interest. We believe the combination of formalizing requirements and providing tool support to automate the iterations will lead to a more efficient Twin Peaks implementation at NASA.

  3. Unchartered innovation? Local reforms of national formal water management in the Mkoji sub-catchment, Tanzania

    NASA Astrophysics Data System (ADS)

    Mehari, Abraham; Koppen, Barbara Van; McCartney, Matthew; Lankford, Bruce

    Tanzania is currently attempting to improve water resources management through formal water rights and water fees systems, and formal institutions. The water rights system is expected to facilitate water allocation. The water fees system aims at cost-recovery for water resources management services. To enhance community involvement in water management, Water User Associations (WUAs) are being established and, in areas with growing upstream-downstream conflicts, apex bodies of all users along the stressed river stretch. The Mkoji sub-catchment (MSC) in the Rufiji basin is one of the first where these formal water management systems are being attempted. This paper analyzes the effectiveness of these systems in the light of their expected merits and the consequences of the juxtaposition of contemporary laws with traditional approaches. The study employed mainly qualitative, but also quantitative approaches on social and technical variables. Major findings were: (1) a good mix of formal (water fees and WUAs) and traditional (rotation-based water sharing, the Zamu) systems improved village-level water management services and reduced intra-scheme conflicts; (2) the water rights system has not brought abstractions into line with allocations and (3) so far, the MSC Apex body failed to mitigate inter-scheme conflicts. A more sophisticated design of allocation infrastructure and institutions is recommended.

  4. Properties of a Formal Method for Prediction of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    Autonomous intelligent swarms of satellites are being proposed for NASA missions that have complex behaviors and interactions. The emergent properties of swarms make these missions powerful, but at the same time more difficult to design and assure that proper behaviors will emerge. This paper gives the results of research into formal methods techniques for verification and validation of NASA swarm-based missions. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft. The NASA ANTS mission was used as an example of swarm intelligence for which to apply the formal methods. This paper will give the evaluation of these formal methods and give partial specifications of the ANTS mission using four selected methods. We then give an evaluation of the methods and the needed properties of a formal method for effective specification and prediction of emergent behavior in swarm-based systems.

  5. Formal Specification of Information Systems Requirements.

    ERIC Educational Resources Information Center

    Kampfner, Roberto R.

    1985-01-01

    Presents a formal model for specification of logical requirements of computer-based information systems that incorporates structural and dynamic aspects based on two separate models: the Logical Information Processing Structure and the Logical Information Processing Network. The model's role in systems development is discussed. (MBR)

  6. 30 CFR 843.15 - Informal public hearing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., “mining” includes (1) extracting coal from the earth or from coal waste piles and transporting it within... section shall be delivered to such person by an authorized representative or sent by certified mail to... of the mine. (e) Section 554 of Title 5 of the United States Code, regarding requirements for formal...

  7. Formal methods for dependable real-time systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that have been proposed and used are sketched. The formal verifications of clock synchronization algorithms lead to the conclusion that mechanically supported reasoning about complex real-time behavior is feasible. However, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.

  8. Formal methods and their role in digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1995-01-01

    This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.

  9. Photocarrier extraction in GaAsSb/GaAsN type-II QW superlattice solar cells

    NASA Astrophysics Data System (ADS)

    Aeberhard, U.; Gonzalo, A.; Ulloa, J. M.

    2018-05-01

    Photocarrier transport and extraction in GaAsSb/GaAsN type-II quantum well superlattices are investigated by means of inelastic quantum transport calculations based on the non-equilibrium Green's function formalism. Evaluation of the local density of states and the spectral current flow enables the identification of different regimes for carrier localization, transport, and extraction as a function of configurational parameters. These include the number of periods, the thicknesses of the individual layers in one period, the built-in electric field, and the temperature of operation. The results for the carrier extraction efficiency are related to experimental data for different symmetric GaAsSb/GaAsN type-II quantum well superlattice solar cell devices and provide a qualitative explanation for the experimentally observed dependence of photovoltaic device performance on the period thickness.

  10. Formal reasoning about systems biology using theorem proving

    PubMed Central

    Hasan, Osman; Siddique, Umair; Tahar, Sofiène

    2017-01-01

    Systems biology provides the basis to understand the behavioral properties of complex biological organisms at different levels of abstraction. Traditionally, the analysis of systems biology based models of various diseases has been carried out by paper-and-pencil proofs and simulations. However, these methods cannot provide an accurate analysis, which is a serious drawback for the safety-critical domain of human medicine. In order to overcome these limitations, we propose a framework to formally analyze biological networks and pathways. In particular, we formalize the notion of reaction kinetics in higher-order logic and formally verify some of the commonly used reaction based models of biological networks using the HOL Light theorem prover. Furthermore, we have ported our earlier formalization of Zsyntax, i.e., a deductive language for reasoning about biological networks and pathways, from HOL4 to the HOL Light theorem prover to make it compatible with the above-mentioned formalization of reaction kinetics. To illustrate the usefulness of the proposed framework, we present the formal analysis of three case studies, i.e., the pathway leading to TP53 Phosphorylation, the pathway leading to the death of cancer stem cells, and the tumor growth based on cancer stem cells, which is used for the prognosis and future drug designs to treat cancer patients. PMID:28671950
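
    As a small concrete instance of the reaction kinetics being formalized (a generic mass-action example, not one of the paper's HOL Light case studies): for an elementary reaction A + B → C with rate constant k, the law of mass action gives

      \frac{d[C]}{dt} = k\,[A]\,[B], \qquad \frac{d[A]}{dt} = \frac{d[B]}{dt} = -k\,[A]\,[B],

    and it is properties of such coupled rate equations, together with Zsyntax derivations over the corresponding pathways, that the framework verifies mechanically.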

  11. BRST Formalism for Systems with Higher Order Derivatives of Gauge Parameters

    NASA Astrophysics Data System (ADS)

    Nirov, Kh. S.

    For a wide class of mechanical systems, invariant under gauge transformations with arbitrary higher order time derivatives of gauge parameters, the equivalence of Lagrangian and Hamiltonian BRST formalisms is proved. It is shown that the Ostrogradsky formalism establishes the natural rules to relate the BFV ghost canonical pairs with the ghosts and antighosts introduced by the Lagrangian approach. Explicit relation between corresponding gauge-fixing terms is obtained.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  13. Formal Assurance for Cognitive Architecture Based Autonomous Agent

    NASA Technical Reports Server (NTRS)

    Bhattacharyya, Siddhartha; Eskridge, Thomas; Neogi, Natasha; Carvalho, Marco

    2017-01-01

    Autonomous systems are designed and deployed in different modeling paradigms. These environments focus on specific concepts in designing the system. We focus our effort on the use of cognitive architectures to design autonomous agents that collaborate with humans to accomplish tasks in a mission. Our research focuses on introducing formal assurance methods to verify the behavior of agents designed in Soar, by translating the agent to the formal verification environment Uppaal.

  14. Qualitative review of usability problems in health information systems for radiology.

    PubMed

    Dias, Camila Rodrigues; Pereira, Marluce Rodrigues; Freire, André Pimenta

    2017-12-01

    Radiology processes are commonly supported by Radiology Information System (RIS), Picture Archiving and Communication System (PACS) and other software for radiology. However, these information technologies can present usability problems that affect the performance of radiologists and physicians, especially considering the complexity of the tasks involved. The purpose of this study was to extract, classify and analyze qualitatively the usability problems in PACS, RIS and other software for radiology. A systematic review was performed to extract usability problems reported in empirical usability studies in the literature. The usability problems were categorized as violations of Nielsen and Molich's usability heuristics. The qualitative analysis indicated the causes and the effects of the identified usability problems. From the 431 papers initially identified, 10 met the study criteria. The analysis of the papers identified 90 instances of usability problems, classified into categories corresponding to established usability heuristics. The five heuristics with the highest number of instances of usability problems were "Flexibility and efficiency of use", "Consistency and standards", "Match between system and the real world", "Recognition rather than recall" and "Help and documentation", respectively. These problems can make the interaction time consuming, causing delays in tasks, dissatisfaction, frustration, preventing users from enjoying all the benefits and functionalities of the system, as well as leading to more errors and difficulties in carrying out clinical analyses. Furthermore, the present paper showed a lack of studies performed on systems for radiology, especially usability evaluations using formal methods of evaluation involving the final users. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Framework for integration of informal waste management sector with the formal sector in Pakistan.

    PubMed

    Masood, Maryam; Barlow, Claire Y

    2013-10-01

    Historically, waste pickers around the globe have utilised urban solid waste as a principal source of livelihood. Formal waste management sectors usually perceive the informal waste collection/recycling networks as backward, unhygienic and generally incompatible with modern waste management systems. It is proposed here that through careful planning and administration, these seemingly troublesome informal networks can be integrated into formal waste management systems in developing countries, providing mutual benefits. A theoretical framework for integration based on a case study in Lahore, Pakistan, is presented. The proposed solution suggests that the municipal authority should draw up and agree on a formal work contract with the group of waste pickers already operating in the area. The proposed system is assessed using the integration radar framework to classify and analyse possible intervention points between the sectors. The integration of the informal waste workers with the formal waste management sector is not a one dimensional or single step process. An ideal solution might aim for a balanced focus on all four categories of intervention, although this may be influenced by local conditions. Not all the positive benefits will be immediately apparent, but it is expected that as the acceptance of such projects increases over time, the informal recycling economy will financially supplement the formal system in many ways.

  16. Mathematical formula recognition using graph grammar

    NASA Astrophysics Data System (ADS)

    Lavirotte, Stephane; Pottier, Loic

    1998-04-01

    This paper describes current results of Ofr, a system for extracting and understanding mathematical expressions in documents. Such a tool would be very useful for reusing the knowledge in scientific books that are not available in electronic form. We are also currently studying the use of this system for direct input of formulas with a graphical tablet into computer algebra systems. Existing solutions for mathematical recognition have problems analyzing 2D expressions such as vectors and matrices. This is because they often try to use extended classical grammars to analyze formulas relative to a baseline, but many mathematical notations do not respect the rules required for such parsing, which is why extensions of text-parsing techniques fail. We investigate graph grammars and graph rewriting as a solution for recognizing 2D mathematical notations. Graph grammars provide a powerful formalism to describe structural manipulations of multi-dimensional data. The two main problems to solve are ambiguities between the rules of the grammar and the construction of the graph.
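
    A toy illustration of the graph-rewriting idea (the graph encoding and the single production below are hypothetical and far simpler than Ofr's): nodes carry symbols and positions, edges carry spatial relations, and a production merges a base symbol with a glyph placed above-right of it into a single "power" node.

      # Toy graph: nodes map id -> (symbol, x, y); edges carry spatial relations.
      nodes = {1: ("x", 0.0, 0.0), 2: ("2", 0.6, 0.8)}
      edges = [(1, 2, "above-right")]

      def apply_power_rule(nodes, edges):
          # Production: base --above-right--> exponent  =>  single node "base^exponent".
          for (a, b, rel) in list(edges):
              if rel == "above-right":
                  base, bx, by = nodes[a]
                  expo, _, _ = nodes[b]
                  nodes[a] = (f"{base}^{expo}", bx, by)
                  del nodes[b]
                  edges.remove((a, b, rel))
          return nodes, edges

      print(apply_power_rule(nodes, edges))   # -> ({1: ('x^2', 0.0, 0.0)}, [])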

  17. Modeling mechanical cardiopulmonary interactions for virtual environments.

    PubMed

    Kaye, J M

    1997-01-01

    We have developed a computer system for modeling mechanical cardiopulmonary behavior in an interactive, 3D virtual environment. The system consists of a compact, scalar description of cardiopulmonary mechanics, with an emphasis on respiratory mechanics, that drives deformable 3D anatomy to simulate mechanical behaviors of and interactions between physiological systems. Such an environment can be used to facilitate exploration of cardiopulmonary physiology, particularly in situations that are difficult to reproduce clinically. We integrate 3D deformable body dynamics with new, formal models of (scalar) cardiorespiratory physiology, associating the scalar physiological variables and parameters with corresponding 3D anatomy. Our approach is amenable to modeling patient-specific circumstances in two ways. First, using CT scan data, we apply semi-automatic methods for extracting and reconstructing the anatomy to use in our simulations. Second, our scalar models are defined in terms of clinically-measurable, patient-specific parameters. This paper describes our approach and presents a sample of results showing normal breathing and acute effects of pneumothoraces.
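
    A scalar respiratory-mechanics relation of the kind such compact descriptions typically build on (the standard single-compartment equation of motion, given only as orientation; the paper's actual model is richer and coupled to cardiac mechanics):

      P_{aw}(t) = R\,\dot{V}(t) + \frac{V(t)}{C} + P_0,

    where R is the airway resistance, C the respiratory-system compliance, V the lung volume above functional residual capacity, and P_0 the baseline pressure.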

  18. Mise en Scene: Conversion of Scenarios to CSP Traces for the Requirements-to-Design-to-Code Project

    NASA Technical Reports Server (NTRS)

    Carter, John D.; Gardner, William B.; Rash, James L.; Hinchey, Michael G.

    2007-01-01

    The "Requirements-to-Design-to-Code" (R2D2C) project at NASA's Goddard Space Flight Center is based on deriving a formal specification expressed in Communicating Sequential Processes (CSP) notation from system requirements supplied in the form of CSP traces. The traces, in turn, are to be extracted from scenarios, a user-friendly medium often used to describe the required behavior of computer systems under development. This work, called Mise en Scene, defines a new scenario medium (Scenario Notation Language, SNL) suitable for control-dominated systems, coupled with a two-stage process for automatic translation of scenarios to a new trace medium (Trace Notation Language, TNL) that encompasses CSP traces. Mise en Scene is offered as an initial solution to the problem of the scenarios-to-traces "D2" phase of R2D2C. A survey of the "scenario" concept and some case studies are also provided.

  19. Process Algebra Approach for Action Recognition in the Maritime Domain

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terry

    2011-01-01

    The maritime environment poses a number of challenges for autonomous operation of surface boats. Among these challenges are the highly dynamic nature of the environment, the onboard sensing and reasoning requirements for obeying the navigational rules of the road, and the need for robust day/night hazard detection and avoidance. Development of full mission level autonomy entails addressing these challenges, coupled with inference of the tactical and strategic intent of possibly adversarial vehicles in the surrounding environment. This paper introduces PACIFIC (Process Algebra Capture of Intent From Information Content), an onboard system based on formal process algebras that is capable of extracting actions/activities from sensory inputs and reasoning within a mission context to ensure proper responses. PACIFIC is part of the Behavior Engine in CARACaS (Cognitive Architecture for Robotic Agent Command and Sensing), a system that is currently running on a number of U.S. Navy unmanned surface and underwater vehicles. Results from a series of experimental studies that demonstrate the effectiveness of the system are also presented.

  20. Utility of DNA barcoding for rapid and accurate assessment of bat diversity in Malaysia in the absence of formally described species.

    PubMed

    Wilson, J-J; Sing, K-W; Halim, M R A; Ramli, R; Hashim, R; Sofian-Azirun, M

    2014-02-19

    Bats are important flagship species for biodiversity research; however, diversity in Southeast Asia is considerably underestimated in the current checklists and field guides. Incorporation of DNA barcoding into surveys has revealed numerous species-level taxa overlooked by conventional methods. Inclusion of these taxa in inventories provides a more informative record of diversity, but is problematic as these species lack formal description. We investigated how frequently documented, but undescribed, bat taxa are encountered in Peninsular Malaysia. We discuss whether a barcode library provides a means of recognizing and recording these taxa across biodiversity inventories. Tissue was sampled from bats trapped at Pasir Raja, Dungun Terengganu, Peninsular Malaysia. The DNA was extracted and the COI barcode region amplified and sequenced. We identified 9 species-level taxa within our samples, based on analysis of the DNA barcodes. Six specimens matched to four previously documented taxa considered candidate species but currently lacking formal taxonomic status. This study confirms the high diversity of bats within Peninsular Malaysia (9 species in 13 samples) and demonstrates how DNA barcoding allows for inventory and documentation of known taxa lacking formal taxonomic status.
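
    In practice, the barcode-library lookup described here reduces to computing sequence divergence between a query COI barcode and reference sequences, and accepting matches below a threshold. The sketch below is a generic illustration with made-up, toy-length sequences and an assumed 2% threshold; it does not reproduce the study's reference library or analysis pipeline.

      # Assign a query COI barcode to the closest reference taxon by uncorrected p-distance.
      def p_distance(a, b):
          """Fraction of mismatched sites between two aligned, equal-length sequences."""
          assert len(a) == len(b)
          return sum(x != y for x, y in zip(a, b)) / len(a)

      # Hypothetical aligned reference barcodes (real COI barcodes are ~658 bp).
      reference = {
          "candidate species A": "ACGTACGTACGTACGTACGT",
          "candidate species B": "ACGTTCGTACGAACGTACGA",
      }

      def identify(query, threshold=0.02):
          taxon, d = min(((t, p_distance(query, s)) for t, s in reference.items()),
                         key=lambda pair: pair[1])
          return (taxon, d) if d <= threshold else ("no match in library", d)

      print(identify("ACGTACGTACGTACGTACGT"))   # exact match -> ('candidate species A', 0.0)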

  1. QSAR DataBank - an approach for the digital organization and archiving of QSAR model information

    PubMed Central

    2014-01-01

    Background Research efforts in the field of descriptive and predictive Quantitative Structure-Activity Relationships and Quantitative Structure-Property Relationships produce around one thousand scientific publications annually. The materials and results are mainly communicated in print. Printed media in their present form have obvious limitations when it comes to effectively representing mathematical models, including complex and non-linear ones, and large bodies of associated numerical chemical data. They do not support secondary information extraction or reuse, while in silico studies pose additional requirements for accessibility, transparency and reproducibility of the research. This gap can and should be bridged by introducing domain-specific digital data exchange standards and tools. The current publication presents a formal specification of the quantitative structure-activity relationship data organization and archival format called the QSAR DataBank (QsarDB for shorter, or QDB for shortest). Results The article describes the QsarDB data schema, which formalizes QSAR concepts (objects and the relationships between them), and the QsarDB data format, which formalizes their presentation for computer systems. The utility and benefits of QsarDB have been thoroughly tested by solving everyday QSAR and predictive modeling problems, with examples in the field of predictive toxicology, and the format can be applied to a wide variety of other endpoints. The work is accompanied by an open source reference implementation and tools. Conclusions The proposed open data, open source, and open standards design is open to public and proprietary extensions on many levels. Selected use cases exemplify the benefits of the proposed QsarDB data format. General ideas for future development are discussed. PMID:24910716

  2. Anisotropy and probe-medium interactions in the microrheology of nematic fluids.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cordoba, Andres; Stieger, Tillmann; Mazza, Marco G.

    2016-01-01

    A theoretical formalism is presented to analyze and interpret microrheology experiments in anisotropic fluids with nematic order. The predictions of that approach are examined in the context of a simple coarse-grained molecular model which is simulated using nonequilibrium molecular dynamics calculations. The proposed formalism is used to study the effect of confinement, the type of anchoring at the probe-particle surface, and the strength of the nematic field on the rheological response functions obtained from probe-particle active microrheology. As expected, a stronger nematic field leads to increased anisotropy in the rheological response of the material. It is also found that the defect structures that arise around the probe particle, which are determined by the type of anchoring and the particle size, have a significant effect on the rheological response observed in microrheology simulations. Independent estimates of the bulk dynamic modulus of the model nematic fluid considered here are obtained from small-amplitude oscillatory shear simulations with Lees-Edwards boundary conditions. The results of simulations indicate that the dynamic modulus extracted from particle-probe microrheology is different from that obtained in the absence of the particle, but that the differences decrease as the size of the defect also decreases. Importantly, the results of the nematic microrheology theory proposed here are in much closer agreement with simulations than those from earlier formalisms conceived for isotropic fluids. As such, it is anticipated that the theoretical framework advanced in this study could provide a useful tool for interpretation of microrheology experiments in systems such as liquid crystals and confined macromolecular solutions or gels.
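
    For orientation, the isotropic formalism that this anisotropic treatment generalizes is the generalized Stokes-Einstein relation of microrheology, which maps the probe's mean-square displacement onto the complex modulus. It is stated here only as background; the paper's nematic result is not reproduced.

      % Generalized Stokes-Einstein relation for an isotropic medium, probe radius a
      G^{*}(\omega) \;=\; \frac{k_{B} T}{\pi a \, i\omega \, \mathcal{F}\{\langle \Delta r^{2}(t) \rangle\}(\omega)}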

  3. Statistical mechanics of few-particle systems: exact results for two useful models

    NASA Astrophysics Data System (ADS)

    Miranda, Enrique N.

    2017-11-01

    The statistical mechanics of small clusters (n ~ 10-50 elements) of harmonic oscillators and two-level systems is studied exactly, following the microcanonical, canonical and grand canonical formalisms. For clusters with several hundred particles, the results from the three formalisms coincide with those found in the thermodynamic limit. However, for clusters formed by a few tens of elements, the three ensembles yield different results. For a cluster with a few tens of harmonic oscillators, when the heat capacity per oscillator is evaluated within the canonical formalism, it reaches a limit value equal to k_B, as in the thermodynamic case, while within the microcanonical formalism the limit value is k_B(1 - 1/n). This difference could be measured experimentally. For a cluster with a few tens of two-level systems, the heat capacity evaluated within the canonical and microcanonical ensembles also presents differences that could be detected experimentally. Both the microcanonical and grand canonical formalism show that the entropy is non-additive for systems this small, while the canonical ensemble reaches the opposite conclusion. These results suggest that the microcanonical ensemble is the most appropriate for dealing with systems with tens of particles.
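
    One simple way to see how a correction of the form k_B(1 - 1/n) can arise (a classical check under assumed conditions, not the paper's exact calculation): for n classical one-dimensional oscillators the density of states scales as E^(n-1), so defining the microcanonical entropy from the density of states gives

      S(E) = k_B (n-1)\ln E + \text{const}, \qquad
      \frac{1}{T} = \frac{\partial S}{\partial E} = \frac{(n-1)k_B}{E}
      \;\Rightarrow\; E = (n-1) k_B T, \qquad
      \frac{C}{n} = \frac{1}{n}\frac{\partial E}{\partial T} = k_B\!\left(1 - \frac{1}{n}\right),

    while the canonical ensemble gives E = n k_B T and C/n = k_B, independent of n.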

  4. Trauma systems and the costs of trauma care.

    PubMed Central

    Goldfarb, M G; Bazzoli, G J; Coffey, R M

    1996-01-01

    OBJECTIVE. This study examines the cost of providing trauma services in trauma centers organized by publicly administered trauma systems, compared to hospitals not part of a formal trauma system. DATA SOURCES AND STUDY SETTING. Secondary administrative discharge abstracts for a national sample of severely injured trauma patients in 44 trauma centers and 60 matched control hospitals for the year 1987 were used. STUDY DESIGN. Retrospective univariate and multivariate analyses were conducted to examine the impact of formal trauma systems and trauma center designation on the costs of treating trauma patients. Key dependent variables included length of stay, charge per day per patient, and charge per hospital stay. Key impact variables were type of trauma system and level of trauma designation. Control variables included patient, hospital, and community characteristics. DATA COLLECTION/EXTRACTION METHODS. Data were selected for hospitals based on (1) a large national hospital discharge database, the Hospital Cost and Utilization Project, 1980-1987 (HCUP-2) and (2) a special survey of trauma systems and trauma designation undertaken by the Hospital Research and Educational Trust of the American Hospital Association. PRINCIPAL FINDINGS. The results show that publicly designated Level I trauma centers, which are the focal point of most trauma systems, have the highest charge per case, the highest average charge per day, and similar or longer average lengths of stay than other hospitals. These findings persist after controlling for patient injury and health status, and for demographic characteristics and hospital and community characteristics. CONCLUSIONS. Prior research shows that severely injured trauma patients have greater chances of survival when treated in specialized trauma centers. However, findings here should be of concern to the many states developing trauma systems since the high costs of Level I centers support limiting the number of centers designated at this level and/or reconsidering the requirements placed on these centers. PMID:8617611

  5. Direct extraction of electron parameters from magnetoconductance analysis in mesoscopic ring array structures

    NASA Astrophysics Data System (ADS)

    Sawada, A.; Faniel, S.; Mineshige, S.; Kawabata, S.; Saito, K.; Kobayashi, K.; Sekine, Y.; Sugiyama, H.; Koga, T.

    2018-05-01

    We report an approach for examining electron properties using information about the shape and size of a nanostructure as a measurement reference. This approach quantifies the spin precession angles per unit length directly by considering the time-reversal interferences on chaotic return trajectories within mesoscopic ring arrays (MRAs). Experimentally, we fabricated MRAs using nanolithography in InGaAs quantum wells which had a gate-controllable spin-orbit interaction (SOI). As a result, we observed an Onsager symmetry related to relativistic magnetic fields, which provided us with indispensable information for the semiclassical billiard ball simulation. Our simulations, developed based on the real-space formalism of the weak localization/antilocalization effect including the degree of freedom for electronic spin, reproduced the experimental magnetoconductivity (MC) curves with high fidelity. The values of five distinct electron parameters (Fermi wavelength, spin precession angles per unit length for two different SOIs, impurity scattering length, and phase coherence length) were thereby extracted from a single MC curve. The methodology developed here is applicable to wide ranges of nanomaterials and devices, providing a diagnostic tool for exotic properties of two-dimensional electron systems.

  6. Mining emotional profiles using e-mail messages for earlier warnings of potential terrorist activities

    NASA Astrophysics Data System (ADS)

    Galitsky, Boris; Kovalerchuk, Boris

    2006-04-01

    We develop a software system, Text Scanner for Emotional Distress (TSED), to help detect email messages suspected of coming from people under strong emotional distress. Multiple studies have confirmed that terrorist attackers experienced substantial emotional distress at some point before committing an attack. Therefore, if an individual in emotional distress can be detected on the basis of email texts, some preventive measures can be taken. The proposed detection machinery is based on extraction and classification of emotional profiles from emails. An emotional profile is a formal representation of a sequence of emotional states through a textual discourse, with communicative actions attached to these emotional states. The issues of extracting emotional profiles from text and reasoning about them are discussed and illustrated. We then develop an inductive machine learning and reasoning framework to relate an emotional profile to the class "Emotional distress" or "No emotional distress", given a training dataset where the class is assigned by an expert. TSED's machine learning is evaluated using a database of structured customer complaints.

  7. Mars Colony in situ resource utilization: An integrated architecture and economics model

    NASA Astrophysics Data System (ADS)

    Shishko, Robert; Fradet, René; Do, Sydney; Saydam, Serkan; Tapia-Cortez, Carlos; Dempster, Andrew G.; Coulton, Jeff

    2017-09-01

    This paper reports on our effort to develop an ensemble of specialized models to explore the commercial potential of mining water/ice on Mars in support of a Mars Colony. This ensemble starts with a formal systems architecting framework to describe a Mars Colony and capture its artifacts' parameters and technical attributes. The resulting database is then linked to a variety of "downstream" analytic models. In particular, we integrated an extraction process (i.e., "mining") model, a simulation of the colony's environmental control and life support infrastructure known as HabNet, and a risk-based economics model. The mining model focuses on the technologies associated with in situ resource extraction, processing, storage and handling, and delivery. This model computes the production rate as a function of the systems' technical parameters and the local Mars environment. HabNet simulates the fundamental sustainability relationships associated with establishing and maintaining the colony's population. The economics model brings together market information, investment and operating costs, along with measures of market uncertainty and Monte Carlo techniques, with the objective of determining the profitability of commercial water/ice in situ mining operations. All told, over 50 market and technical parameters can be varied in order to address "what-if" questions, including colony location.
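
    The economics model's central step, propagating cost and market uncertainty through Monte Carlo sampling to estimate profitability, can be sketched generically as follows; every distribution and figure below is a made-up placeholder, not a value from the integrated model described in the paper.

      # Monte Carlo estimate of net present value (NPV) for a hypothetical Mars water-mining venture.
      import random

      def npv(cash_flows, discount_rate):
          return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

      def one_trial():
          capex = random.triangular(2.0e9, 4.0e9, 3.0e9)       # up-front cost [$] (placeholder)
          tonnes_per_year = random.triangular(50, 200, 120)    # delivered water [t/yr] (placeholder)
          price_per_tonne = random.lognormvariate(15.0, 0.5)   # market price [$/t] (placeholder)
          opex = 1.0e8                                         # fixed yearly cost [$] (placeholder)
          yearly = tonnes_per_year * price_per_tonne - opex
          return npv([-capex] + [yearly] * 15, discount_rate=0.08)

      random.seed(1)
      trials = sorted(one_trial() for _ in range(10_000))
      p_profit = sum(v > 0 for v in trials) / len(trials)
      print(f"P(NPV > 0) ~ {p_profit:.2f}, median NPV ~ ${trials[len(trials) // 2] / 1e9:.2f}B")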

  8. Automatic Human Movement Assessment With Switching Linear Dynamic System: Motion Segmentation and Motor Performance.

    PubMed

    de Souza Baptista, Roberto; Bo, Antonio P L; Hayashibe, Mitsuhiro

    2017-06-01

    Performance assessment of human movement is critical in diagnosis and motor-control rehabilitation. Recent developments in portable sensor technology enable clinicians to measure spatiotemporal aspects to aid in the neurological assessment. However, the extraction of quantitative information from such measurements is usually done manually through visual inspection. This paper presents a novel framework for automatic human movement assessment that executes segmentation and motor performance parameter extraction in time-series of measurements from a sequence of human movements. We use the elements of a Switching Linear Dynamic System model as building blocks to translate formal definitions and procedures from human movement analysis. Our approach provides a method for users with no expertise in signal processing to create models for movements using a labeled dataset and later use them for automatic assessment. We validated our framework in preliminary tests involving six healthy adult subjects who executed common movements in functional tests and rehabilitation exercise sessions, such as sit-to-stand and lateral elevation of the arms, and five elderly subjects, two of whom had limited mobility, who executed the sit-to-stand movement. The proposed method worked on random motion sequences for the dual purpose of movement segmentation (accuracy of 72%-100%) and motor performance assessment (mean error of 0%-12%).
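
    For context, a switching linear dynamic system of the kind used here as a building block couples a discrete switching state (e.g., a movement phase) with continuous linear-Gaussian dynamics; in a standard generic formulation (not necessarily the paper's exact parameterization):

      s_t \sim P(s_t \mid s_{t-1}), \qquad
      x_t = A_{s_t} x_{t-1} + w_t, \quad w_t \sim \mathcal{N}(0, Q_{s_t}), \qquad
      y_t = C_{s_t} x_t + v_t, \quad v_t \sim \mathcal{N}(0, R_{s_t}),

    so that movement segmentation amounts to inferring the most likely switching sequence s_1, ..., s_T from the sensor measurements y_1, ..., y_T, and per-segment motor-performance parameters can then be read off from the inferred states.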

  9. Proceedings 3rd NASA/IEEE Workshop on Formal Approaches to Agent-Based Systems (FAABS-III)

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael (Editor); Rash, James (Editor); Truszkowski, Walt (Editor); Rouff, Christopher (Editor)

    2004-01-01

    These proceedings contain 18 papers and 4 poster presentations, covering topics such as multi-agent systems, agent-based control, formalisms, and norms, as well as physical and biological models of agent-based systems. Some applications presented in the proceedings include systems analysis, software engineering, computer networks and robot control.

  10. Industry Strength Tool and Technology for Automated Synthesis of Safety-Critical Applications from Formal Specifications

    DTIC Science & Technology

    2015-11-01

    [Extraction fragment] The report surveys modeling frameworks such as I/O Automata, Kahn Process Networks, Petri nets, and multi-dimensional SDF, and includes a table comparing formalisms (e.g., Petri nets listed as a graphical, formal notation used for modeling distributed systems; one dataflow formalism noted as ideally suited to modeling DSP applications; I/O Automata listed as a formal notation).

  11. Enabling Requirements-Based Programming for Highly-Dependable Complex Parallel and Distributed Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed systems present the worst-case implications of today's dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirements-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.

  12. Medical Named Entity Recognition for Indonesian Language Using Word Representations

    NASA Astrophysics Data System (ADS)

    Rahman, Arief

    2018-03-01

    Nowadays, Named Entity Recognition (NER) system is used in medical texts to obtain important medical information, like diseases, symptoms, and drugs. While most NER systems are applied to formal medical texts, informal ones like those from social media (also called semi-formal texts) are starting to get recognition as a gold mine for medical information. We propose a theoretical Named Entity Recognition (NER) model for semi-formal medical texts in our medical knowledge management system by comparing two kinds of word representations: cluster-based word representation and distributed representation.

  13. Optimal tuning of a confined Brownian information engine.

    PubMed

    Park, Jong-Min; Lee, Jae Sung; Noh, Jae Dong

    2016-03-01

    A Brownian information engine is a device extracting mechanical work from a single heat bath by exploiting information on the state of a Brownian particle immersed in the bath. As with any engine, it is important to find the optimal operating condition that yields the maximum extracted work or power. The optimal condition for a Brownian information engine with a finite cycle time τ has rarely been studied because of the difficulty in finding the nonequilibrium steady state. In this study, we introduce a model for the Brownian information engine and develop an analytic formalism for its steady-state distribution for any τ. We find that the extracted work per engine cycle is maximized as τ approaches infinity, while the power is maximized as τ approaches zero.
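
    The trade-off in the last sentence follows from the definition of the average power per cycle (a definitional remark, not the paper's full steady-state result):

      P(\tau) \;=\; \frac{\langle W(\tau) \rangle}{\tau},

    so if the extracted work per cycle increases monotonically and saturates at large τ, the work is maximized as τ → ∞, and if in addition ⟨W(τ)⟩/τ is a decreasing function of τ (as reported here), the power is maximized as τ → 0.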

  14. Towards extracting the timelike pion form factor on CLS two-flavour ensembles

    NASA Astrophysics Data System (ADS)

    Erben, Felix; Green, Jeremy; Mohler, Daniel; Wittig, Hartmut

    2018-03-01

    Results are presented from an ongoing study of the ρ resonance. The focus is on CLS 2-flavour ensembles generated using O(a) improved Wilson fermions with pion masses ranging from 265 to 437 MeV. The energy levels are extracted by solving the GEVP of correlator matrices, created with the distillation approach involving ρ and ππ interpolators. The study is done in the centre-of-mass frame and several moving frames. One aim of this work is to extract the timelike pion form factor after applying the Lüscher formalism. We therefore plan to integrate this study with the existing Mainz programme for the calculation of the hadronic vacuum polarization contribution to the muon g - 2.
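
    As a generic illustration of the GEVP step mentioned above (independent of the CLS ensembles and the distillation setup), one solves C(t) v_n = λ_n(t, t0) C(t0) v_n for a matrix of correlators and reads off effective energies from the eigenvalues; a minimal sketch with synthetic two-state data:

      # GEVP analysis of a synthetic 2x2 correlator matrix (illustrative, not lattice data).
      import numpy as np
      from scipy.linalg import eigh

      E = np.array([0.35, 0.60])          # two fake state energies (lattice units)
      Z = np.array([[1.0, 0.6],
                    [0.4, 1.0]])          # fake operator-state overlap factors

      def C(t):
          return sum(np.outer(Z[:, n], Z[:, n]) * np.exp(-E[n] * t) for n in range(2))

      t0 = 2
      for t in range(3, 8):
          lam, _ = eigh(C(t), C(t0))           # generalized eigenvalue problem C(t) v = lam C(t0) v
          lam = np.sort(lam)[::-1]             # largest eigenvalue corresponds to the ground state
          E_eff = -np.log(lam) / (t - t0)      # effective energies
          print(t, np.round(E_eff, 4))         # approaches [0.35, 0.60]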

  15. Structure of multiphoton quantum optics. I. Canonical formalism and homodyne squeezed states

    NASA Astrophysics Data System (ADS)

    dell'Anno, Fabio; de Siena, Silvio; Illuminati, Fabrizio

    2004-03-01

    We introduce a formalism of nonlinear canonical transformations for general systems of multiphoton quantum optics. For single-mode systems the transformations depend on a tunable free parameter, the homodyne local-oscillator angle; for n-mode systems they depend on n heterodyne mixing angles. The canonical formalism realizes nontrivial mixing of pairs of conjugate quadratures of the electromagnetic field in terms of homodyne variables for single-mode systems, and in terms of heterodyne variables for multimode systems. In the first instance the transformations yield nonquadratic model Hamiltonians of degenerate multiphoton processes and define a class of non-Gaussian, nonclassical multiphoton states that exhibit properties of coherence and squeezing. We show that such homodyne multiphoton squeezed states are generated by unitary operators with a nonlinear time evolution that realizes the homodyne mixing of a pair of conjugate quadratures. Tuning of the local-oscillator angle allows us to vary at will the statistical properties of such states. We discuss the relevance of the formalism for the study of degenerate (up-)down-conversion processes. In a companion paper [F. Dell'Anno, S. De Siena, and F. Illuminati, 69, 033813 (2004)], we provide the extension of the nonlinear canonical formalism to multimode systems, we introduce the associated heterodyne multiphoton squeezed states, and we discuss their possible experimental realization.

  16. Structure of multiphoton quantum optics. I. Canonical formalism and homodyne squeezed states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dell'Anno, Fabio; De Siena, Silvio; Illuminati, Fabrizio

    2004-03-01

    We introduce a formalism of nonlinear canonical transformations for general systems of multiphoton quantum optics. For single-mode systems the transformations depend on a tunable free parameter, the homodyne local-oscillator angle; for n-mode systems they depend on n heterodyne mixing angles. The canonical formalism realizes nontrivial mixing of pairs of conjugate quadratures of the electromagnetic field in terms of homodyne variables for single-mode systems, and in terms of heterodyne variables for multimode systems. In the first instance the transformations yield nonquadratic model Hamiltonians of degenerate multiphoton processes and define a class of non-Gaussian, nonclassical multiphoton states that exhibit properties of coherence and squeezing. We show that such homodyne multiphoton squeezed states are generated by unitary operators with a nonlinear time evolution that realizes the homodyne mixing of a pair of conjugate quadratures. Tuning of the local-oscillator angle allows us to vary at will the statistical properties of such states. We discuss the relevance of the formalism for the study of degenerate (up-)down-conversion processes. In a companion paper [F. Dell'Anno, S. De Siena, and F. Illuminati, 69, 033813 (2004)], we provide the extension of the nonlinear canonical formalism to multimode systems, we introduce the associated heterodyne multiphoton squeezed states, and we discuss their possible experimental realization.

  17. Formal Methods in Air Traffic Management: The Case of Unmanned Aircraft Systems

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.

    2015-01-01

    As the technological and operational capabilities of unmanned aircraft systems (UAS) continue to grow, so too does the need to introduce these systems into civil airspace. Unmanned Aircraft Systems Integration in the National Airspace System is a NASA research project that addresses the integration of civil UAS into non-segregated airspace operations. One of the major challenges of this integration is the lack of an onboard pilot to comply with the legal requirement that pilots see and avoid other aircraft. The need to provide an equivalent to this requirement for UAS has motivated the development of a detect and avoid (DAA) capability to provide the appropriate situational awareness and maneuver guidance in avoiding and remaining well clear of traffic aircraft. Formal methods have played a fundamental role in the development of this capability. This talk reports on the formal methods work conducted under NASA's Safe Autonomous System Operations project in support of the development of DAA for UAS. This work includes specification of low-level and high-level functional requirements, formal verification of algorithms, and rigorous validation of software implementations. The talk also discusses technical challenges in formal methods research in the context of the development and safety analysis of advanced air traffic management concepts.

  18. Understanding visualization: a formal approach using category theory and semiotics.

    PubMed

    Vickers, Paul; Faith, Joe; Rossiter, Nick

    2013-06-01

    This paper combines the vocabulary of semiotics and category theory to provide a formal analysis of visualization. It shows how familiar processes of visualization fit the semiotic frameworks of both Saussure and Peirce, and extends these structures using the tools of category theory to provide a general framework for understanding visualization in practice, including: Relationships between systems, data collected from those systems, renderings of those data in the form of representations, the reading of those representations to create visualizations, and the use of those visualizations to create knowledge and understanding of the system under inspection. The resulting framework is validated by demonstrating how familiar information visualization concepts (such as literalness, sensitivity, redundancy, ambiguity, generalizability, and chart junk) arise naturally from it and can be defined formally and precisely. This paper generalizes previous work on the formal characterization of visualization by, inter alia, Ziemkiewicz and Kosara and allows us to formally distinguish properties of the visualization process that previous work does not.

  19. Evaluating a Control System Architecture Based on a Formally Derived AOCS Model

    NASA Astrophysics Data System (ADS)

    Ilic, Dubravka; Latvala, Timo; Varpaaniemi, Kimmo; Vaisanen, Pauli; Troubitsyna, Elena; Laibinis, Linas

    2010-08-01

    Attitude & Orbit Control System (AOCS) refers to a wider class of control systems which are used to determine and control the attitude of the spacecraft while in orbit, based on the information obtained from various sensors. In this paper, we propose an approach to evaluate a typical (yet somewhat simplified) AOCS architecture using formal development - based on the Event-B method. As a starting point, an Ada specification of the AOCS is translated into a formal specification and further refined to incorporate all the details of its original source code specification. This way we are able not only to evaluate the Ada specification by expressing and verifying specific system properties in our formal models, but also to determine how well the chosen modelling framework copes with the level of detail required for an actual implementation and code generation from the derived models.

  20. The Formal Semantics of PVS

    NASA Technical Reports Server (NTRS)

    Owre, Sam; Shankar, Natarajan

    1999-01-01

    A specification language is a medium for expressing what is computed rather than how it is computed. Specification languages share some features with programming languages but are also different in several important ways. For our purpose, a specification language is a logic within which the behavior of computational systems can be formalized. Although a specification can be used to simulate the behavior of such systems, we mainly use specifications to state and prove system properties with mechanical assistance. We present the formal semantics of the specification language of SRI's Prototype Verification System (PVS). This specification language is based on the simply typed lambda calculus. The novelty in PVS is that it contains very expressive language features whose static analysis (e.g., typechecking) requires the assistance of a theorem prover. The formal semantics illuminates several of the design considerations underlying PVS, the interaction between theorem proving and typechecking.

  1. Mending the Gap, An Effort to Aid the Transfer of Formal Methods Technology

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly

    2009-01-01

    Formal methods can be applied to many of the development and verification activities required for civil avionics software. RTCA/DO-178B, Software Considerations in Airborne Systems and Equipment Certification, gives a brief description of using formal methods as an alternate method of compliance with the objectives of that standard. Despite this, the avionics industry at large has been hesitant to adopt formal methods, and few developers have actually used formal methods for certification credit. Why is this so, given the volume of evidence of the benefits of formal methods? This presentation will explore some of the challenges to using formal methods in a certification context and describe the effort by the Formal Methods Subgroup of RTCA SC-205/EUROCAE WG-71 to develop guidance to make the use of formal methods a recognized approach.

  2. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  3. Cost Analysis of Non-Formal ETV Systems: A Case Study of the "Extra-Scolaire" System in the Ivory Coast.

    ERIC Educational Resources Information Center

    Klees, Steven J.

    Building on previous evaluations of the ETV systems--both formal and informal--of the Ivory Coast, this study examines the system costs of the "Extra Scolaire" (E/S) program for rural adults. Educational television is utilized through the Ivorian primary system, and battery operated televisions have been widely distributed to schools in…

  4. 7 CFR 1726.201 - Formal competitive bidding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...

  5. 7 CFR 1726.201 - Formal competitive bidding.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...

  6. 7 CFR 1726.201 - Formal competitive bidding.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...

  7. 7 CFR 1726.201 - Formal competitive bidding.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...

  8. 7 CFR 1726.201 - Formal competitive bidding.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... AGRICULTURE ELECTRIC SYSTEM CONSTRUCTION POLICIES AND PROCEDURES Procurement Procedures § 1726.201 Formal... formal competitive bidding: (a) Selection of qualified bidders. The borrower (acting through its engineer... § 1726.23). (b) Invitations to bid. The borrower (acting through its engineer, if applicable) is...

  9. On the Equivalence of Formal Grammars and Machines.

    ERIC Educational Resources Information Center

    Lund, Bruce

    1991-01-01

    Explores concepts of formal language and automata theory underlying computational linguistics. A computational formalism known as a "logic grammar" is described, with which computational systems process linguistic data, with examples in declarative and procedural semantics and definite clause grammars. (13 references) (CB)

  10. Toward the Characterization of Non-Formal Pedagogy.

    ERIC Educational Resources Information Center

    Silberman-Keller, Diana

    This study examined characteristic attributes of non-formal education and the non-formal pedagogy directing its teaching and learning processes. Data were collected on organizational and pedagogical characteristics in several out-of-school organizations (youth movements, youth organizations, community centers, bypass educational systems, local…

  11. Breathing New Life into Education for Life: A Reconceptualisation of Non-Formal Education with a Focus on the Melanesian Pacific.

    ERIC Educational Resources Information Center

    Reymer, Christina

    Melanesian education systems generally reflect the biases of their former Western colonial masters in that formal education is regarded as a means of preparing for employment in a formal market economy. This bias is evident in resource allocation, with formal education getting the lion's share of education spending. Focusing on the market economy…

  12. Family Literacy and the New Canadian: Formal, Non-Formal and Informal Learning: The Case of Literacy, Essential Skills and Language Learning in Canada

    ERIC Educational Resources Information Center

    Eaton, Sarah Elaine

    2011-01-01

    This paper examines literacy and language learning across the lifespan within the context of immigrants in the Canadian context. It explores the process of improving literacy skills and acquiring second or third language skills through the systems of formal, non-formal and informal learning, as defined by the OECD [Organisation for Economic…

  13. ARIES: Acquisition of Requirements and Incremental Evolution of Specifications

    NASA Technical Reports Server (NTRS)

    Roberts, Nancy A.

    1993-01-01

    This paper describes a requirements/specification environment specifically designed for large-scale software systems. This environment is called ARIES (Acquisition of Requirements and Incremental Evolution of Specifications). ARIES provides assistance to requirements analysts for developing operational specifications of systems. This development begins with the acquisition of informal system requirements. The requirements are then formalized and gradually elaborated (transformed) into formal and complete specifications. ARIES provides guidance to the user in validating formal requirements by translating them into natural language representations and graphical diagrams. ARIES also provides ways of analyzing the specification to ensure that it is correct, e.g., testing the specification against a running simulation of the system to be built. Another important ARIES feature, especially when developing large systems, is the sharing and reuse of requirements knowledge. This leads to much less duplication of effort. ARIES combines all of its features in a single environment that makes the process of capturing a formal specification quicker and easier.

  14. Demonstration of Steam Injection/Extraction Treatment of a DNAPL Source Zone at Launch Complex 34 in Cape Canaveral Air Force Station, Final Innovative Technology Evaluation Report

    EPA Science Inventory

    The Interagency DNAPL Consortium (IDC) was formally established in 1999 by the U.S. Department of Energy, U.S. Environmental Protection Agency, the U.S. Department of Defense, and the National Aeronautics and Space Administration. The IDC performed five remediation techniques: ...

  15. On the formalization and reuse of scientific research.

    PubMed

    King, Ross D; Liakata, Maria; Lu, Chuan; Oliver, Stephen G; Soldatova, Larisa N

    2011-10-07

    The reuse of scientific knowledge obtained from one investigation in another investigation is basic to the advance of science. Scientific investigations should therefore be recorded in ways that promote the reuse of the knowledge they generate. The use of logical formalisms to describe scientific knowledge has potential advantages in facilitating such reuse. Here, we propose a formal framework for using logical formalisms to promote reuse. We demonstrate the utility of this framework by using it in a worked example from biology: demonstrating cycles of investigation formalization [F] and reuse [R] to generate new knowledge. We first used logic to formally describe a Robot scientist investigation into yeast (Saccharomyces cerevisiae) functional genomics [f(1)]. With Robot scientists, unlike human scientists, the production of comprehensive metadata about their investigations is a natural by-product of the way they work. We then demonstrated how this formalism enabled the reuse of the research in investigating yeast phenotypes [r(1) = R(f(1))]. This investigation found that the removal of non-essential enzymes generally resulted in enhanced growth. The phenotype investigation was then formally described using the same logical formalism as the functional genomics investigation [f(2) = F(r(1))]. We then demonstrated how this formalism enabled the reuse of the phenotype investigation to investigate yeast systems-biology modelling [r(2) = R(f(2))]. This investigation found that yeast flux-balance analysis models fail to predict the observed changes in growth. Finally, the systems biology investigation was formalized for reuse in future investigations [f(3) = F(r(2))]. These cycles of reuse are a model for the general reuse of scientific knowledge.

  16. Applying formal methods and object-oriented analysis to existing flight software

    NASA Technical Reports Server (NTRS)

    Cheng, Betty H. C.; Auernheimer, Brent

    1993-01-01

    Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.

  17. NASA Langley's Formal Methods Research in Support of the Next Generation Air Transportation System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Cesar A.

    2008-01-01

    This talk will provide a brief introduction to the formal methods developed at NASA Langley and the National Institute for Aerospace (NIA) for air traffic management applications. NASA Langley's formal methods research supports the Interagency Joint Planning and Development Office (JPDO) effort to define and develop the 2025 Next Generation Air Transportation System (NGATS). The JPDO was created by the passage of the Vision 100 Century of Aviation Reauthorization Act in Dec 2003. The NGATS vision calls for a major transformation of the nation's air transportation system that will enable growth to 3 times the traffic of the current system. The transformation will require an unprecedented level of safety-critical automation used in complex procedural operations based on 4-dimensional (4D) trajectories that enable dynamic reconfiguration of airspace scalable to geographic and temporal demand. The goal of our formal methods research is to provide verification methods that can be used to ensure the safety of the NGATS system. Our work has focused on the safety assessment of concepts of operation and fundamental algorithms for conflict detection and resolution (CD&R) and self-spacing in the terminal area. Formal analysis of a concept of operations is a novel area of application of formal methods. Here one must establish that a system concept involving aircraft, pilots, and ground resources is safe. The formal analysis of algorithms is a more traditional endeavor. However, the formal analysis of ATM algorithms involves reasoning about the interaction of algorithmic logic and aircraft trajectories defined over an airspace. These trajectories are described using 2D and 3D vectors and are often constrained by trigonometric relations. Thus, in many cases it has been necessary to unload the full power of an advanced theorem prover. The verification challenge is to establish that the safety-critical algorithms produce valid solutions that are guaranteed to maintain separation under all possible scenarios. Current research has assumed perfect knowledge of the location of other aircraft in the vicinity so absolute guarantees are possible, but increasingly we are relaxing the assumptions to allow incomplete, inaccurate, and/or faulty information from communication sources.

  18. Systems engineering principles for the design of biomedical signal processing systems.

    PubMed

    Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo

    2011-06-01

    Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capturing, specification definition, implementation and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and over a period of time may cause serious damage to many of the body systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation. This implementation was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  19. Formal System Verification for Trustworthy Embedded Systems

    DTIC Science & Technology

    2011-04-19

    [Extraction fragment] Building on previously achieved code-level formal verification of the seL4 microkernel, the project designed and implemented, over 12 months with 0.6 FTE, a secure network access device (SAC) on top of the verified seL4 microkernel; the device allows a trusted front... (Cited reference fragment: Engelhardt, Kolanski, Norrish, Sewell, Tuch, and Winwood, "seL4: Formal verification of an OS kernel," CACM 53(6):107.)

  20. Formal Methods Specification and Analysis Guidebook for the Verification of Software and Computer Systems. Volume 2; A Practitioner's Companion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The 1st volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.

  1. Formal implementation of a performance evaluation model for the face recognition system.

    PubMed

    Shin, Yong-Nyuo; Kim, Jason; Lee, Yong-Jun; Shin, Woochang; Choi, Jin-Young

    2008-01-01

    Due to usability features, practical applications, and its lack of intrusiveness, face recognition technology, based on information derived from individuals' facial features, has been attracting considerable attention recently. Reported recognition rates of commercialized face recognition systems cannot be admitted as official recognition rates, as they are based on assumptions that are beneficial to the specific system and face database. Therefore, performance evaluation methods and tools are necessary to objectively measure the accuracy and performance of any face recognition system. In this paper, we propose and formalize a performance evaluation model for biometric recognition systems, implementing an evaluation tool for face recognition systems based on the proposed model. Furthermore, we performed evaluations objectively by providing guidelines for the design and implementation of a performance evaluation system, formalizing the performance test process.
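
    A performance evaluation of the kind being formalized typically reduces to computing error rates such as the false accept rate (FAR) and false reject rate (FRR) from genuine and impostor match scores. The sketch below is a generic illustration with synthetic scores, not the proposed evaluation model itself.

      # FAR/FRR of a face matcher at several score thresholds, from synthetic score samples.
      import random

      random.seed(0)
      genuine  = [random.gauss(0.80, 0.08) for _ in range(1000)]   # same-person comparison scores
      impostor = [random.gauss(0.40, 0.10) for _ in range(1000)]   # different-person comparison scores

      def far_frr(threshold):
          far = sum(s >= threshold for s in impostor) / len(impostor)   # false accepts
          frr = sum(s <  threshold for s in genuine)  / len(genuine)    # false rejects
          return far, frr

      for thr in (0.5, 0.6, 0.7):
          far, frr = far_frr(thr)
          print(f"threshold={thr:.1f}  FAR={far:.3f}  FRR={frr:.3f}")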

  2. Software Design for Real-Time Systems on Parallel Computers: Formal Specifications.

    DTIC Science & Technology

    1996-04-01

    This research investigated important issues related to the analysis and design of real-time systems targeted to parallel architectures. In particular, software specification models for real-time systems on parallel architectures were evaluated. A survey of current formal methods for uniprocessor real-time system specification was conducted to determine their extensibility to specifying real-time systems on parallel architectures.

  3. A Linguistic Truth-Valued Temporal Reasoning Formalism and Its Implementation

    NASA Astrophysics Data System (ADS)

    Lu, Zhirui; Liu, Jun; Augusto, Juan C.; Wang, Hui

    Temporality and uncertainty are important features of many real world systems. Solving problems in such systems requires the use of formal mechanism such as logic systems, statistical methods or other reasoning and decision-making methods. In this paper, we propose a linguistic truth-valued temporal reasoning formalism to enable the management of both features concurrently using a linguistic truth valued logic and a temporal logic. We also provide a backward reasoning algorithm which allows the answering of user queries. A simple but realistic scenario in a smart home application is used to illustrate our work.

  4. The Measurement Process in the Generalized Contexts Formalism for Quantum Histories

    NASA Astrophysics Data System (ADS)

    Losada, Marcelo; Vanni, Leonardo; Laura, Roberto

    2016-02-01

    In the interpretations of quantum mechanics involving quantum histories there is no collapse postulate, and the measurement is considered as a quantum interaction between the measured system and the measuring instrument. For two consecutive non-ideal measurements on the same system, we prove that both pointer indications at the end of each measurement are compatible properties in our generalized context formalism for quantum histories. Immediately after the first measurement, an effective state for the measured system is deduced from the formalism, generalizing the state that would be obtained by applying the state collapse postulate.

  5. Non-Formal Education--A Worthwhile Alternative to the Formal Education in India? Case Studies from Ganjam, Orissa. Reprints and Miniprints, No. 757.

    ERIC Educational Resources Information Center

    Svensson, Anna

    This report discusses the advantages and disadvantages of non-formal education (NFE) compared to the formal school system in Ganjam, a rural district on the east coast of Orissa, India. The aim of the research was to investigate whether or not NFE, would be a worthy target of aid from the Swedish aid organization SIDA (Swedish International…

  6. Location priority for non-formal early childhood education school based on promethee method and map visualization

    NASA Astrophysics Data System (ADS)

    Ayu Nurul Handayani, Hemas; Waspada, Indra

    2018-05-01

    Non-formal Early Childhood Education (non-formal ECE) is education provided for children under 4 years old. In the District of Banyumas, non-formal ECE is monitored by the district government and supported by Sanggar Kegiatan Belajar (SKB) Purwokerto, one of the organizers of non-formal education. The government has a program for extending ECE to all villages in Indonesia; however, the locations where ECE schools should be constructed in the coming years have not yet been determined. Therefore, to support that program, a decision support system was built to recommend villages for constructing ECE buildings. The data are projected with Brown's double exponential smoothing method, and the Preference Ranking Organization Method for Enrichment Evaluation (Promethee) is used to generate the priority order. As a recommendation system, it produces a map visualization colored according to the priority level of each sub-district and village. The system was tested with black-box testing, Promethee testing, and usability testing; the results showed that the system functionality and the Promethee algorithm worked correctly and that users were satisfied.
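
    The two computational ingredients named above can be sketched generically as follows: Brown's double exponential smoothing to project a demand-related criterion, and a minimal PROMETHEE-style net-flow ranking over the projected values. All criteria, weights, and figures below are illustrative assumptions, not the system's actual data or preference functions.

      # Brown's double exponential smoothing + a minimal PROMETHEE-style ranking (illustrative).
      def brown_forecast(series, alpha=0.5, horizon=1):
          """Brown's double exponential smoothing forecast, `horizon` steps ahead."""
          s1 = s2 = series[0]
          for y in series[1:]:
              s1 = alpha * y + (1 - alpha) * s1
              s2 = alpha * s1 + (1 - alpha) * s2
          a, b = 2 * s1 - s2, alpha / (1 - alpha) * (s1 - s2)
          return a + b * horizon

      def promethee_net_flows(scores, weights):
          """Net outranking flows with the simple 'usual' preference function."""
          names, n = list(scores), len(scores)
          flows = {}
          for a in names:
              phi = 0.0
              for b in names:
                  if a == b:
                      continue
                  pref_ab = sum(w for c, w in weights.items() if scores[a][c] > scores[b][c])
                  pref_ba = sum(w for c, w in weights.items() if scores[b][c] > scores[a][c])
                  phi += (pref_ab - pref_ba) / (n - 1)
              flows[a] = phi
          return flows

      # Made-up yearly counts of children under 4 per village, projected one year ahead.
      history = {"Village A": [120, 130, 138], "Village B": [90, 95, 92], "Village C": [150, 160, 175]}
      projected = {v: brown_forecast(h) for v, h in history.items()}

      # Two illustrative criteria (both treated as "higher = more urgent"):
      # projected demand and distance to the nearest existing ECE school.
      scores = {v: {"demand": projected[v], "distance_km": d} for v, d in zip(projected, (3.0, 1.0, 5.0))}
      weights = {"demand": 0.6, "distance_km": 0.4}

      for village, phi in sorted(promethee_net_flows(scores, weights).items(), key=lambda kv: -kv[1]):
          print(f"{village}: net flow {phi:+.2f}")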

  7. Automatically Grading Customer Confidence in a Formal Specification.

    ERIC Educational Resources Information Center

    Shukur, Zarina; Burke, Edmund; Foxley, Eric

    1999-01-01

    Describes an automatic grading system for a formal methods computer science course that is able to evaluate a formal specification written in the Z language. Quality is measured by considering first, specification correctness (syntax, semantics, and satisfaction of customer requirements), and second, specification maintainability (comparison of…

  8. IDEF3 Formalization Report

    DTIC Science & Technology

    1991-10-01

    [Extraction fragment: report documentation page] Subject terms include: engineering management, information systems, method formalization, information engineering, process modeling, information systems requirements definition methods, knowledge acquisition methods, and systems engineering. (Cited reference fragment: Corynen, G. C., 1975, A Mathematical Theory of Modeling and Simulation, Ph.D. Dissertation.)

  9. Initiating Formal Requirements Specifications with Object-Oriented Models

    NASA Technical Reports Server (NTRS)

    Ampo, Yoko; Lutz, Robyn R.

    1994-01-01

    This paper reports results of an investigation into the suitability of object-oriented models as an initial step in developing formal specifications. The requirements for two critical system-level software modules were used as target applications. It was found that creating object-oriented diagrams prior to formally specifying the requirements enhanced the accuracy of the initial formal specifications and reduced the effort required to produce them. However, the formal specifications incorporated some information not found in the object-oriented diagrams, such as higher-level strategy or goals of the software.

  10. A linguistic geometry for space applications

    NASA Technical Reports Server (NTRS)

    Stilman, Boris

    1994-01-01

    We develop a formal theory, the so-called Linguistic Geometry, in order to discover the inner properties of human expert heuristics that were successful in a certain class of complex control systems, and to apply them to different systems. This research relies on the formalization of the search heuristics of highly skilled human experts, which allow a complex system to be decomposed into a hierarchy of subsystems and thus make intractable problems solvable by reducing the search. The hierarchy of subsystems is represented as a hierarchy of formal attribute languages. This paper includes a formal survey of Linguistic Geometry and a new example of the solution of an optimization problem for space robotic vehicles. This example includes the actual generation of the hierarchy of languages and some details of trajectory generation, and demonstrates a drastic reduction of search in comparison with conventional search algorithms.

  11. A Comparative Study of Pre-Service Education for Preschool Teachers in China and the United States

    ERIC Educational Resources Information Center

    Gong, Xin; Wang, Pengcheng

    2017-01-01

    This study provides a comparative analysis of the pre-service education system for preschool educators in China and the United States. Based on collected data and materials (literature, policy documents, and statistical data), we compare two areas of pre-service training: (1) the formal system; (2) the informal system. In the formal system, most…

  12. Planform: an application and database of graph-encoded planarian regenerative experiments.

    PubMed

    Lobo, Daniel; Malone, Taylor J; Levin, Michael

    2013-04-15

    Understanding the mechanisms governing the regeneration capabilities of many organisms is a fundamental interest in biology and medicine. An ever-increasing number of manipulation and molecular experiments are attempting to discover a comprehensive model for regeneration, with the planarian flatworm being one of the most important model species. Despite much effort, no comprehensive, constructive, mechanistic models exist yet, and it is now clear that computational tools are needed to mine this huge dataset. However, until now, there has been no database of regenerative experiments, and the current genotype-phenotype ontologies and databases are based on textual descriptions, which are not understandable by computers. To overcome these difficulties, we present here Planform (Planarian formalization), a manually curated database and software tool for planarian regenerative experiments, based on a mathematical graph formalism. The database contains more than a thousand experiments from the main publications in the planarian literature. The software tool provides the user with a graphical interface to easily interact with and mine the database. The presented system is a valuable resource for the regeneration community and, more importantly, will pave the way for the application of novel artificial intelligence tools to extract knowledge from this dataset. The database and software tool are freely available at http://planform.daniel-lobo.com.

  13. Formalizing New Navigation Requirements for NASA's Space Shuttle

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CRs) were selected as promising targets to demonstrate the utility of formal methods in this demanding application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this industrial usage report. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During a limited analysis conducted on the formal specifications, numerous requirements issues were discovered. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  14. Ontology or formal ontology

    NASA Astrophysics Data System (ADS)

    Žáček, Martin

    2017-07-01

    Ontology or formal ontology? Which term is correct? The aim of this article is to introduce the correct terms and explain their basis. An ontology describes a particular area of interest (domain) in a formal way: it defines the classes of objects in that area and the relationships that may exist between them. The value of an ontology lies mainly in facilitating communication between people, improving the collaboration of software systems, and improving systems engineering. In all these areas, ontologies offer the possibility of a unified view while maintaining consistency and unambiguity.

  15. The Archival Photograph and Its Meaning: Formalisms for Modeling Images

    ERIC Educational Resources Information Center

    Benson, Allen C.

    2009-01-01

    This article explores ontological principles and their potential applications in the formal description of archival photographs. Current archival descriptive practices are reviewed and the larger question is addressed: do archivists who are engaged in describing photographs need a more formalized system of representation, or do existing encoding…

  16. What can formal methods offer to digital flight control systems design

    NASA Technical Reports Server (NTRS)

    Good, Donald I.

    1990-01-01

    Formal methods research is beginning to produce methods that will enable mathematical modeling of the physical behavior of digital hardware and software systems. The development of these methods directly supports the NASA mission of increasing the scope and effectiveness of flight system modeling capabilities. The conventional, continuous mathematics that is used extensively in modeling flight systems is not adequate for accurate modeling of digital systems. Therefore, the current practice of digital flight control system design has not had the benefits of extensive mathematical modeling that are common in other parts of flight system engineering. Formal methods research shows that by using discrete mathematics, very accurate modeling of digital systems is possible. These discrete modeling methods will bring the traditional benefits of modeling to digital hardware and software design. Sound reasoning about accurate mathematical models of flight control systems can be an important part of reducing the risk of unsafe flight control.

  17. System for line drawings interpretation

    NASA Astrophysics Data System (ADS)

    Boatto, L.; Consorti, Vincenzo; Del Buono, Monica; Eramo, Vincenzo; Esposito, Alessandra; Melcarne, F.; Meucci, Mario; Mosciatti, M.; Tucci, M.; Morelli, Arturo

    1992-08-01

    This paper describes an automatic system that extracts information from line drawings, in order to feed CAD or GIS systems. The line drawings that we analyze contain interconnected thin lines, dashed lines, text, and symbols. Characters and symbols may overlap with lines. Our approach is based on the properties of the run representation of a binary image, which allow the image to be given a graph structure. Using this graph structure, several algorithms have been designed to identify, directly in the raster image, straight segments, dashed lines, text, symbols, hatching lines, etc. Straight segments and dashed lines are converted into vectors, with high accuracy and good noise immunity. Characters and symbols are recognized by means of a recognizer, specifically developed for this application, designed to be insensitive to rotation and scaling. Subsequent processing steps include an "intelligent" search through the graph in order to detect closed polygons, dashed lines, text strings, and other higher-level logical entities, followed by the identification of relationships (adjacency, inclusion, etc.) between them. Relationships are further translated into a formal description of the drawing. The output of the system can be used as input to a Geographic Information System package. The system is currently used by the Italian Land Register Authority to process cadastral maps.

  18. ADHD: From Intervention to Implementation

    ERIC Educational Resources Information Center

    Chaban, Peter

    2010-01-01

    Attention deficit/hyperactivity disorder (ADHD), a chronic neurological disorder, is not formally recognized in the educational systems across Canada. As a result, there is little opportunity for collaboration or sharing of information between the medical/research community and the educational system. Because ADHD is not formally identified,…

  19. Crisis Management for Secondary Education: A Survey of Secondary Education Directors in Greece

    ERIC Educational Resources Information Center

    Savelides, Socrates; Mihiotis, Athanassios; Koutsoukis, Nikitas-Spiros

    2015-01-01

    Purpose: The Greek secondary education system lacks a formal crisis management system. The purpose of this paper is to address this problem as follows: elicit current crisis management practices, outline features for designing a formal crisis management system in Greece. Design/methodology/approach: The research is based on a survey conducted with…

  20. Pensive Professionalism: The Role of 'Required Reflection' on a Professional Doctorate

    ERIC Educational Resources Information Center

    Cunningham, Bryan

    2018-01-01

    This short paper examines the origins and nature of the reflective writing that is presently required on one part-taught doctorate in education (EdD) programme. It explores the various ways in which EdD candidates have engaged with self-reflection, using a number of extracts from writing submitted for formal assessments (including of the doctoral…

  1. A Simple Method for Nucleon-Nucleon Cross Sections in a Nucleus

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Cucinotta, Francis A.; Wilson, John W.

    1999-01-01

    A simple reliable formalism is presented for obtaining nucleon-nucleon cross sections within a nucleus in nuclear collisions for a given projectile and target nucleus combination at a given energy for use in transport, Monte Carlo, and other calculations. The method relies on extraction of these values from experiments and has been tested and found to give excellent results.

  2. META-GLARE: A meta-system for defining your own computer interpretable guideline system-Architecture and acquisition.

    PubMed

    Bottrighi, Alessio; Terenziani, Paolo

    2016-09-01

    Several different computer-assisted management systems for computer interpretable guidelines (CIGs) have been developed by the Artificial Intelligence in Medicine community. Each CIG system is characterized by a specific formalism to represent CIGs and usually provides a manager to acquire, consult and execute them. Though there are several commonalities between most formalisms in the literature, each formalism has its own peculiarities. The goal of our work is to provide flexible support for the extension or definition of CIG formalisms and of their acquisition and execution engines. Instead of defining "yet another CIG formalism and its manager", we propose META-GLARE (META Guideline Acquisition, Representation, and Execution), a "meta"-system to define new CIG systems. We try to capture the commonalities among current CIG approaches by providing (i) a general manager for the acquisition, consultation and execution of hierarchical graphs (representing the control flow of actions in CIGs), parameterized over the types of nodes and arcs constituting them, and (ii) a library of elementary components of guideline nodes (actions) and arcs, in which each type definition specifies how objects of that type can be acquired, consulted and executed. We provide generality and flexibility by allowing free aggregations of such elementary components to define new primitive node and arc types. We carried out several experiments in which we used META-GLARE to build a CIG system (Experiment 1 in Section 8) or to extend one (Experiments 2 and 3). These experiments show that META-GLARE provides useful and easy-to-use support for such tasks. For instance, re-building the Guideline Acquisition, Representation, and Execution (GLARE) system using META-GLARE required less than one day (Experiment 1). META-GLARE is a meta-system for CIGs that supports fast prototyping. Since it provides acquisition and execution engines that are parametric over the specific CIG formalism, it supports easy update and construction of CIG systems. Copyright © 2016 Elsevier B.V. All rights reserved.
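
    To make the parametric node idea concrete, here is a minimal sketch of a type library whose entries know how to be acquired, consulted and executed; the names (NodeType, register, the "query" action) are illustrative assumptions, not the actual META-GLARE API.

    ```python
    from dataclasses import dataclass
    from typing import Callable, Dict

    @dataclass
    class NodeType:
        """An elementary node type: how instances are acquired, consulted and executed."""
        name: str
        acquire: Callable[[dict], dict]   # build a node from user-supplied attributes
        consult: Callable[[dict], str]    # render a node for browsing
        execute: Callable[[dict], None]   # run the node during guideline execution

    REGISTRY: Dict[str, NodeType] = {}

    def register(node_type: NodeType) -> None:
        """Add an elementary component to the library; new CIG systems pick from it."""
        REGISTRY[node_type.name] = node_type

    # A hypothetical "query" action assembled from the library
    register(NodeType(
        name="query",
        acquire=lambda attrs: {"kind": "query", **attrs},
        consult=lambda node: f"Ask clinician: {node['question']}",
        execute=lambda node: print(f"[exec] {node['question']}"),
    ))

    node = REGISTRY["query"].acquire({"question": "Is the patient febrile?"})
    print(REGISTRY["query"].consult(node))
    REGISTRY["query"].execute(node)
    ```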

  3. How compatible are participatory ergonomics programs with occupational health and safety management systems?

    PubMed

    Yazdani, Amin; Neumann, W Patrick; Imbeau, Daniel; Bigelow, Philip; Pagell, Mark; Theberge, Nancy; Hilbrecht, Margo; Wells, Richard

    2015-03-01

    Musculoskeletal disorders (MSD) are a major cause of pain, disability, and costs. Prevention of MSD at work is frequently described in terms of implementing an ergonomics program, often a participatory ergonomics (PE) program. Most other workplace injury prevention activities take place under the umbrella of a formal or informal occupational health and safety management system (OHSMS). This study assesses the similarities and differences between OHSMS and PE, as such knowledge could help improve MSD prevention activities. Using the internationally recognized Occupational Health and Safety Assessment Series (OHSAS 18001), 21 OHSMS elements were extracted. In order to define PE operationally, we identified the 20 most frequently cited papers on PE and extracted content relevant to each of the OHSAS 18001 elements. The PE literature provided a substantial amount of detail on five elements: (i) hazard identification, risk assessment and determining controls; (ii) resources, roles, responsibility, accountability, and authority; (iii) competence, training and awareness; (iv) participation and consultation; and (v) performance measurement and monitoring. However, of the 21 OHSAS elements, the PE literature was silent on 8 and provided few details on 8 others. The PE literature did not speak to many elements described in OHSMS, and even when it did, the language used was often different. This may negatively affect the effectiveness and sustainability of PE initiatives within organizations. It is expected that paying attention to the approaches and language used in management system frameworks could make MSD prevention activities more effective and sustainable.

  4. Modeling and Verification of Dependable Electronic Power System Architecture

    NASA Astrophysics Data System (ADS)

    Yuan, Ling; Fan, Ping; Zhang, Xiao-fang

    The electronic power system can be viewed as a system composed of a set of concurrently interacting subsystems that generate, transmit, and distribute electric power. The complex interaction among subsystems makes the design of an electronic power system complicated. Furthermore, in order to guarantee the safe generation and distribution of electric power, fault-tolerant mechanisms are incorporated in the system design to satisfy high reliability requirements. As a result, this incorporation makes the design of such systems even more complicated. We propose a dependable electronic power system architecture, which provides a generic framework to guide the development of electronic power systems and ease the development complexity. In order to provide common idioms and patterns to system designers, we formally model the electronic power system architecture using the PVS formal language. Based on the PVS model of this system architecture, we formally verify the fault-tolerant properties of the system architecture using the PVS theorem prover, which guarantees that the system architecture satisfies high reliability requirements.

  5. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

    High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present cellular imaging markup language, an extensible markup language-based language for modeling of biological images and representation of spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.

  6. ADGS-2100 Adaptive Display and Guidance System Window Manager Analysis

    NASA Technical Reports Server (NTRS)

    Whalen, Mike W.; Innis, John D.; Miller, Steven P.; Wagner, Lucas G.

    2006-01-01

    Recent advances in modeling languages have made it feasible to formally specify and analyze the behavior of large system components. Synchronous data flow languages, such as Lustre, SCR, and RSML-e are particularly well suited to this task, and commercial versions of these tools such as SCADE and Simulink are growing in popularity among designers of safety critical systems, largely due to their ability to automatically generate code from the models. At the same time, advances in formal analysis tools have made it practical to formally verify important properties of these models to ensure that design defects are identified and corrected early in the lifecycle. This report describes how these tools have been applied to the ADGS-2100 Adaptive Display and Guidance Window Manager being developed by Rockwell Collins Inc. This work demonstrates how formal methods can be easily and cost-efficiently used to remove defects early in the design cycle.

  7. Beyond formalism

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1991-01-01

    The ongoing debate over the role of formalism and formal specifications in software features many speakers with diverse positions. Yet, in the end, they share the conviction that the requirements of a software system can be unambiguously specified, that acceptable software is a product demonstrably meeting the specifications, and that the design process can be carried out with little interaction between designers and users once the specification has been agreed to. This conviction is part of a larger paradigm prevalent in American management thinking, which holds that organizations are systems that can be precisely specified and optimized. This paradigm, which traces historically to the works of Frederick Taylor in the early 1900s, is no longer sufficient for organizations and software systems today. In the domain of software, a new paradigm, called user-centered design, overcomes the limitations of pure formalism. Pioneered in Scandinavia, user-centered design is spreading through Europe and is beginning to make its way into the U.S.

  8. A Formal Approach to Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.

  9. Partition-free theory of time-dependent current correlations in nanojunctions in response to an arbitrary time-dependent bias

    NASA Astrophysics Data System (ADS)

    Ridley, Michael; MacKinnon, Angus; Kantorovich, Lev

    2017-04-01

    Working within the nonequilibrium Green's function formalism, a formula for the two-time current correlation function is derived for the case of transport through a nanojunction in response to an arbitrary time-dependent bias. The one-particle Hamiltonian and the wide-band limit approximation are assumed, enabling us to extract all necessary Green's functions and self-energies for the system, extending the analytic work presented previously [Ridley et al., Phys. Rev. B 91, 125433 (2015), 10.1103/PhysRevB.91.125433]. We show that our expression for the two-time correlation function generalizes the Büttiker theory of shot and thermal noise on the current through a nanojunction to the time-dependent bias case including the transient regime following the switch-on. Transient terms in the correlation function arise from an initial state that does not assume (as is usually done) that the system is initially uncoupled, i.e., our approach is partition free. We show that when the bias loses its time dependence, the long-time limit of the current correlation function depends on the time difference only, as in this case an ideal steady state is reached. This enables derivation of known results for the single-frequency power spectrum and for the zero-frequency limit of this power spectrum. In addition, we present a technique which facilitates fast calculations of the transient quantum noise, valid for arbitrary temperature, time, and voltage scales. We apply this formalism to a molecular wire system for both dc and ac biases, and find a signature of the traversal time for electrons crossing the wire in the time-dependent cross-lead current correlations.
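
    For orientation, the quantity discussed above is the two-time current correlation function; written in the symmetrized convention commonly used in quantum-noise work (an assumption here, since the paper may use a different convention or normalization), it reads

    $$ C_{\alpha\beta}(t_1,t_2) = \tfrac{1}{2}\big\langle \{\Delta\hat I_\alpha(t_1),\,\Delta\hat I_\beta(t_2)\}\big\rangle, \qquad \Delta\hat I_\alpha(t) = \hat I_\alpha(t) - \langle\hat I_\alpha(t)\rangle, $$

    where $\alpha,\beta$ label the leads and $\{\cdot,\cdot\}$ is the anticommutator; in an ideal steady state $C_{\alpha\beta}$ depends only on $t_1 - t_2$, and its Fourier transform gives the noise power spectrum mentioned in the abstract.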

  10. Formal Integrals and Noether Operators of Nonlinear Hyperbolic Partial Differential Systems Admitting a Rich Set of Symmetries

    NASA Astrophysics Data System (ADS)

    Startsev, Sergey Ya.

    2017-05-01

    The paper is devoted to hyperbolic (generally speaking, non-Lagrangian and nonlinear) partial differential systems possessing a full set of differential operators that map any function of one independent variable into a symmetry of the corresponding system. We demonstrate that a system has the above property if and only if this system admits a full set of formal integrals (i.e., differential operators which map symmetries into integrals of the system). As a consequence, such systems possess both direct and inverse Noether operators (in the terminology of a work by B. Fuchssteiner and A.S. Fokas who have used these terms for operators that map cosymmetries into symmetries and perform transformations in the opposite direction). Systems admitting Noether operators are not exhausted by Euler-Lagrange systems and the systems with formal integrals. In particular, a hyperbolic system admits an inverse Noether operator if a differential substitution maps this system into a system possessing an inverse Noether operator.

  11. Empowering out of School Youth through Non-Formal Education in Kenya

    ERIC Educational Resources Information Center

    Mualuko, Ndiku Judah

    2008-01-01

    Non-formal education, defined as any organized educational activity outside the established formal system, whether operating separately or as an important feature of some broader activity, that is intended to serve identifiable learning clienteles and learning objectives, is of great importance to society. It emerged out of the feeling that formal…

  12. Critical Analysis on Open Source LMSs Using FCA

    ERIC Educational Resources Information Center

    Sumangali, K.; Kumar, Ch. Aswani

    2013-01-01

    The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…
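
    As a toy illustration of a formal context of the kind described above, the FCA derivation operators can be computed directly; the LMS names and features below are invented placeholders, not the systems analysed in the paper.

    ```python
    # A formal context: objects (LMSs) x attributes (features)
    context = {
        "LMS-A": {"forum", "quiz", "scorm"},
        "LMS-B": {"forum", "quiz"},
        "LMS-C": {"quiz", "wiki"},
    }

    def intent(objects):
        """Attributes shared by all given objects (the prime operator on object sets)."""
        sets = [context[o] for o in objects]
        return set.intersection(*sets) if sets else {a for s in context.values() for a in s}

    def extent(attributes):
        """Objects possessing all given attributes (the prime operator on attribute sets)."""
        return {o for o, feats in context.items() if set(attributes) <= feats}

    # A formal concept is a pair (extent, intent) closed under the two operators
    A = extent({"quiz"})
    print(A, intent(A))   # all three LMSs share exactly the attribute 'quiz'
    ```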

  13. The generation of gravitational waves. 1. Weak-field sources: A plug-in-and-grind formalism

    NASA Technical Reports Server (NTRS)

    Thorne, K. S.; Kovacs, S. J.

    1974-01-01

    A plug-in-and-grind formalism is derived for calculating the gravitational waves emitted by any system with weak internal gravitational fields. If the internal fields have negligible influence on the system's motions, then the formalism reduces to standard linearized theory. Whether or not gravity affects the motions, if the motions are slow and internal stresses are weak, then the new formalism reduces to the standard quadrupole-moment formalism. In the general case the new formalism expresses the radiation in terms of a retarded Green's function for slightly curved spacetime, and then breaks the Green's-function integral into five easily understood pieces: direct radiation, produced directly by the motions of the sources; whump radiation, produced by the gravitational stresses of the source; transition radiation, produced by a time-changing time delay (Shapiro effect) in the propagation of the nonradiative, 1/r field of the source; focussing radiation, produced when one portion of the source focusses, in a time-dependent way, the nonradiative field of another portion of the source; and tail radiation, produced by backscatter of the nonradiative field in regions of focussing.

  14. Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B

    NASA Technical Reports Server (NTRS)

    Yeganefard, Sanaz; Butler, Michael; Rezazadeh, Abdolbaghi

    2010-01-01

    Recently a set of guidelines, or cookbook, has been developed for modelling and refinement of control problems in Event-B. The Event-B formal method is used for system-level modelling by defining states of a system and events which act on these states. It also supports refinement of models. This cookbook is intended to systematize the process of modelling and refining a control problem system by distinguishing environment, controller and command phenomena. Our main objective in this paper is to investigate and evaluate the usefulness and effectiveness of this cookbook by following it throughout the formal modelling of a cruise control system found in cars. The outcomes are the identification of the benefits of the cookbook and guidance for its future users.

  15. Deformation quantization of fermi fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galaviz, I.; Garcia-Compean, H.; Departamento de Fisica, Centro de Investigacion y de Estudios Avanzados del IPN, P.O. Box 14-740, 07000 Mexico, D.F.

    2008-04-15

    Deformation quantization for any Grassmann scalar free field is described via the Weyl-Wigner-Moyal formalism. The Stratonovich-Weyl quantizer, the Moyal *-product and the Wigner functional are obtained by extending the formalism proposed recently in [I. Galaviz, H. Garcia-Compean, M. Przanowski, F.J. Turrubiates, Weyl-Wigner-Moyal Formalism for Fermi Classical Systems, arXiv:hep-th/0612245] to the fermionic systems of infinite number of degrees of freedom. In particular, this formalism is applied to quantize the Dirac free field. It is observed that the use of suitable oscillator variables facilitates considerably the procedure. The Stratonovich-Weyl quantizer, the Moyal *-product, the Wigner functional, the normal ordering operator, and finally, the Dirac propagator have been found with the use of these variables.

  16. An Ontology for State Analysis: Formalizing the Mapping to SysML

    NASA Technical Reports Server (NTRS)

    Wagner, David A.; Bennett, Matthew B.; Karban, Robert; Rouquette, Nicolas; Jenkins, Steven; Ingham, Michel

    2012-01-01

    State Analysis is a methodology developed over the last decade for architecting, designing and documenting complex control systems. Although it was originally conceived for designing robotic spacecraft, recent applications include the design of control systems for large ground-based telescopes. The European Southern Observatory (ESO) began a project to design the European Extremely Large Telescope (E-ELT), which will require coordinated control of over a thousand articulated mirror segments. The designers are using State Analysis as a methodology and the Systems Modeling Language (SysML) as a modeling and documentation language in this task. To effectively apply the State Analysis methodology in this context it became necessary to provide ontological definitions of the concepts and relations in State Analysis and greater flexibility through a mapping of State Analysis into a practical extension of SysML. The ontology provides the formal basis for verifying compliance with State Analysis semantics including architectural constraints. The SysML extension provides the practical basis for applying the State Analysis methodology with SysML tools. This paper will discuss the method used to develop these formalisms (the ontology), the formalisms themselves, the mapping to SysML and approach to using these formalisms to specify a control system and enforce architectural constraints in a SysML model.

  17. Formal design specification of a Processor Interface Unit

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1992-01-01

    This report describes work to formally specify the requirements and design of a processor interface unit (PIU), a single-chip subsystem providing memory-interface, bus-interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system. This system, the Fault-Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. The need for high-quality design assurance in such applications is an undisputed fact, given the disastrous consequences that even a single design flaw can produce. Thus, the further development and application of formal methods to fault-tolerant systems is of critical importance as these systems see increasing use in modern society.

  18. Model-Driven Test Generation of Distributed Systems

    NASA Technical Reports Server (NTRS)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  19. Fourth NASA Langley Formal Methods Workshop

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)

    1997-01-01

    This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.

  20. State Event Models for the Formal Analysis of Human-Machine Interactions

    NASA Technical Reports Server (NTRS)

    Combefis, Sebastien; Giannakopoulou, Dimitra; Pecheur, Charles

    2014-01-01

    The work described in this paper was motivated by our experience with applying a framework for formal analysis of human-machine interactions (HMI) to a realistic model of an autopilot. The framework is built around a formally defined conformance relation called "fullcontrol" between an actual system and the mental model according to which the system is operated. Systems are well-designed if they can be described by relatively simple, full-control, mental models for their human operators. For this reason, our framework supports automated generation of minimal full-control mental models for HMI systems, where both the system and the mental models are described as labelled transition systems (LTS). The autopilot that we analysed has been developed in the NASA Ames HMI prototyping tool ADEPT. In this paper, we describe how we extended the models that our HMI analysis framework handles to allow adequate representation of ADEPT models. We then provide a property-preserving reduction from these extended models to LTSs, to enable application of our LTS-based formal analysis algorithms. Finally, we briefly discuss the analyses we were able to perform on the autopilot model with our extended framework.

  1. Defining the IEEE-854 floating-point standard in PVS

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.

    1995-01-01

    A significant portion of the ANSI/IEEE-854 Standard for Radix-Independent Floating-Point Arithmetic is defined in PVS (Prototype Verification System). Since IEEE-854 is a generalization of the ANSI/IEEE-754 Standard for Binary Floating-Point Arithmetic, the definition of IEEE-854 in PVS also formally defines much of IEEE-754. This collection of PVS theories provides a basis for machine-checked verification of floating-point systems. This formal definition illustrates that formal specification techniques are sufficiently advanced that it is reasonable to consider their use in the development of future standards.

  2. Towards an Automated Development Methodology for Dependable Systems with Application to Sensor Networks

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.

  3. REQUIREMENTS PATTERNS FOR FORMAL CONTRACTS IN ARCHITECTURAL ANALYSIS AND DESIGN LANGUAGE (AADL) MODELS

    DTIC Science & Technology

    2017-04-17

    Subject terms: Cyberphysical Systems, Formal Methods, Requirements Patterns, AADL, Assume Guarantee Reasoning Environment. Rockwell Collins has been addressing these challenges by developing compositional reasoning methods that permit the verification of systems that exceed…

  4. Unified formalism for the generalized kth-order Hamilton-Jacobi problem

    NASA Astrophysics Data System (ADS)

    Colombo, Leonardo; de Léon, Manuel; Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso

    2014-08-01

    The geometric formulation of the Hamilton-Jacobi theory enables us to generalize it to systems of higher-order ordinary differential equations. In this work we introduce the unified Lagrangian-Hamiltonian formalism for the geometric Hamilton-Jacobi theory on higher-order autonomous dynamical systems described by regular Lagrangian functions.

  5. Development of German-English Machine Translation System.

    ERIC Educational Resources Information Center

    Lehmann, Winifred P.; Stachowitz, Rolf

    This report documents efforts over a five-month period toward completion of a pilot system for machine translation of German scientific and technical literature into English. The report is divided into three areas: grammar formalism, programming, and linguistics. Work on grammar formalism concentrated mainly on increasing the power of the…

  6. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e., a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations, but also security issues are important. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in Software Engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal "cloning" of smart cards in D2 GSM mobile phones, or the extraction of (secret) passwords from German T-Online users, show that serious flaws can also happen in this area. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to truly safe and secure systems. In this paper, we examine to what extent automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  7. Development and Evaluation of a Low Fertility Ontology for Analyzing Social Data in Korea.

    PubMed

    Lee, Ji-Hyun; Park, Hyeoun-Ae; Song, Tae-Min

    2016-01-01

    The purpose of this study is to develop a low fertility ontology for collecting and analyzing social data. A low fertility ontology was developed according to Ontology Development 101 and formally represented using Protégé. The content coverage of the ontology was evaluated using 1,387 narratives posted by the public and 63 narratives posted by public servants. Six super-classes of the ontology were developed based on Bronfenbrenner's ecological system theory, with an individual at the center and environmental systems surrounding and impacting the individual. In total, 568 unique concepts were extracted from the narratives. Of these, 424 (74.6%) were lexically or semantically mapped to ontology concepts and 67 (11.8%) were either broadly or narrowly mapped; the remaining 77 (13.6%) were not mapped to any ontology concept. This ontology can be used as a framework to understand low fertility problems using social data in Korea.

  8. (abstract) Formal Inspection Technology Transfer Program

    NASA Technical Reports Server (NTRS)

    Welz, Linda A.; Kelly, John C.

    1993-01-01

    A Formal Inspection Technology Transfer Program, based on the inspection process developed by Michael Fagan at IBM, has been developed at JPL. The goal of this program is to support organizations wishing to use Formal Inspections to improve the quality of software and system level engineering products. The Technology Transfer Program provides start-up materials and assistance to help organizations establish their own Formal Inspection program. The course materials and certified instructors associated with the Technology Transfer Program have proven to be effective in classes taught at other NASA centers as well as at JPL. Formal Inspections (NASA tailored Fagan Inspections) are a set of technical reviews whose objective is to increase quality and reduce the cost of software development by detecting and correcting errors early. A primary feature of inspections is the removal of engineering errors before they amplify into larger and more costly problems downstream in the development process. Note that the word 'inspection' is used differently in software than in a manufacturing context. A Formal Inspection is a front-end quality enhancement technique, rather than a task conducted just prior to product shipment for the purpose of sorting defective systems (manufacturing usage). Formal Inspections are supporting and in agreement with the 'total quality' approach being adopted by many NASA centers.

  9. Bio-Inspired Genetic Algorithms with Formalized Crossover Operators for Robotic Applications.

    PubMed

    Zhang, Jie; Kang, Man; Li, Xiaojuan; Liu, Geng-Yang

    2017-01-01

    Genetic algorithms are widely adopted to solve optimization problems in robotic applications. In such safety-critical systems, it is vitally important to formally prove correctness when genetic algorithms are applied. This paper focuses on the formal modeling of crossover operations, which are among the most important operations in genetic algorithms. Specifically, we formalize crossover operations, for the first time, with higher-order logic based on HOL4, which is easy to deploy thanks to its user-friendly programming environment. With correctness-guaranteed formalized crossover operations, we can safely apply them in robotic applications. We implement our technique to solve a path planning problem using a genetic algorithm with our formalized crossover operations, and the results show the effectiveness of our technique.
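
    For readers unfamiliar with the operation being formalized, here is a conventional single-point crossover in plain Python; this is a generic textbook operator and illustrative data, not the formalized HOL4 definition from the paper.

    ```python
    import random

    def single_point_crossover(parent_a, parent_b, rng=random):
        """Swap the tails of two equal-length chromosomes at a random cut point."""
        assert len(parent_a) == len(parent_b)
        cut = rng.randrange(1, len(parent_a))      # cut strictly inside the chromosome
        child_a = parent_a[:cut] + parent_b[cut:]
        child_b = parent_b[:cut] + parent_a[cut:]
        return child_a, child_b

    # Hypothetical bit-string chromosomes encoding waypoints of a robot path
    a, b = single_point_crossover([0, 1, 1, 0, 1], [1, 0, 0, 1, 0])
    print(a, b)
    ```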

  10. The Creative Power of Formal Analogies in Physics: The Case of Albert Einstein

    ERIC Educational Resources Information Center

    Gingras, Yves

    2015-01-01

    In order to show how formal analogies between different physical systems play an important conceptual work in physics, this paper analyzes the evolution of Einstein's thoughts on the structure of radiation from the point of view of the formal analogies he used as "lenses" to "see" through the "black box" of Planck's…

  11. Cost-Benefit Analysis of U.S. Copyright Formalities. Final Report.

    ERIC Educational Resources Information Center

    King Research, Inc., Rockville, MD.

    This study of the feasibility of conducting a cost-benefit analysis in the complex environment of the formalities used in the United States as part of its administration of the copyright law focused on the formalities of copyright notice, deposit, registration, and recordation. The U.S. system is also compared with the less centralized copyright…

  12. Can Non-Formal Education Keep Working Children in School? A Case Study from Punjab, India

    ERIC Educational Resources Information Center

    Sud, Pamela

    2010-01-01

    This paper analyses the effectiveness of non-formal schools for working children in Jalandhar, Punjab, India, in mainstreaming child labourers into the formal education system through incentivised, informal schooling. Using a family fixed effects model and sibling data as an equivalent population comparison group, I find that the non-formal…

  13. A systematic approach to embedded biomedical decision making.

    PubMed

    Song, Zhe; Ji, Zhongkai; Ma, Jian-Guo; Sputh, Bernhard; Acharya, U Rajendra; Faust, Oliver

    2012-11-01

    Embedded decision making is a key feature of many biomedical systems. In most cases human life directly depends on correct decisions made by these systems; therefore they have to work reliably. This paper describes how we applied systems engineering principles to design a high-performance embedded classification system in a systematic and well-structured way. We introduce the structured design approach by discussing requirements capturing, specification refinement, implementation and testing. Thereby, we follow systems engineering principles and execute each of these processes as formally as possible. The requirements, which motivate the system design, describe an automated decision making system for diagnostic support. These requirements are refined into the implementation of a support vector machine (SVM) algorithm, which enables us to integrate automated decision making in embedded systems. With a formal model we establish the functionality, stability and reliability of the system. Furthermore, we investigated different parallel processing configurations of this computationally complex algorithm. We found that, by adding SVM processes, an almost linear speedup is possible. Once we had established these system properties, we translated the formal model into an implementation. The resulting implementation was tested using XMOS processors with both normal and failure cases, to build up trust in the implementation. Finally, we demonstrated that our parallel implementation achieves the speedup predicted by the formal model. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
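
    The parallel-SVM idea can be sketched generically as a data-parallel split of the support-vector sum, as below; the kernel choice, parameter names and random data are assumptions for illustration, not the XMOS implementation described in the paper.

    ```python
    from concurrent.futures import ProcessPoolExecutor
    import numpy as np

    def partial_decision(args):
        """Partial SVM decision value over a slice of support vectors (RBF kernel)."""
        sv, alpha_y, x, gamma = args
        k = np.exp(-gamma * np.sum((sv - x) ** 2, axis=1))
        return float(alpha_y @ k)

    def svm_decision(support_vectors, alpha_y, bias, x, gamma=0.5, workers=4):
        """Split the support-vector sum across processes; partial sums add up linearly."""
        chunks_sv = np.array_split(support_vectors, workers)
        chunks_ay = np.array_split(alpha_y, workers)
        jobs = [(sv, ay, x, gamma) for sv, ay in zip(chunks_sv, chunks_ay)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            parts = pool.map(partial_decision, jobs)
        return sum(parts) + bias

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        sv = rng.normal(size=(1000, 8))   # hypothetical support vectors
        ay = rng.normal(size=1000)        # alpha_i * y_i coefficients
        print(svm_decision(sv, ay, bias=0.1, x=rng.normal(size=8)))
    ```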

  14. The explosion at institute: modeling and analyzing the situation awareness factor.

    PubMed

    Naderpour, Mohsen; Lu, Jie; Zhang, Guangquan

    2014-12-01

    In 2008 a runaway chemical reaction caused an explosion at a methomyl unit in West Virginia, USA, killing two employees, injuring eight people, forcing the evacuation of more than 40,000 residents adjacent to the facility, disrupting traffic on a nearby highway and causing significant business loss and interruption. Although the accident was formally investigated, the role of the situation awareness (SA) factor, i.e., a correct understanding of the situation, and appropriate models to maintain SA, remains unexplained. This paper extracts details of abnormal situations within the methomyl unit and models them into a situational network using dynamic Bayesian networks. A fuzzy logic system is used to emulate the operator's thinking when confronted with these abnormal situations. The combined situational network and fuzzy logic system make it possible for the operator to assess such situations dynamically to achieve accurate SA. The findings show that the proposed structure provides a useful graphical model that facilitates the inclusion of prior background knowledge and the updating of this knowledge when new information is available from monitoring systems. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Untangling the proximate causes and underlying drivers of deforestation and forest degradation in Myanmar.

    PubMed

    Lim, Cheng Ling; Prescott, Graham W; De Alban, Jose Don T; Ziegler, Alan D; Webb, Edward L

    2017-12-01

    Political transitions often trigger substantial environmental changes. In particular, deforestation can result from the complex interplay among the components of a system-actors, institutions, and existing policies-adapting to new opportunities. A dynamic conceptual map of system components is particularly useful for systems in which multiple actors, each with different worldviews and motivations, may be simultaneously trying to alter different facets of the system, unaware of the impacts on other components. In Myanmar, a global biodiversity hotspot with the largest forest area in mainland Southeast Asia, ongoing political and economic reforms are likely to change the dynamics of deforestation drivers. A fundamental conceptual map of these dynamics is therefore a prerequisite for interventions to reduce deforestation. We used a system-dynamics approach and causal-network analysis to determine the proximate causes and underlying drivers of forest loss and degradation in Myanmar from 1995 to 2016 and to articulate the linkages among them. Proximate causes included infrastructure development, timber extraction, and agricultural expansion. These were stimulated primarily by formal agricultural, logging, mining, and hydropower concessions and economic investment and social issues relating to civil war and land tenure. Reform of land laws, the link between natural resource extraction and civil war, and the allocation of agricultural concessions will influence the extent of future forest loss and degradation in Myanmar. The causal-network analysis identified priority areas for policy interventions, for example, creating a public registry of land-concession holders to deter corruption in concession allocation. We recommend application of this analytical approach to other countries, particularly those undergoing political transition, to inform policy interventions to reduce forest loss and degradation. © 2017 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.

  16. Detection of Anomalous Insiders in Collaborative Environments via Relational Analysis of Access Logs

    PubMed Central

    Chen, You; Malin, Bradley

    2014-01-01

    Collaborative information systems (CIS) are deployed within a diverse array of environments, ranging from the Internet to intelligence agencies to healthcare. It is increasingly the case that such systems are applied to manage sensitive information, making them targets for malicious insiders. While sophisticated security mechanisms have been developed to detect insider threats in various file systems, they are neither designed to model nor to monitor collaborative environments in which users function in dynamic teams with complex behavior. In this paper, we introduce a community-based anomaly detection system (CADS), an unsupervised learning framework to detect insider threats based on information recorded in the access logs of collaborative environments. CADS is based on the observation that typical users tend to form community structures, such that users with low affinity to such communities are indicative of anomalous and potentially illicit behavior. The model consists of two primary components: relational pattern extraction and anomaly detection. For relational pattern extraction, CADS infers community structures from CIS access logs, and subsequently derives communities, which serve as the CADS pattern core. CADS then uses a formal statistical model to measure the deviation of users from the inferred communities to predict which users are anomalies. To empirically evaluate the threat detection model, we perform an analysis with six months of access logs from a real electronic health record system in a large medical center, as well as a publicly-available dataset for replication purposes. The results illustrate that CADS can distinguish simulated anomalous users in the context of real user behavior with a high degree of certainty and with significant performance gains in comparison to several competing anomaly detection models. PMID:25485309
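
    A very small sketch of the community-deviation idea follows; the similarity measure, the neighborhood size and the toy access matrix are generic assumptions, not the statistical model used by CADS.

    ```python
    import numpy as np

    # Rows: users, columns: records; 1 = the user accessed that record (toy data)
    access = np.array([
        [1, 1, 1, 0, 0],
        [1, 1, 0, 0, 0],
        [0, 1, 1, 0, 0],
        [0, 0, 0, 1, 1],   # this user touches records nobody else touches
    ])

    def cosine_sim(a, b):
        """Cosine similarity between two users' access vectors."""
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    def deviation(i, k=2):
        """1 minus the mean similarity to the k most similar other users."""
        sims = sorted((cosine_sim(access[i], access[j])
                       for j in range(len(access)) if j != i), reverse=True)
        return 1.0 - float(np.mean(sims[:k]))

    for user in range(len(access)):
        print(user, round(deviation(user), 3))   # the last user gets the highest score
    ```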

  17. Lagrange multiplier and Wess-Zumino variable as extra dimensions in the torus universe

    NASA Astrophysics Data System (ADS)

    Nejad, Salman Abarghouei; Dehghani, Mehdi; Monemzadeh, Majid

    2018-01-01

    We study the effect of the simplest geometry imposed via the topology of the universe by gauging a non-relativistic particle model on the torus and 3-torus with the help of the symplectic formalism of constrained systems. We also obtain the generators of gauge transformations for the gauged models. Extracting the corresponding Poisson structure of the existing constraints, we show the effect of the shape of the universe on the canonical structure of the phase spaces of the models and suggest some phenomenology to prove the topology of the universe and a probable non-commutative structure of space. In addition, we show that the number of extra dimensions in the phase spaces of the gauged embedded models is exactly two. Moreover, in classical form, we discuss the modification of Newton's second law in order to study the origin of the terms appearing in the gauged theory.

  18. Spectral properties from Matsubara Green's function approach: Application to molecules

    NASA Astrophysics Data System (ADS)

    Schüler, M.; Pavlyukh, Y.

    2018-03-01

    We present results for many-body perturbation theory for the one-body Green's function at finite temperatures using the Matsubara formalism. Our method relies on the accurate representation of the single-particle states in standard Gaussian basis sets, allowing us to efficiently compute, among other observables, quasiparticle energies and Dyson orbitals of atoms and molecules. In particular, we challenge the second-order treatment of the Coulomb interaction by benchmarking its accuracy for a well-established test set of small molecules, which also includes systems where the usual Hartree-Fock treatment encounters difficulties. We discuss different schemes for extracting quasiparticle properties and assess their range of applicability. With an accurate solution and compact representation, our method is an ideal starting point to study electron dynamics in time-resolved experiments by the propagation of the Kadanoff-Baym equations.
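
    For reference, the central objects of the Matsubara formalism mentioned above are the imaginary-time Green's function and its frequency representation; these are standard textbook definitions, and the paper's conventions may differ in sign or normalization.

    $$ G_{ij}(\tau) = -\big\langle T_\tau\, c_i(\tau)\, c_j^{\dagger}(0) \big\rangle, \qquad G_{ij}(i\omega_n) = \int_0^{\beta} d\tau\, e^{i\omega_n\tau}\, G_{ij}(\tau), \qquad \omega_n = \frac{(2n+1)\pi}{\beta}\ \text{(fermions)}. $$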

  19. Active Thermal Extraction and Temperature Sensing of Near-field Thermal Radiation

    DOE PAGES

    Ding, D.; Kim, T.; Minnich, A. J.

    2016-09-06

    Recently, we proposed an active thermal extraction (ATX) scheme that enables thermally populated surface phonon polaritons to escape into the far field. The concept is based on a fluorescence upconversion process that also occurs in laser cooling of solids (LCS). Here, we present a generalized analysis of our scheme using the theoretical framework for LCS. We show that both LCS and ATX can be described with the same mathematical formalism by replacing the electron-phonon coupling parameter in LCS with the electron-photon coupling parameter in ATX. Using this framework, we compare the ideal efficiency and power extracted for the two schemes and examine the parasitic loss mechanisms. As a result, this work advances the application of ATX to manipulate near-field thermal radiation for applications such as temperature sensing and active radiative cooling.

  20. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there is no standardized and accepted means of formal property definition to serve as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.

  1. Third NASA Langley Formal Methods Workshop

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael (Compiler)

    1995-01-01

    This publication constitutes the proceedings of NASA Langley Research Center's third workshop on the application of formal methods to the design and verification of life-critical systems. This workshop brought together formal methods researchers, industry engineers, and academicians to discuss the potential of NASA-sponsored formal methods and to investigate new opportunities for applying these methods to industry problems. Contained herein are copies of the material presented at the workshop, summaries of many of the presentations, a complete list of attendees, and a detailed summary of the Langley formal methods program. Much of this material is available electronically through the World-Wide Web via the following URL.

  2. Exact solutions for kinetic models of macromolecular dynamics.

    PubMed

    Chemla, Yann R; Moffitt, Jeffrey R; Bustamante, Carlos

    2008-05-15

    Dynamic biological processes such as enzyme catalysis, molecular motor translocation, and protein and nucleic acid conformational dynamics are inherently stochastic processes. However, when such processes are studied on a nonsynchronized ensemble, the inherent fluctuations are lost, and only the average rate of the process can be measured. With the recent development of methods of single-molecule manipulation and detection, it is now possible to follow the progress of an individual molecule, measuring not just the average rate but the fluctuations in this rate as well. These fluctuations can provide a great deal of detail about the underlying kinetic cycle that governs the dynamical behavior of the system. However, extracting this information from experiments requires the ability to calculate the general properties of arbitrarily complex theoretical kinetic schemes. We present here a general technique that determines the exact analytical solution for the mean velocity and for measures of the fluctuations. We adopt a formalism based on the master equation and show how the probability density for the position of a molecular motor at a given time can be solved exactly in Fourier-Laplace space. With this analytic solution, we can then calculate the mean velocity and fluctuation-related parameters, such as the randomness parameter (a dimensionless ratio of the diffusion constant and the velocity) and the dwell time distributions, both commonly used kinetic parameters in single-molecule measurements, which fully characterize the fluctuations of the system. Furthermore, we show that this formalism allows calculation of these parameters for a much wider class of general kinetic models than demonstrated with previous methods.
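
    As a worked note on the fluctuation measures named in this abstract, the commonly used single-molecule definitions can be written compactly; the paper's derivation is more general, so the following is only the standard textbook form for a renewal (stepping) process.

        % Randomness parameter in terms of diffusion constant D, velocity v and step size d,
        % or equivalently in terms of dwell-time moments (valid for a renewal stepping process)
        r = \frac{2D}{v\,d}
          = \frac{\langle \tau^{2} \rangle - \langle \tau \rangle^{2}}{\langle \tau \rangle^{2}},
        \qquad
        v = \frac{d}{\langle \tau \rangle} .
        % For a cycle of N identical, irreversible rate-limiting steps r = 1/N, so the
        % measured randomness bounds the number of hidden kinetic states: N_min >= 1/r.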

  3. 3-D linear inversion of gravity data: method and application to Basse-Terre volcanic island, Guadeloupe, Lesser Antilles

    NASA Astrophysics Data System (ADS)

    Barnoud, Anne; Coutant, Olivier; Bouligand, Claire; Gunawan, Hendra; Deroussi, Sébastien

    2016-04-01

    We use a Bayesian formalism combined with a grid node discretization for the linear inversion of gravimetric data in terms of 3-D density distribution. The forward modelling and the inversion method are derived from seismological inversion techniques in order to facilitate joint inversion or interpretation of density and seismic velocity models. The Bayesian formulation introduces covariance matrices on model parameters to regularize the ill-posed problem and reduce the non-uniqueness of the solution. This formalism favours smooth solutions and allows us to specify a spatial correlation length and to perform inversions at multiple scales. We also extract resolution parameters from the resolution matrix to discuss how well our density models are resolved. This method is applied to the inversion of data from the volcanic island of Basse-Terre in Guadeloupe, Lesser Antilles. A series of synthetic tests are performed to investigate advantages and limitations of the methodology in this context. This study results in the first 3-D density models of the island of Basse-Terre for which we identify: (i) a southward decrease of densities parallel to the migration of volcanic activity within the island, (ii) three dense anomalies beneath Petite Plaine Valley, Beaugendre Valley and the Grande-Découverte-Carmichaël-Soufrière Complex that may reflect the trace of former major volcanic feeding systems, (iii) shallow low-density anomalies in the southern part of Basse-Terre, especially around La Soufrière active volcano, Piton de Bouillante edifice and along the western coast, reflecting the presence of hydrothermal systems and fractured and altered rocks.
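
    For orientation, the linear Bayesian least-squares machinery underlying this kind of inversion is standard; the following is a generic sketch of that formulation and may differ in detail from the exact parametrization used by the authors.

        % Linear Bayesian least-squares estimate (Tarantola-Valette form) for data d,
        % forward operator G, prior model m_prior, data covariance C_d, model covariance C_m
        \tilde{m} = m_{\mathrm{prior}} + C_m G^{T} \left( G C_m G^{T} + C_d \right)^{-1} \left( d - G\, m_{\mathrm{prior}} \right),
        % with a spatial model covariance carrying a correlation length \lambda, e.g.
        C_m(i,j) = \sigma_m^{2} \, \exp\!\left( - \frac{\lVert r_i - r_j \rVert}{\lambda} \right),
        % and a resolution matrix relating estimated to true model parameters
        R = C_m G^{T} \left( G C_m G^{T} + C_d \right)^{-1} G .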

  4. Formal Verification of Complex Systems based on SysML Functional Requirements

    DTIC Science & Technology

    2014-12-23

    Formal Verification of Complex Systems based on SysML Functional Requirements. Hoda Mehrpouyan, Irem Y. Tumer, Chris Hoyle, Dimitra Giannakopoulou ... requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements ... methods and tools to support the integration of safety into the design solution. 2.1. SysML for Complex Engineered Systems. Traditional methods and tools

  5. Links Between the Intuitive Sense of Number and Formal Mathematics Ability.

    PubMed

    Feigenson, Lisa; Libertus, Melissa E; Halberda, Justin

    2013-06-01

    Humans share with other animals a system for thinking about numbers in an imprecise and intuitive way. The Approximate Number System (ANS) that underlies this thinking is present throughout the lifespan, is entirely nonverbal, and supports basic numerical computations like comparing, adding, and subtracting quantities. Humans, unlike other animals, also have a system for representing exact numbers. This linguistically mediated system is slowly mastered over the course of many years and provides the basis for most of our formal mathematical thought. A growing body of evidence suggests that the nonverbal ANS and the culturally invented system of exact numbers are fundamentally linked. In this article, we review evidence for this relationship, describing how group and individual differences in the ANS correlate with and even predict formal math ability. In this way, we illustrate how a system of ancient core knowledge may serve as a foundation for more complex mathematical thought.

  6. DRS: Derivational Reasoning System

    NASA Technical Reports Server (NTRS)

    Bose, Bhaskar

    1995-01-01

    The high reliability requirements for airborne systems require fault-tolerant architectures to address failures in the presence of physical faults, and the elimination of design flaws during the specification and validation phase of the design cycle. Although much progress has been made in developing methods to address physical faults, design flaws remain a serious problem. Formal methods provide a mathematical basis for removing design flaws from digital systems. DRS (Derivational Reasoning System) is a formal design tool based on advanced research in mathematical modeling and formal synthesis. The system implements a basic design algebra for synthesizing digital circuit descriptions from high level functional specifications. DRS incorporates an executable specification language, a set of correctness preserving transformations, a verification interface, and a logic synthesis interface, making it a powerful tool for realizing hardware from abstract specifications. DRS integrates recent advances in transformational reasoning, automated theorem proving and high-level CAD synthesis systems in order to provide enhanced reliability in designs with reduced time and cost.

  7. Crisis crowdsourcing framework: designing strategic configurations of crowdsourcing for the emergency management domain

    USGS Publications Warehouse

    Liu, Sophia B.

    2014-01-01

    Crowdsourcing is not a new practice but it is a concept that has gained significant attention during recent disasters. Drawing from previous work in the crisis informatics, disaster sociology, and computer-supported cooperative work (CSCW) literature, the paper first explains recent conceptualizations of crowdsourcing and how crowdsourcing is a way of leveraging disaster convergence. The CSCW concept of “articulation work” is introduced as an interpretive frame for extracting the salient dimensions of “crisis crowdsourcing.” Then, a series of vignettes are presented to illustrate the evolution of crisis crowdsourcing that spontaneously emerged after the 2010 Haiti earthquake and evolved to more established forms of public engagement during crises. The best practices extracted from the vignettes clarified the efforts to formalize crisis crowdsourcing through the development of innovative interfaces designed to support the articulation work needed to facilitate spontaneous volunteer efforts. Extracting these best practices led to the development of a conceptual framework that unpacks the key dimensions of crisis crowdsourcing. The Crisis Crowdsourcing Framework is a systematic, problem-driven approach to determining the why, who, what, when, where, and how aspects of a crowdsourcing system. The framework also draws attention to the social, technological, organizational, and policy (STOP) interfaces that need to be designed to manage the articulation work involved with reducing the complexity of coordinating across these key dimensions. An example of how to apply the framework to design a crowdsourcing system is offered, with a discussion of the implications of applying this framework as well as its limitations. Innovation is occurring at the social, technological, organizational, and policy interfaces enabling crowdsourcing to be operationalized and integrated into official products and services.

  8. Towards automatic musical instrument timbre recognition

    NASA Astrophysics Data System (ADS)

    Park, Tae Hong

    This dissertation comprises two parts: the first focuses on issues concerning research and development of an artificial system for automatic musical instrument timbre recognition, and the second presents musical compositions. The technical part of the essay includes a detailed record of developed and implemented algorithms for feature extraction and pattern recognition. A review of existing literature introducing historical aspects surrounding timbre research, problems associated with a number of timbre definitions, and highlights of selected research activities that have had significant impact in this field are also included. The developed timbre recognition system follows a bottom-up, data-driven model that includes a pre-processing module, feature extraction module, and a RBF/EBF (Radial/Elliptical Basis Function) neural network-based pattern recognition module. 829 monophonic samples from 12 instruments have been chosen from the Peter Siedlaczek library (Best Service) and other samples from the Internet and personal collections. Significant emphasis has been put on feature extraction development and testing to achieve robust and consistent feature vectors that are eventually passed to the neural network module. In order to avoid a garbage-in-garbage-out (GIGO) trap and improve generality, extra care was taken in designing and testing the developed algorithms using various dynamics, different playing techniques, and a variety of pitches for each instrument with inclusion of attack and steady-state portions of a signal. Most of the research and development was conducted in Matlab. The compositional part of the essay includes brief introductions to "A d'Ess Are," "Aboji," "48 13 N, 16 20 O," and "pH-SQ." A general outline pertaining to the ideas and concepts behind the architectural designs of the pieces, including formal structures, time structures, orchestration methods, and pitch structures, is also presented.
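
    To make the general pipeline described above concrete (feature extraction followed by RBF-based classification), a minimal sketch in Python is given below. It is only an illustration under simplified assumptions: a single hand-rolled spectral feature, Gaussian basis functions with a fixed width, and a least-squares readout. It is not the dissertation's Matlab implementation.

        import numpy as np

        def spectral_centroid(frame, sr):
            """One simple timbre feature: amplitude-weighted mean frequency of a frame."""
            spectrum = np.abs(np.fft.rfft(frame))
            freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
            return np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)

        class RBFClassifier:
            """Radial basis function network with fixed centers and a linear readout."""

            def __init__(self, centers, width=1.0):
                self.centers = np.asarray(centers, dtype=float)
                self.width = width
                self.weights = None

            def _phi(self, X):
                # Gaussian activations of each sample with respect to each center.
                d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(axis=-1)
                return np.exp(-d2 / (2.0 * self.width ** 2))

            def fit(self, X, onehot_labels):
                # Solve the linear readout by least squares.
                self.weights, *_ = np.linalg.lstsq(self._phi(X), onehot_labels, rcond=None)

            def predict(self, X):
                return np.argmax(self._phi(X) @ self.weights, axis=1)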

  9. Formal Verification of the AAMP-FV Microcode

    NASA Technical Reports Server (NTRS)

    Miller, Steven P.; Greve, David A.; Wilding, Matthew M.; Srivas, Mandayam

    1999-01-01

    This report describes the experiences of Collins Avionics & Communications and SRI International in formally specifying and verifying the microcode in a Rockwell proprietary microprocessor, the AAMP-FV, using the PVS verification system. This project built extensively on earlier experiences using PVS to verify the microcode in the AAMP5, a complex, pipelined microprocessor designed for use in avionics displays and global positioning systems. While the AAMP5 experiment demonstrated the technical feasibility of formal verification of microcode, the steep learning curve encountered left unanswered the question of whether it could be performed at reasonable cost. The AAMP-FV project was conducted to determine whether the experience gained on the AAMP5 project could be used to make formal verification of microcode cost effective for safety-critical and high volume devices.

  10. Verification of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    The emergent properties of swarms make swarm-based missions powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study for swarm-based missions to experiment and test current formal methods with intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions, and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.

  11. Using Formal Methods to Assist in the Requirements Analysis of the Space Shuttle GPS Change Request

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Roberts, Larry W.

    1996-01-01

    We describe a recent NASA-sponsored pilot project intended to gauge the effectiveness of using formal methods in Space Shuttle software requirements analysis. Several Change Requests (CR's) were selected as promising targets to demonstrate the utility of formal methods in this application domain. A CR to add new navigation capabilities to the Shuttle, based on Global Positioning System (GPS) technology, is the focus of this report. Carried out in parallel with the Shuttle program's conventional requirements analysis process was a limited form of analysis based on formalized requirements. Portions of the GPS CR were modeled using the language of SRI's Prototype Verification System (PVS). During the formal methods-based analysis, numerous requirements issues were discovered and submitted as official issues through the normal requirements inspection process. Shuttle analysts felt that many of these issues were uncovered earlier than would have occurred with conventional methods. We present a summary of these encouraging results and conclusions we have drawn from the pilot project.

  12. Grievance and Redress: Chicano Access to the Criminal Justice System.

    ERIC Educational Resources Information Center

    Geilhufe, Nancy L.

    Focusing on the processes involved in making formal bureaucracies responsive, the study examined: the structure of formal grievance and redress procedures within the criminal justice system in San Jose, California; and the informal strategies used by politically active members of the Chicano community to extend and strengthen these channels. The…

  13. Evolution and Revolution of Adult Learning: Exposition of Open and Distance Learning in Nigeria

    ERIC Educational Resources Information Center

    Umezulike, Nneka A.

    2015-01-01

    The educational system has witnessed a number of laudable programs since inception in both formal and non-formal systems of education, programs that were set up to empower adults with educational skills, knowledge, and decision-making processes. Correspondence education transformed into distance education which--with the advent of information and…

  14. 48 CFR 1552.214-71 - Contract award-other factors-formal advertising.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Contract award-other factors-formal advertising. 1552.214-71 Section 1552.214-71 Federal Acquisition Regulations System ENVIRONMENTAL PROTECTION AGENCY CLAUSES AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and Clauses 1552.214-71 Contract...

  15. Governing Knowledge: The Formalization Dilemma in the Governance of the Public Sciences

    ERIC Educational Resources Information Center

    Woelert, Peter

    2015-01-01

    This paper offers a conceptually novel contribution to the understanding of the distinctive governance challenges arising from the increasing reliance on formalized knowledge in the governance of research activities. It uses the current Australian research governance system as an example--a system which exhibits a comparatively strong degree of…

  16. How Teachers Learn: The Roles of Formal, Informal, and Independent Learning

    ERIC Educational Resources Information Center

    Jones, W. Monty; Dexter, Sara

    2014-01-01

    A qualitative study of math and science teachers at two middle schools identifies how their system for learning to integrate technology into their teaching goes beyond what school leaders typically consider when planning for teachers' learning. In addition to (a) the district-initiated, or formal, system of professional development (PD) and…

  17. An Exploration of the Formal Agricultural Education System in Trinidad and Tobago

    ERIC Educational Resources Information Center

    Hurst, Sara D.; Conner, Nathan W.; Stripling, Christopher T.; Blythe, Jessica; Giorgi, Aaron; Rubenstein, Eric D.; Futrell, Angel; Jenkins, Jenny; Roberts, T. Grady

    2015-01-01

    A team of nine researchers from the United States spent 10 days exploring the formal agricultural education system in Trinidad and Tobago from primary education through postgraduate education. Data were collected from interviews and observations from students, teachers/instructors, and agricultural producers. The team concluded that (a) the people…

  18. Well-Being and Support Systems of Taiwanese Mothers of Young Children with Developmental Disabilities

    ERIC Educational Resources Information Center

    Ho, Tzu-Hua

    2013-01-01

    This study investigated the influences of children's adaptive skills, problem behaviors, and parent support systems (informal support and formal professional support) on maternal well-being (health and stress) in Taiwanese mothers of young children with developmental disabilities. The study examined the moderating effects of formal support and…

  19. Towards the Formal Verification of a Distributed Real-Time Automotive System

    NASA Technical Reports Server (NTRS)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building, and formally and pervasively verifying, a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  20. From a Proven Correct Microkernel to Trustworthy Large Systems

    NASA Astrophysics Data System (ADS)

    Andronick, June

    The seL4 microkernel was the world's first general-purpose operating system kernel with a formal, machine-checked proof of correctness. The next big step in the challenge of building truly trustworthy systems is to provide a framework for developing secure systems on top of seL4. This paper first gives an overview of seL4's correctness proof, together with its main implications and assumptions, and then describes our approach to provide formal security guarantees for large, complex systems.

  1. Effective Learning in Non-Formal Education. Program of Studies in Non-Formal Education. Team Reports.

    ERIC Educational Resources Information Center

    Ward, Ted W.; Herzog, William A., Jr.

    This document is part of a series dealing with nonformal education. Introductory information is included in document SO 008 058. The focus of this report is on the learning effectiveness of nonformal education. Chapter 1 compares effective learning in a formal and nonformal environment. Chapter 2 develops a systems model for designers of learning…

  2. A Non-Formal Student Laboratory as a Place for Innovation in Education for Sustainability for All Students

    ERIC Educational Resources Information Center

    Affeldt, Fiona; Weitz, Katharina; Siol, Antje; Markic, Silvija; Eilks, Ingo

    2015-01-01

    In many Western countries, non-formal education has become increasingly recognized as a valuable addition to the traditional educational system. In recent years, a special form of non-formal student laboratories (Schülerlabor) has emerged in Germany to promote primary and secondary practical science learning. This paper describes a developmental…

  3. Toward a mathematical formalism of performance, task difficulty, and activation

    NASA Technical Reports Server (NTRS)

    Samaras, George M.

    1988-01-01

    The rudiments of a mathematical formalism for handling operational, physiological, and psychological concepts are developed for use by the man-machine system design engineer. The formalism provides a framework for developing a structured, systematic approach to the interface design problem, using existing mathematical tools, and simplifying the problem of telling a machine how to measure and use performance.

  4. Applications of Formal Methods to Specification and Safety of Avionics Software

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Guaspari, David; Humenn, Polar

    1996-01-01

    This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.
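
    As a small illustration of the decision-table idea, and of the kind of completeness and overlap analysis that underlies test coverage criteria for tables, here is a minimal sketch in Python. The conditions, actions, and helper names are hypothetical and purely illustrative; this is not ORA's Tablewise tool or its notation.

        from itertools import product

        # A toy decision table: each rule maps condition outcomes to an action.
        # "*" marks a don't-care entry.  All names here are hypothetical.
        CONDITIONS = [("altitude_low", (True, False)), ("gear_down", (True, False))]
        RULES = [
            ({"altitude_low": True,  "gear_down": False}, "WARN"),
            ({"altitude_low": True,  "gear_down": True},  "OK"),
            ({"altitude_low": False, "gear_down": "*"},   "OK"),
        ]

        def matches(rule_conditions, case):
            return all(v == "*" or case[k] == v for k, v in rule_conditions.items())

        def check_table(conditions, rules):
            """Flag condition combinations covered by no rule or by several rules."""
            names = [name for name, _ in conditions]
            for values in product(*[outcomes for _, outcomes in conditions]):
                case = dict(zip(names, values))
                actions = [action for conds, action in rules if matches(conds, case)]
                if len(actions) != 1:
                    print(case, "->", actions or "UNCOVERED")

        check_table(CONDITIONS, RULES)   # no output means complete and non-overlapping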

  5. Developmental trauma disorder: pros and cons of including formal criteria in the psychiatric diagnostic systems

    PubMed Central

    2013-01-01

    Background This article reviews the current debate on developmental trauma disorder (DTD) with respect to formalizing its diagnostic criteria. Victims of abuse, neglect, and maltreatment in childhood often develop a wide range of age-dependent psychopathologies with various mental comorbidities. The supporters of a formal DTD diagnosis argue that post-traumatic stress disorder (PTSD) does not cover all consequences of severe and complex traumatization in childhood. Discussion Traumatized individuals are difficult to treat, but clinical experience has shown that they tend to benefit from specific trauma therapy. A main argument against inclusion of formal DTD criteria into existing diagnostic systems is that emphasis on the etiology of the disorder might force current diagnostic systems to deviate from their purely descriptive nature. Furthermore, comorbidities and biological aspects of the disorder may be underdiagnosed using the DTD criteria. Summary Here, we discuss arguments for and against the proposal of DTD criteria and address implications and consequences for the clinical practice. PMID:23286319

  6. Unmanned Aircraft Systems in the National Airspace System: A Formal Methods Perspective

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Dutle, Aaron; Narkawicz, Anthony; Upchurch, Jason

    2016-01-01

    As the technological and operational capabilities of unmanned aircraft systems (UAS) have grown, so too have international efforts to integrate UAS into civil airspace. However, one of the major concerns that must be addressed in realizing this integration is that of safety. For example, UAS lack an on-board pilot to comply with the legal requirement that pilots see and avoid other aircraft. This requirement has motivated the development of a detect and avoid (DAA) capability for UAS that provides situational awareness and maneuver guidance to UAS operators to aid them in avoiding and remaining well clear of other aircraft in the airspace. The NASA Langley Research Center Formal Methods group has played a fundamental role in the development of this capability. This article gives a selected survey of the formal methods work conducted in support of the development of a DAA concept for UAS. This work includes specification of low-level and high-level functional requirements, formal verification of algorithms, and rigorous validation of software implementations.

  7. Is Life Unique?

    PubMed Central

    Abel, David L.

    2011-01-01

    Is life physicochemically unique? No. Is life unique? Yes. Life manifests innumerable formalisms that cannot be generated or explained by physicodynamics alone. Life pursues thousands of biofunctional goals, not the least of which is staying alive. Neither physicodynamics nor evolution pursues goals. Life is largely directed by linear digital programming and by the Prescriptive Information (PI) instantiated particularly into physicodynamically indeterminate nucleotide sequencing. Epigenomic controls only compound the sophistication of these formalisms. Life employs representationalism through the use of symbol systems. Life manifests autonomy, homeostasis far from equilibrium in the harshest of environments, positive and negative feedback mechanisms, prevention and correction of its own errors, and organization of its components into Sustained Functional Systems (SFS). Chance and necessity—heat agitation and the cause-and-effect determinism of nature’s orderliness—cannot spawn formalisms such as mathematics, language, symbol systems, coding, decoding, logic, organization (not to be confused with mere self-ordering), integration of circuits, computational success, and the pursuit of functionality. All of these characteristics of life are formal, not physical. PMID:25382119

  8. The Formalism of Quantum Mechanics Specified by Covariance Properties

    NASA Astrophysics Data System (ADS)

    Nisticò, G.

    2009-03-01

    The known methods, due for instance to G.W. Mackey and T.F. Jordan, which exploit the transformation properties with respect to the Euclidean and Galileian group to determine the formalism of the Quantum Theory of a localizable particle, fail in the case that the considered transformations are not symmetries of the physical system. In the present work we show that the formalism of standard Quantum Mechanics for a particle without spin can be completely recovered by exploiting the covariance properties with respect to the group of Euclidean transformations, without requiring that these transformations are symmetries of the physical system.

  9. Review of Estelle and LOTOS with respect to critical computer applications

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    Man rated NASA space vehicles seem to represent a set of ultimate critical computer applications. These applications require a high degree of security, integrity, and safety. A variety of formal and/or precise modeling techniques are becoming available for the designer of critical systems. The design phase of the software engineering life cycle includes the modification of non-development components. A review of the Estelle and LOTOS formal description languages is presented. Details of the languages and a set of references are provided. The languages were used to formally describe some of the Open System Interconnect (OSI) protocols.

  10. Homeland security application of the Army Soft Target Exploitation and Fusion (STEF) system

    NASA Astrophysics Data System (ADS)

    Antony, Richard T.; Karakowski, Joseph A.

    2010-04-01

    A fusion system that accommodates both text-based extracted information along with more conventional sensor-derived input has been developed and demonstrated in a terrorist attack scenario as part of the Empire Challenge (EC) 09 Exercise. Although the fusion system was developed to support Army military analysts, the system, based on a set of foundational fusion principles, has direct applicability to department of homeland security (DHS) & defense, law enforcement, and other applications. Several novel fusion technologies and applications were demonstrated in EC09. One such technology is location normalization that accommodates both fuzzy semantic expressions such as behind Library A, across the street from the market place, as well as traditional spatial representations. Additionally, the fusion system provides a range of fusion products not supported by traditional fusion algorithms. Many of these additional capabilities have direct applicability to DHS. A formal test of the fusion system was performed during the EC09 exercise. The system demonstrated that it was able to (1) automatically form tracks, (2) help analysts visualize behavior of individuals over time, (3) link key individuals based on both explicit message-based information as well as discovered (fusion-derived) implicit relationships, and (4) suggest possible individuals of interest based on their association with High Value Individuals (HVI) and user-defined key locations.

  11. A Benchmark for Comparing Different Approaches for Specifying and Verifying Real-Time Systems

    DTIC Science & Technology

    1993-01-01

    To be considered correct or useful, real-time systems must deliver results within specified time intervals, either without exception or with high...probability. Recently, a large number of formal methods have been invented for specifying and verifying real-time systems. It has been suggested that...these formal methods need to be tested out on actual real-time systems. Such testing will allow the scalability of the methods to be assessed and also

  12. Formal Specifications for an Electrical Power Grid System Stability and Reliability

    DTIC Science & Technology

    2015-09-01

    ...expressed in this thesis are those of the author and do not reflect the official policy or position of the Department of Defense or the U.S. Government. ...analyze the power grid system requirements and express the critical runtime behavior using first-order logic. First, we identify observable...Verification System, and Type systems, to name a few [5]. Theorem proving's specification dimension is dependent on the expressive power of the formal

  13. Formal Verification at System Level

    NASA Astrophysics Data System (ADS)

    Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.

    2009-05-01

    System Level Analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results obtained on SysML-based system-level functional formal verification in an ESA/ESTEC study, carried out in collaboration between INTECS and La Sapienza University of Roma. The study focuses on SysML-based system-level functional requirements techniques.

  14. Symbolic LTL Compilation for Model Checking: Extended Abstract

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Vardi, Moshe Y.

    2007-01-01

    In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering and examine their effects on performance metrics including processing time and scalability. Safety critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety critical systems which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
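
    The reduction from satisfiability to model checking mentioned above can be stated in one line; the standard construction (checking the negated formula against a universal model) is summarized here for orientation.

        % A formula \varphi is satisfiable iff the universal model M_U (which admits every
        % possible computation over the formula's atomic propositions) has at least one
        % computation satisfying \varphi, i.e. iff M_U does not satisfy \neg\varphi.
        % A counterexample returned by the model checker for \neg\varphi is then a
        % witness (a satisfying computation) for \varphi.
        \varphi \ \text{satisfiable} \iff M_U \nvDash \neg\varphi .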

  15. Lifelong learning as an instrument for human capital development in Benin

    NASA Astrophysics Data System (ADS)

    Biao, Idowu

    2015-10-01

    A review of the Benin education system shows that it is still heavily school-based. Yet, a high level of wastage is currently being recorded at school level (about 50% success rate at primary level, about 40% success rate at high school level and about 1% enrolment rate of qualified candidates and success rate at tertiary level), leading to the unintentional creation of a large population of unskilled and unproductive youths and adults. Integrated education systems which hold great potential and opportunities for both initial and continuing education remain hardly explored and virtually untapped. Yet, the challenges of the 21st century are such that only the unveiling and continuous cultivation of multi-faceted human capital can help individual citizens lead both a productive and fulfilled life. Formal education alone or non-formal education alone, irrespective of how well each is delivered, is no longer sufficient in facing up to the multifarious challenges of the 21st century. If education is to serve Benin beneficially in this century, the current national system of education must be reoriented to free up citizens' human capital through the implementation of an integrated educational system. This article proposes a new national education system which is rooted in the concept of lifelong learning and combines formal and non-formal systems of education for Benin.

  16. Formal Requirements-Based Programming for Complex Systems

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis

    2005-01-01

    Computer science as a field has not yet produced a general method to mechanically transform complex computer system requirements into a provably equivalent implementation. Such a method would be one major step towards dealing with complexity in computing, yet it remains the elusive holy grail of system development. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that such tools and methods leave unfilled is that the formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of complex systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations. While other techniques are available, this method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. We illustrate the application of the method to an example procedure from the Hubble Robotic Servicing Mission currently under study and preliminary formulation at NASA Goddard Space Flight Center.

  17. A formal approach to validation and verification for knowledge-based control systems

    NASA Technical Reports Server (NTRS)

    Castore, Glen

    1987-01-01

    As control systems become more complex in response to desires for greater system flexibility, performance and reliability, the promise is held out that artificial intelligence might provide the means for building such systems. An obstacle to the use of symbolic processing constructs in this domain is the need for verification and validation (V and V) of the systems. Techniques currently in use do not seem appropriate for knowledge-based software. An outline of a formal approach to V and V for knowledge-based control systems is presented.

  18. Extracting observables from lattice data in the three-particle sector

    NASA Astrophysics Data System (ADS)

    Rusetsky, Akaki; Hammer, Hans-Werner; Pang, Jin-Yi

    2018-03-01

    The three-particle quantization condition is derived, using the particle-dimer picture in the non-relativistic effective field theory. The procedure for the extraction of various observables in the three-particle sector (the particle-dimer scattering amplitudes, breakup amplitudes, etc.) from the finite-volume lattice spectrum is discussed in detail. As an illustration of the general formalism, the expression for the finite-volume energy shift of the three-body bound-state in the unitary limit is re-derived. The role of the three-body force, which is essential for the renormalization, is highlighted, and the extension of the result beyond the unitary limit is studied. Comparison with other approaches, known in the literature, is carried out.

  19. Cosmology from group field theory formalism for quantum gravity.

    PubMed

    Gielen, Steffen; Oriti, Daniele; Sindoni, Lorenzo

    2013-07-19

    We identify a class of condensate states in the group field theory (GFT) formulation of quantum gravity that can be interpreted as macroscopic homogeneous spatial geometries. We then extract the dynamics of such condensate states directly from the fundamental quantum GFT dynamics, following the procedure used in ordinary quantum fluids. The effective dynamics is a nonlinear and nonlocal extension of quantum cosmology. We also show that any GFT model with a kinetic term of Laplacian type gives rise, in a semiclassical (WKB) approximation and in the isotropic case, to a modified Friedmann equation. This is the first concrete, general procedure for extracting an effective cosmological dynamics directly from a fundamental theory of quantum geometry.

  20. Classification systems in nursing: formalizing nursing knowledge and implications for nursing information systems.

    PubMed

    Goossen, W T; Epping, P J; Abraham, I L

    1996-03-01

    The development of nursing information systems (NIS) is often hampered by the fact that nursing lacks a unified nursing terminology and classification system. Currently there exist various initiatives in this area. We address the question as to how current initiatives in the development of nursing terminology and classification systems can contribute towards the development of NIS. First, the rationale behind the formalization of nursing knowledge is discussed. Next, using a framework for nursing information processing, the most important developments in the field of nursing on formalization, terminology and classification are critically reviewed. The initiatives discussed include nursing terminology projects in several countries, and the International Classification of Nursing Practice. Suggestions for further developments in the area are discussed. Finally, implications for NIS are presented, as well as the relationships of these components to other sections of an integrated computerized patient record.

  1. Experience Using Formal Methods for Specifying a Multi-Agent System

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Rash, James; Hinchey, Michael; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    The process and results of using formal methods to specify the Lights Out Ground Operations System (LOGOS) is presented in this paper. LOGOS is a prototype multi-agent system developed to show the feasibility of providing autonomy to satellite ground operations functions at NASA Goddard Space Flight Center (GSFC). After the initial implementation of LOGOS the development team decided to use formal methods to check for race conditions, deadlocks and omissions. The specification exercise revealed several omissions as well as race conditions. After completing the specification, the team concluded that certain tools would have made the specification process easier. This paper gives a sample specification of two of the agents in the LOGOS system and examples of omissions and race conditions found. It concludes with describing an architecture of tools that would better support the future specification of agents and other concurrent systems.

  2. Petri Nets - A Mathematical Formalism to Analyze Chemical Reaction Networks.

    PubMed

    Koch, Ina

    2010-12-17

    In this review we introduce and discuss Petri nets - a mathematical formalism to describe and analyze chemical reaction networks. Petri nets were developed to describe concurrency in general systems. Most applications are found in technical and financial systems, but for about twenty years Petri nets have also been used in systems biology to model biochemical systems. This review aims to give a short informal introduction to the basic formalism illustrated by a chemical example, and to discuss possible applications to the analysis of chemical reaction networks, including cheminformatics. We give a short overview of qualitative as well as quantitative Petri net modeling techniques useful in systems biology, summarizing the state-of-the-art in that field and providing the main literature references. Finally, we discuss advantages and limitations of Petri nets and give an outlook on further development. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
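
    As a minimal illustration of the Petri-net view of a reaction network (places as species holding tokens, transitions as reactions with pre- and post-stoichiometry), here is a short sketch in Python. The example reaction and all names are chosen purely for illustration and are not taken from the review.

        # Toy reaction network as a Petri net: places are species, transitions are reactions.
        # Illustrative reaction: 2 H2 + O2 -> 2 H2O
        marking = {"H2": 4, "O2": 2, "H2O": 0}                     # tokens = molecule counts
        transitions = {
            "combustion": ({"H2": 2, "O2": 1}, {"H2O": 2}),        # (pre, post) stoichiometry
        }

        def enabled(marking, pre):
            """A transition is enabled if every input place holds enough tokens."""
            return all(marking[p] >= n for p, n in pre.items())

        def fire(marking, pre, post):
            """Firing consumes the pre-tokens and produces the post-tokens."""
            m = dict(marking)
            for p, n in pre.items():
                m[p] -= n
            for p, n in post.items():
                m[p] = m.get(p, 0) + n
            return m

        while enabled(marking, transitions["combustion"][0]):
            marking = fire(marking, *transitions["combustion"])
        print(marking)   # {'H2': 0, 'O2': 0, 'H2O': 4}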

  3. Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified

    NASA Technical Reports Server (NTRS)

    Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.

    2005-01-01

    Automated planning systems (APS) are gaining acceptance for use on NASA missions, as evidenced by APS flown on missions such as Orbiter and Deep Space 1, both of which were commanded by onboard planning systems. The planning system takes high-level goals and expands them onboard into a detailed plan of action that the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage, which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN, and express the properties of desirable plans formally.

  4. A Survey of Formal Methods for Intelligent Swarms

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Rash, James; Hinchey, Mike; Rouff, Chrustopher A.

    2004-01-01

    Swarms of intelligent autonomous spacecraft, involving complex behaviors and interactions, are being proposed for future space exploration missions. Such missions provide greater flexibility and offer the possibility of gathering more science data than traditional single spacecraft missions. The emergent properties of swarms make these missions powerful, but simultaneously far more difficult to design, and to assure that the proper behaviors will emerge. These missions are also considerably more complex than previous types of missions, and NASA, like other organizations, has little experience in developing or in verifying and validating these types of missions. A significant challenge when verifying and validating swarms of intelligent interacting agents is how to determine that the possible exponential interactions and emergent behaviors are producing the desired results. Assuring correct behavior and interactions of swarms will be critical to mission success. The Autonomous Nano Technology Swarm (ANTS) mission is an example of one of the swarm types of missions NASA is considering. The ANTS mission will use a swarm of picospacecraft that will fly from Earth orbit to the Asteroid Belt. Using an insect colony analogy, ANTS will be composed of specialized workers for asteroid exploration. Exploration would consist of cataloguing the mass, density, morphology, and chemical composition of the asteroids, including any anomalous concentrations of specific minerals. To perform this task, ANTS would carry miniaturized instruments, such as imagers, spectrometers, and detectors. Since ANTS and other similar missions are going to consist of autonomous spacecraft that may be out of contact with the earth for extended periods of time, and have low bandwidths due to weight constraints, it will be difficult to observe improper behavior and to correct any errors after launch. Providing V&V (verification and validation) for this type of mission is new to NASA, and represents the cutting edge in system correctness, and requires higher levels of assurance than other (traditional) missions that use a single or small number of spacecraft that are deterministic in nature and have near continuous communication access. One of the highest possible levels of assurance comes from the application of formal methods. Formal methods are mathematics-based tools and techniques for specifying and verifying (software and hardware) systems. They are particularly useful for specifying complex parallel systems, such as exemplified by the ANTS mission, where the entire system is difficult for a single person to fully understand, a problem that is multiplied with multiple developers. Once written, a formal specification can be used to prove properties of a system (e.g., the underlying system will go from one state to another or not into a specific state) and check for particular types of errors (e.g., race or livelock conditions). A formal specification can also be used as input to a model checker for further validation. This report gives the results of a survey of formal methods techniques for verification and validation of space missions that use swarm technology. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft using the ANTS mission as an example system. This report is the first result of the project to determine formal approaches that are promising for formally specifying swarm-based systems. 
    From this survey, the most promising approaches were selected and are discussed relative to their possible application to the ANTS mission. Future work will include the application of an integrated approach, based on the selected approaches identified in this report, to the formal specification of the ANTS mission.

  5. Signed language and human action processing: evidence for functional constraints on the human mirror-neuron system.

    PubMed

    Corina, David P; Knapp, Heather Patterson

    2008-12-01

    In the quest to further understand the neural underpinning of human communication, researchers have turned to studies of naturally occurring signed languages used in Deaf communities. The comparison of the commonalities and differences between spoken and signed languages provides an opportunity to determine core neural systems responsible for linguistic communication independent of the modality in which a language is expressed. The present article examines such studies, and in addition asks what we can learn about human languages by contrasting formal visual-gestural linguistic systems (signed languages) with more general human action perception. To understand visual language perception, it is important to distinguish the demands of general human motion processing from the highly task-dependent demands associated with extracting linguistic meaning from arbitrary, conventionalized gestures. This endeavor is particularly important because theorists have suggested close homologies between perception and production of actions and functions of human language and social communication. We review recent behavioral, functional imaging, and neuropsychological studies that explore dissociations between the processing of human actions and signed languages. These data suggest incomplete overlap between the mirror-neuron systems proposed to mediate human action and language.

  6. A brief overview of NASA Langley's research program in formal methods

    NASA Technical Reports Server (NTRS)

    1992-01-01

    An overview of NASA Langley's research program in formal methods is presented. The major goal of this work is to bring formal methods technology to a sufficiently mature level for use by the United States aerospace industry. Towards this goal, work is underway to design and formally verify a fault-tolerant computing platform suitable for advanced flight control applications. Also, several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of six NASA civil servants and contractors from Boeing Military Aircraft Company, Computational Logic Inc., Odyssey Research Associates, SRI International, University of California at Davis, and Vigyan Inc.

  7. Formalization of an environmental model using formal concept analysis - FCA

    NASA Astrophysics Data System (ADS)

    Bourdon-García, Rubén D.; Burgos-Salcedo, Javier D.

    2016-08-01

    Nowadays, there is a pressing need for novel strategies for analysing social-ecological systems in order to resolve global sustainability problems. The main purpose of this paper is to apply formal concept analysis to formalize the theory of Augusto Ángel Maya, who was without a doubt one of the most important environmental philosophers in South America; Ángel Maya proposed and established that Ecosystem-Culture relations, rather than Human-Nature ones, are determinant in our understanding and management of natural resources. Based on this, a concept lattice, formal concepts, subconcept-superconcept relations, partially ordered sets, the supremum and infimum of the lattice, and implications between attributes (the Duquenne-Guigues base) were determined for the ecosystem-culture relations.
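
    For readers new to formal concept analysis, the core objects named above (formal concepts as extent/intent pairs closed under the derivation operators) can be computed by brute force for a small context; the toy context below is invented for illustration and is not the paper's ecosystem-culture data.

        from itertools import combinations

        # Toy formal context: objects (rows) and the attributes each object has.
        context = {
            "forest": {"ecosystem", "self-regulating"},
            "farm":   {"ecosystem", "managed"},
            "city":   {"cultural", "managed"},
        }
        attributes = sorted({a for attrs in context.values() for a in attrs})

        def objects_of(attr_set):        # B -> B': objects sharing all attributes in B
            return {o for o, attrs in context.items() if attr_set <= attrs}

        def attributes_of(obj_set):      # A -> A': attributes shared by all objects in A
            if not obj_set:
                return set(attributes)
            return set.intersection(*(context[o] for o in obj_set))

        # Every formal concept (extent, intent) arises as the closure of some attribute subset.
        concepts = set()
        for r in range(len(attributes) + 1):
            for combo in combinations(attributes, r):
                extent = objects_of(set(combo))
                intent = attributes_of(extent)
                concepts.add((frozenset(extent), frozenset(intent)))

        for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
            print(sorted(extent), sorted(intent))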

  8. From Classical to Quantum: New Canonical Tools for the Dynamics of Gravity

    NASA Astrophysics Data System (ADS)

    Höhn, P. A.

    2012-05-01

    In a gravitational context, canonical methods offer an intuitive picture of the dynamics and simplify an identification of the degrees of freedom. Nevertheless, extracting dynamical information from background independent approaches to quantum gravity is a highly non-trivial challenge. In this thesis, the conundrum of (quantum) gravitational dynamics is approached from two different directions by means of new canonical tools. This thesis is accordingly divided into two parts: In the first part, a general canonical formalism for discrete systems featuring a variational action principle is developed which is equivalent to the covariant formulation following directly from the action. This formalism can handle evolving phase spaces and is thus appropriate for describing evolving lattices. Attention will be devoted to a characterization of the constraints, symmetries and degrees of freedom appearing in such discrete systems which, in the case of evolving phase spaces, is time step dependent. The advantage of this formalism is that it does not depend on the particular discretization and, hence, is suitable for coarse graining procedures. This formalism is applicable to discrete mechanics, lattice field theories and discrete gravity models---underlying some approaches to quantum gravity---and, furthermore, may prove useful for numerical implementations. For concreteness, these new tools are employed to formulate Regge Calculus canonically as a theory of the dynamics of discrete hypersurfaces in discrete spacetimes, thereby removing a longstanding obstacle to connecting covariant simplicial gravity models with canonical frameworks. This result is interesting in view of several background independent approaches to quantum gravity. In addition, perturbative expansions around symmetric background solutions of Regge Calculus are studied up to second order. Background gauge modes generically become propagating at second order as a consequence of a symmetry breaking. In the second part of this thesis, the paradigm of relational dynamics is considered. Dynamical observables in gravity are relational. Unfortunately, their construction and evaluation is notoriously difficult, especially in the quantum theory. An effective canonical framework is devised which permits evaluation of the semiclassical relational dynamics of constrained quantum systems by sidestepping technical problems associated with explicit constructions of physical Hilbert spaces. This effective approach is well-geared for addressing the concept of relational evolution in general quantum cosmological models since it (i) allows one to depart from idealized relational `clock references’ and, instead, to employ generic degrees of freedom as imperfect relational `clocks’, (ii) enables one to systematically switch between different such `clocks’ and (iii) yields a consistent (temporally) local time evolution with transient observables so long as semiclassicality holds. These techniques are illustrated by toy models and, finally, are applied to a non-integrable cosmological model. It is argued that relational evolution is generically only a transient and semiclassical phenomenon.

  9. Formulation of state projected centroid molecular dynamics: Microcanonical ensemble and connection to the Wigner distribution.

    PubMed

    Orr, Lindsay; Hernández de la Peña, Lisandro; Roy, Pierre-Nicholas

    2017-06-07

    A derivation of quantum statistical mechanics based on the concept of a Feynman path centroid is presented for the case of generalized density operators using the projected density operator formalism of Blinov and Roy [J. Chem. Phys. 115, 7822-7831 (2001)]. The resulting centroid densities, centroid symbols, and centroid correlation functions are formulated and analyzed in the context of the canonical equilibrium picture of Jang and Voth [J. Chem. Phys. 111, 2357-2370 (1999)]. The case where the density operator projects onto a particular energy eigenstate of the system is discussed, and it is shown that one can extract microcanonical dynamical information from double Kubo transformed correlation functions. It is also shown that the proposed projection operator approach can be used to formally connect the centroid and Wigner phase-space distributions in the zero reciprocal temperature β limit. A Centroid Molecular Dynamics (CMD) approximation to the state-projected exact quantum dynamics is proposed and proven to be exact in the harmonic limit. The state projected CMD method is also tested numerically for a quartic oscillator and a double-well potential and found to be more accurate than canonical CMD. In the case of a ground state projection, this method can resolve tunnelling splittings of the double well problem in the higher barrier regime where canonical CMD fails. Finally, the state-projected CMD framework is cast in a path integral form.

  10. Formulation of state projected centroid molecular dynamics: Microcanonical ensemble and connection to the Wigner distribution

    NASA Astrophysics Data System (ADS)

    Orr, Lindsay; Hernández de la Peña, Lisandro; Roy, Pierre-Nicholas

    2017-06-01

    A derivation of quantum statistical mechanics based on the concept of a Feynman path centroid is presented for the case of generalized density operators using the projected density operator formalism of Blinov and Roy [J. Chem. Phys. 115, 7822-7831 (2001)]. The resulting centroid densities, centroid symbols, and centroid correlation functions are formulated and analyzed in the context of the canonical equilibrium picture of Jang and Voth [J. Chem. Phys. 111, 2357-2370 (1999)]. The case where the density operator projects onto a particular energy eigenstate of the system is discussed, and it is shown that one can extract microcanonical dynamical information from double Kubo transformed correlation functions. It is also shown that the proposed projection operator approach can be used to formally connect the centroid and Wigner phase-space distributions in the zero reciprocal temperature β limit. A Centroid Molecular Dynamics (CMD) approximation to the state-projected exact quantum dynamics is proposed and proven to be exact in the harmonic limit. The state projected CMD method is also tested numerically for a quartic oscillator and a double-well potential and found to be more accurate than canonical CMD. In the case of a ground state projection, this method can resolve tunnelling splittings of the double well problem in the higher barrier regime where canonical CMD fails. Finally, the state-projected CMD framework is cast in a path integral form.

  11. Teaching Some Informatics Concepts Using Formal System

    ERIC Educational Resources Information Center

    Yang, Sojung; Park, Seongbin

    2014-01-01

    There are many important issues in informatics and many agree that algorithms and programming are most important issues that need to be included in informatics education (Dagiene and Jevsikova, 2012). In this paper, we propose how some of these issues can be easily taught using the notion of a formal system which consists of axioms and inference…

  12. Literate Specification: Using Design Rationale To Support Formal Methods in the Development of Human-Machine Interfaces.

    ERIC Educational Resources Information Center

    Johnson, Christopher W.

    1996-01-01

    The development of safety-critical systems (aircraft cockpits and reactor control rooms) is qualitatively different from that of other interactive systems. These differences impose burdens on design teams that must ensure the development of human-machine interfaces. Analyzes strengths and weaknesses of formal methods for the design of user…

  13. Relationship between the work of teachers in nonformal settings and in schools

    NASA Astrophysics Data System (ADS)

    Yoloye, E. Ayotunde

    1987-09-01

    Formal and nonformal education differ more in strategies and administration than in content. This is not to say, however, that there are no important distinctions in the nature of the content of what is usually done under formal education, especially when dealing with particular professions and vocations. Recent efforts to integrate formal and nonformal education, especially since publication of the report of the International Commission on the Development of Education, have highlighted certain challenges in defining the role of the teacher, especially in the nonformal sector and in the operation of an integrated system. Examples of efforts at integration may be found in the community schools in many Eastern African countries, the mosque schools or 'maktabs' in Pakistan, the N.A.E.P. in India, the Vocational Skills Improvement unit in Nigeria and the various extramural and extension programmes of tertiary institutions. Among the major implications for a new orientation of teachers are (1) the issue of mobility of individuals between the formal and nonformal systems, (2) the issue of integrating the administration of formal and nonformal education, (3) the issue of appropriate strategies for teacher training and (4) the issue of creating new cadres of teachers besides those currently trained in conventional teachers' colleges. Among the embedded challenges are those of evolving new assessment procedures and establishing equivalences between practical experience and formal academic instruction. The educational system as a whole still has a considerable way to go in meeting these challenges.

  14. Reformulating the Schrödinger equation as a Shabat-Zakharov system

    NASA Astrophysics Data System (ADS)

    Boonserm, Petarpa; Visser, Matt

    2010-02-01

    We reformulate the second-order Schrödinger equation as a set of two coupled first-order differential equations, a so-called "Shabat-Zakharov system" (sometimes called a "Zakharov-Shabat" system). There is considerable flexibility in this approach, and we emphasize the utility of introducing an "auxiliary condition" or "gauge condition" that is used to cut down the degrees of freedom. Using this formalism, we derive the explicit (but formal) general solution to the Schrödinger equation. The general solution depends on three arbitrarily chosen functions, and a path-ordered exponential matrix. If one considers path ordering to be an "elementary" process, then this represents complete quadrature, albeit formal, of the second-order linear ordinary differential equation.
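
    For orientation only (this is the generic reduction of a second-order linear ODE to a first-order system, given as an illustration rather than the exact parametrization used in the paper), the time-independent Schrödinger equation

      \[ \frac{d^2 u}{dx^2} + k^2(x)\, u = 0, \qquad k^2(x) = \frac{2m\,[E - V(x)]}{\hbar^2}, \]

    can be rewritten using the ansatz $u(x) = a(x)\, e^{+i\varphi(x)} + b(x)\, e^{-i\varphi(x)}$ together with an auxiliary (gauge) condition such as

      \[ a'(x)\, e^{+i\varphi(x)} + b'(x)\, e^{-i\varphi(x)} = 0, \]

    which trades the single second-order equation for two coupled first-order equations in the amplitudes $a(x)$ and $b(x)$; different choices of $\varphi(x)$ correspond to different gauges.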

  15. Formal functional test designs with a test representation language

    NASA Technical Reports Server (NTRS)

    Hops, J. M.

    1993-01-01

    The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
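
    To make the category-partition idea concrete (a generic sketch in Python, not the cited tool or its test representation language; the operation, categories, choices, and constraint are invented for illustration), test frames can be generated as the constrained cross product of the choices in each category:

      import itertools

      # Hypothetical categories and partitioned choices for a "copy file" operation.
      categories = {
          "source_size": ["empty", "small", "large"],
          "destination": ["exists", "missing"],
          "permissions": ["read_write", "read_only"],
      }

      def violates_constraints(frame):
          # Illustrative constraint: overwriting an existing, read-only destination is excluded.
          return frame["destination"] == "exists" and frame["permissions"] == "read_only"

      names = list(categories)
      frames = [
          dict(zip(names, combo))
          for combo in itertools.product(*(categories[n] for n in names))
          if not violates_constraints(dict(zip(names, combo)))
      ]

      for i, frame in enumerate(frames, 1):
          print(f"test case {i}: {frame}")

    The constraint step is what reduces the full combinatorial space to the minimum logical subset of test cases that the method aims for.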

  16. Multilingual Content Extraction Extended with Background Knowledge for Military Intelligence

    DTIC Science & Technology

    2011-06-01

    Extracted content is formalized and extended with background knowledge (WordNet [Fel98], YAGO [SKW08]) so that new conclusions (logical inferences) can be drawn; theorem proving is used for this purpose. Pipeline components mentioned in the record include an MRS-to-FOLE transformation of the extracted formulas, external knowledge sources (WordNet, OpenCyc, YAGO), and background knowledge axioms feeding the logical calculation.

  17. Stochastic Representation of Chaos Using Terminal Attractors

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2006-01-01

    A nonlinear version of the Liouville equation based on terminal attractors is part of a mathematical formalism for describing postinstability motions of dynamical systems characterized by exponential divergences of trajectories leading to chaos (including turbulence as a form of chaos). The formalism can be applied to both conservative systems (e.g., multibody systems in celestial mechanics) and dissipative systems (e.g., viscous fluids). The development of the present formalism was undertaken in an effort to remove positive Lyapunov exponents. The means chosen to accomplish this is coupling of the governing dynamical equations with the corresponding Liouville equation that describes the evolution of the flow of error probability. The underlying idea is to suppress the divergences of different trajectories that correspond to different initial conditions, without affecting a target trajectory, which is one that starts with prescribed initial conditions.

  18. European Train Control System: A Case Study in Formal Verification

    NASA Astrophysics Data System (ADS)

    Platzer, André; Quesel, Jan-David

    Complex physical systems have several degrees of freedom. They only work correctly when their control parameters obey corresponding constraints. Based on the informal specification of the European Train Control System (ETCS), we design a controller for its cooperation protocol. For its free parameters, we successively identify constraints that are required to ensure collision freedom. We formally prove the parameter constraints to be sharp by characterizing them equivalently in terms of reachability properties of the hybrid system dynamics. Using our deductive verification tool KeYmaera, we formally verify controllability, safety, liveness, and reactivity properties of the ETCS protocol that entail collision freedom. We prove that the ETCS protocol remains correct even in the presence of perturbation by disturbances in the dynamics. We verify that safety is preserved when a PI controlled speed supervision is used.
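
    As a purely illustrative example of the kind of parameter constraint involved (a textbook kinematic bound assuming a constant braking deceleration $b$; the constraints actually derived and proved in the paper are sharper and also account for reaction delays and disturbances), a train moving at speed $v$ can only avoid overshooting the end of its movement authority if its remaining distance $d$ satisfies

      \[ d \ge \frac{v^2}{2b}, \]

    i.e., full braking must bring the train to rest before the authority limit; controllability, safety, and liveness are then phrased as reachability properties relative to such bounds.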

  19. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 2: Formal specification and correctness theorems

    NASA Technical Reports Server (NTRS)

    Bickford, Mark; Srivas, Mandayam

    1991-01-01

    Presented here is a formal specification and verification of a property of a quadruplicately redundant fault tolerant microprocessor system design. A complete listing of the formal specification of the system and the correctness theorems that are proved are given. The system performs the task of obtaining interactive consistency among the processors using a special instruction on the processors. The design is based on an algorithm proposed by Pease, Shostak, and Lamport. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold; the verification was carried out using a computer-aided design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault tolerant hardware design that is mechanically verified by a theorem prover.

  20. Landau's statistical mechanics for quasi-particle models

    NASA Astrophysics Data System (ADS)

    Bannur, Vishnu M.

    2014-04-01

    Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for pressure and develops all of thermodynamics. It is a general formalism and consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)]. In Pathria's formalism, one starts from the expression for energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, which are wrongly called the thermodynamic consistency relation, we recover the other formalism for quasi-particle systems, as in M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.
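
    For orientation (standard thermodynamic identities at zero chemical potential, not anything specific to the quasi-particle model), starting from the pressure $P(T)$ the remaining thermodynamic functions follow as

      \[ s(T) = \frac{dP}{dT}, \qquad \varepsilon(T) = T\,s(T) - P(T), \qquad c_V(T) = \frac{d\varepsilon}{dT}, \]

    whereas a Pathria-style treatment starts instead from $\varepsilon(T)$ and integrates to obtain the rest.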

  1. Addressing software security and mitigations in the life cycle

    NASA Technical Reports Server (NTRS)

    Gilliam, David; Powell, John; Haugh, Eric; Bishop, Matt

    2003-01-01

    Traditionally, security is viewed as an organizational and Information Technology (IT) systems function comprising firewalls, intrusion detection systems (IDS), system security settings and patches to the operating system (OS) and applications running on it. Until recently, little thought has been given to the importance of security as a formal approach in the software life cycle. The Jet Propulsion Laboratory has approached the problem through the development of an integrated formal Software Security Assessment Instrument (SSAI) with six foci for the software life cycle.

  2. Addressing software security and mitigations in the life cycle

    NASA Technical Reports Server (NTRS)

    Gilliam, David; Powell, John; Haugh, Eric; Bishop, Matt

    2004-01-01

    Traditionally, security is viewed as an organizational and Information Technology (IT) systems function comprising firewalls, intrusion detection systems (IDS), system security settings and patches to the operating system (OS) and applications running on it. Until recently, little thought has been given to the importance of security as a formal approach in the software life cycle. The Jet Propulsion Laboratory has approached the problem through the development of an integrated formal Software Security Assessment Instrument (SSAI) with six foci for the software life cycle.

  3. A Formal Modelling Language Extending SysML for Simulation of Continuous and Discrete System

    DTIC Science & Technology

    2012-11-01

    (DSTO-GD-0734) ...be conceptual at some level because a one-to-one mapping with the real system will never exist. SysML is an extension and modification of UML that ...simulation, which can provide great insights into the behaviour of complex systems. Although UML and SysML primarily support conceptual modelling, they...

  4. Specification and simulation of behavior of the Continuous Infusion Insulin Pump system.

    PubMed

    Babamir, Seyed Morteza; Dehkordi, Mehdi Borhani

    2014-01-01

    The Continuous Infusion Insulin Pump (CIIP) system is responsible for monitoring a diabetic patient's blood sugar. In this paper, we aim to specify and simulate the behavior of the CIIP software. To this end, we (1) presented a model of the CIIP system behavior in response to the behavior of its environment (the diabetic patient) and (2) formally defined the safety requirements of the system environment in the Z formal modeling language; such requirements should be satisfied by the CIIP software. Finally, we programmed the model and the requirements.

  5. Module Extraction for Efficient Object Queries over Ontologies with Large ABoxes

    PubMed Central

    Xu, Jia; Shironoshita, Patrick; Visser, Ubbo; John, Nigel; Kabuka, Mansur

    2015-01-01

    The extraction of logically-independent fragments out of an ontology ABox can be useful for solving the tractability problem of querying ontologies with large ABoxes. In this paper, we propose a formal definition of an ABox module, such that it guarantees complete preservation of facts about a given set of individuals, and thus can be reasoned independently w.r.t. the ontology TBox. With ABox modules of this type, isolated or distributed (parallel) ABox reasoning becomes feasible, and more efficient data retrieval from ontology ABoxes can be attained. To compute such an ABox module, we present a theoretical approach and also an approximation for SHIQ ontologies. Evaluation of the module approximation on different types of ontologies shows that, on average, extracted ABox modules are significantly smaller than the entire ABox, and the time for ontology reasoning based on ABox modules can be improved significantly. PMID:26848490
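
    A naive connectivity-based approximation of the idea (a Python sketch only, not the SHIQ-aware algorithm the paper proposes; the toy ABox, individuals, and roles are invented for illustration):

      from collections import defaultdict, deque

      # Toy ABox: concept assertions per individual and role assertions between individuals.
      concept_assertions = {"alice": ["Student"], "bob": ["Professor"], "carol": ["Student"]}
      role_assertions = [("alice", "supervisedBy", "bob"), ("carol", "friendOf", "alice")]

      def extract_abox_module(seed_individuals):
          """Collect all assertions reachable from the seed individuals via role assertions."""
          neighbours = defaultdict(set)
          for s, _, o in role_assertions:
              neighbours[s].add(o)
              neighbours[o].add(s)
          visited, queue = set(seed_individuals), deque(seed_individuals)
          while queue:
              ind = queue.popleft()
              for nxt in neighbours[ind]:
                  if nxt not in visited:
                      visited.add(nxt)
                      queue.append(nxt)
          module_concepts = {i: concept_assertions.get(i, []) for i in visited}
          module_roles = [t for t in role_assertions if t[0] in visited and t[2] in visited]
          return module_concepts, module_roles

      print(extract_abox_module({"alice"}))

    A real module extractor must additionally account for the TBox (e.g., role hierarchies and inverse roles in SHIQ) so that completeness of reasoning over the module is preserved, which is what the paper's formal definition guarantees.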

  6. Extraction processes and solvents for recovery of cesium, strontium, rare earth elements, technetium and actinides from liquid radioactive waste

    DOEpatents

    Zaitsev, Boris N.; Esimantovskiy, Vyacheslav M.; Lazarev, Leonard N.; Dzekun, Evgeniy G.; Romanovskiy, Valeriy N.; Todd, Terry A.; Brewer, Ken N.; Herbst, Ronald S.; Law, Jack D.

    2001-01-01

    Cesium and strontium are extracted from aqueous acidic radioactive waste containing rare earth elements, technetium and actinides, by contacting the waste with a composition of a complex organoboron compound and polyethylene glycol in an organofluorine diluent mixture. In a preferred embodiment the complex organoboron compound is chlorinated cobalt dicarbollide, the polyethylene glycol has the formula RC₆H₄(OCH₂CH₂)ₙOH, and the organofluorine diluent is a mixture of bis-tetrafluoropropyl ether of diethylene glycol with at least one of bis-tetrafluoropropyl ether of ethylene glycol and bis-tetrafluoropropyl formal. The rare earths, technetium and the actinides (especially uranium, plutonium and americium), are extracted from the aqueous phase using a phosphine oxide in a hydrocarbon diluent, and reextracted from the resulting organic phase into an aqueous phase by using a suitable strip reagent.

  7. Formalizing Knowledge in Multi-Scale Agent-Based Simulations

    PubMed Central

    Somogyi, Endre; Sluka, James P.; Glazier, James A.

    2017-01-01

    Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements causing their interactions to dynamically change. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent in human readable and reusable form the domain specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused. PMID:29338063

  8. Formalizing Knowledge in Multi-Scale Agent-Based Simulations.

    PubMed

    Somogyi, Endre; Sluka, James P; Glazier, James A

    2016-10-01

    Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements causing their interactions to dynamically change. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent in human readable and reusable form the domain specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused.

  9. Nuclear deformation in the laboratory frame

    NASA Astrophysics Data System (ADS)

    Gilbreth, C. N.; Alhassid, Y.; Bertsch, G. F.

    2018-01-01

    We develop a formalism for calculating the distribution of the axial quadrupole operator in the laboratory frame within the rotationally invariant framework of the configuration-interaction shell model. The calculation is carried out using a finite-temperature auxiliary-field quantum Monte Carlo method. We apply this formalism to isotope chains of even-mass samarium and neodymium nuclei and show that the quadrupole distribution provides a model-independent signature of nuclear deformation. Two technical advances are described that greatly facilitate the calculations. The first is to exploit the rotational invariance of the underlying Hamiltonian to reduce the statistical fluctuations in the Monte Carlo calculations. The second is to determine quadrupole invariants from the distribution of the axial quadrupole operator in the laboratory frame. This allows us to extract effective values of the intrinsic quadrupole shape parameters without invoking an intrinsic frame or a mean-field approximation.

  10. Formal Modeling and Analysis of a Preliminary Small Aircraft Transportation System (SATS)Concept

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.; Gottliebsen, Hanne; Butler, Ricky; Kalvala, Sara

    2004-01-01

    New concepts for automating air traffic management functions at small non-towered airports raise serious safety issues associated with the software implementations and their underlying key algorithms. The criticality of such software systems necessitates that strong guarantees of safety be developed for them. In this paper we present a formal method for modeling and verifying such systems using the PVS theorem proving system. The method is demonstrated on a preliminary concept of operation for the Small Aircraft Transportation System (SATS) project at NASA Langley.

  11. Spin-dependent optimized effective potential formalism for open and closed systems

    NASA Astrophysics Data System (ADS)

    Rigamonti, S.; Horowitz, C. M.; Proetto, C. R.

    2015-12-01

    Orbital-based exchange (x) correlation (c) energy functionals, leading to the optimized effective potential (OEP) formalism of density-functional theory (DFT), are gaining increasing importance in ground-state DFT, as applied to the calculation of the electronic structure of closed systems with a fixed number of particles, such as atoms and molecules. These types of functionals also prove to be extremely valuable for dealing with solid-state systems with reduced dimensionality, such as electrons trapped at the interface between two different semiconductors, or narrow metallic slabs. In both cases, the electrons build a quasi-two-dimensional electron gas, or Q2DEG. We provide here a general DFT-OEP formal scheme valid for Q2DEGs that are either isolated (closed) or in contact with a particle bath (open), and show that the two representations are equivalent, the choice of one or the other being essentially a question of convenience. Based on this equivalence, a calculation scheme is proposed which avoids the noninvertibility problem of the density response function for closed systems. We also consider the case of spontaneously spin-polarized Q2DEGs, and find that far from the region where the Q2DEG is localized, the exact x-only exchange potential approaches two different, spin-dependent asymptotic limits. Aside from these formal results, we also provide numerical results for a spin-polarized jellium slab, using the new OEP formalism for closed systems. The accuracy of the Krieger-Li-Iafrate approximation has also been tested for the same system, and found to be as good as it is for atoms and molecules.

  12. Model, analysis, and evaluation of the effects of analog VLSI arithmetic on linear subspace-based image recognition.

    PubMed

    Carvajal, Gonzalo; Figueroa, Miguel

    2014-07-01

    Typical image recognition systems operate in two stages: feature extraction to reduce the dimensionality of the input space, and classification based on the extracted features. Analog Very Large Scale Integration (VLSI) is an attractive technology to achieve compact and low-power implementations of these computationally intensive tasks for portable embedded devices. However, device mismatch limits the resolution of the circuits fabricated with this technology. Traditional layout techniques to reduce the mismatch aim to increase the resolution at the transistor level, without considering the intended application. Relating mismatch parameters to specific effects in the application level would allow designers to apply focalized mismatch compensation techniques according to predefined performance/cost tradeoffs. This paper models, analyzes, and evaluates the effects of mismatched analog arithmetic in both feature extraction and classification circuits. For the feature extraction, we propose analog adaptive linear combiners with on-chip learning for both the Least Mean Square (LMS) algorithm and the Generalized Hebbian Algorithm (GHA). Using mathematical abstractions of analog circuits, we identify mismatch parameters that are naturally compensated during the learning process, and propose cost-effective guidelines to reduce the effect of the rest. For the classification, we derive analog models for the circuits necessary to implement the Nearest Neighbor (NN) approach and Radial Basis Function (RBF) networks, and use them to emulate analog classifiers with standard databases of faces and handwritten digits. Formal analysis and experiments show how we can exploit adaptive structures and properties of the input space to compensate for the effects of device mismatch at the application level, thus reducing the design overhead of traditional layout techniques. The results are also directly extensible to multiple application domains using linear subspace methods.
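
    As a reference point for the adaptive linear combiner mentioned above (a plain floating-point LMS sketch with no model of analog mismatch; the function names, learning rate, and toy data are illustrative only):

      import numpy as np

      def lms_linear_combiner(x, d, mu=0.01):
          """Adapt weights w so that w @ x[t] tracks the desired signal d[t] (standard LMS)."""
          n_samples, n_features = x.shape
          w = np.zeros(n_features)
          for t in range(n_samples):
              y = w @ x[t]              # combiner output
              e = d[t] - y              # instantaneous error
              w += mu * e * x[t]        # gradient-descent weight update
          return w

      # Toy usage: recover a fixed mixing vector from noisy observations.
      rng = np.random.default_rng(0)
      true_w = np.array([0.5, -1.0, 2.0])
      X = rng.normal(size=(5000, 3))
      d = X @ true_w + 0.01 * rng.normal(size=5000)
      print(lms_linear_combiner(X, d))

    In the paper's setting the multiplications and additions above are carried out by analog circuits, so each arithmetic step acquires gain and offset errors; the learning loop is what absorbs part of those errors.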

  13. The generation of gravitational waves. I - Weak-field sources

    NASA Technical Reports Server (NTRS)

    Thorne, K. S.; Kovacs, S. J.

    1975-01-01

    This paper derives and summarizes a 'plug-in-and-grind' formalism for calculating the gravitational waves emitted by any system with weak internal gravitational fields. If the internal fields have negligible influence on the system's motions, the formalism reduces to standard 'linearized theory'. Independent of the effects of gravity on the motions, the formalism reduces to the standard 'quadrupole-moment formalism' if the motions are slow and internal stresses are weak. In the general case, the formalism expresses the radiation in terms of a retarded Green's function for slightly curved spacetime and breaks the Green's function integral into five easily understood pieces: direct radiation, produced directly by the motions of the source; whump radiation, produced by the 'gravitational stresses' of the source; transition radiation, produced by a time-changing time delay ('Shapiro effect') in the propagation of the nonradiative 1/r field of the source; focusing radiation, produced when one portion of the source focuses, in a time-dependent way, the nonradiative field of another portion of the source; and tail radiation, produced by 'back-scatter' of the nonradiative field in regions of focusing.

  14. Structuring Formal Control Systems Specifications for Reuse: Surviving Hardware Changes

    NASA Technical Reports Server (NTRS)

    Thompson, Jeffrey M.; Heimdahl, Mats P. E.; Erickson, Debra M.

    2000-01-01

    Formal capture and analysis of the required behavior of control systems have many advantages. For instance, it encourages rigorous requirements analysis, the required behavior is unambiguously defined, and we can assure that various safety properties are satisfied. Formal modeling is, however, a costly and time consuming process, and if one could reuse the formal models over a family of products, significant cost savings would be realized. In an ongoing project we are investigating how to structure state-based models to achieve a high level of reusability within product families. In this paper we discuss a high-level structure of requirements models that achieves reusability of the desired control behavior across varying hardware platforms in a product family. The structuring approach is demonstrated through a case study in the mobile robotics domain where the desired robot behavior is reused on two diverse platforms: one commercial mobile platform and one built in-house. We use our language RSML-e to capture the control behavior for reuse and our tool NIMBUS to demonstrate how the formal specification can be validated and used as a prototype on the two platforms.

  15. Exact Solution of the Two-Level System and the Einstein Solid in the Microcanonical Formalism

    ERIC Educational Resources Information Center

    Bertoldi, Dalia S.; Bringa, Eduardo M.; Miranda, E. N.

    2011-01-01

    The two-level system and the Einstein model of a crystalline solid are taught in every course of statistical mechanics and they are solved in the microcanonical formalism because the number of accessible microstates can be easily evaluated. However, their solutions are usually presented using the Stirling approximation to deal with factorials. In…
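
    For concreteness (the standard textbook expressions, not the exact closed forms derived in the article), the microcanonical entropy of $N$ two-level systems with $n$ excitations is

      \[ S = k_B \ln \Omega, \qquad \Omega = \binom{N}{n} = \frac{N!}{n!\,(N-n)!}, \]

    and applying the Stirling approximation $\ln m! \approx m\ln m - m$ yields the familiar mixing-entropy form

      \[ S \approx -N k_B \left[ x \ln x + (1-x)\ln(1-x) \right], \qquad x = n/N , \]

    which is the step the article revisits by treating the factorials exactly.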

  16. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems

    DTIC Science & Technology

    1994-07-29

    The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.

  17. Non-Lipschitz Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, M.; Meyers, R.

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED) - a special type of 'man-made' system designed to aid specific areas of information processing. A main objective is to demonstrate that the mathematical formalism for DED can be based upon the terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.
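
    As a standard illustration of what relaxing the Lipschitz condition buys (a generic terminal-attractor example, not a construction taken from the paper): the equation

      \[ \dot{x} = -x^{1/3} \]

    has a right-hand side whose derivative diverges at $x = 0$, so the Lipschitz condition fails there; a trajectory starting at $x_0 > 0$ reaches the equilibrium exactly at the finite time

      \[ t^{*} = \tfrac{3}{2}\, x_0^{2/3}, \]

    whereas a Lipschitz system can only approach an equilibrium asymptotically.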

  18. Word Meaning as a Palimpsest: A Defense of Sociocultural Theory

    ERIC Educational Resources Information Center

    Song, Seonmi; Kellogg, David

    2011-01-01

    Vygotsky's work on the acquisition of foreign language words has been criticized for lacking a formal view of language as a system and for taking little interest in questions such as the route and rate of language acquisition. We argue that word meanings really do not constitute a formal system, either in the way they develop, or in the way they…

  19. Formalism over Function: Compulsion, Courts, and the Rise of Educational Formalism in America, 1870-1930

    ERIC Educational Resources Information Center

    Hutt, Ethan L.

    2012-01-01

    Background/Context: Though the impact of the legal system in shaping public education over the last sixty years is unquestioned, scholars have largely overlooked the impact of the legal system on the early development and trajectory of public schools in America. Scholars have given particularly little attention to the period in the late nineteenth…

  20. The Role of Formal Education, Technical and Management Training on Information Systems (IS) Managers' Managerial Effectiveness as Perceived by Their Subordinates

    ERIC Educational Resources Information Center

    Ligon, Jerry; Abdullah, ABM; Talukder, Majharul

    2007-01-01

    This study examined the relationship between Information Systems (IS) managers' formal education, level of technical and managerial training and their managerial effectiveness as perceived by their subordinates. The study finds that there is a strong positive relationship between the amount of technical training IS managers have received and their…

  1. The Formal Elements Art Therapy Scale: A Measurement System for Global Variables in Art

    ERIC Educational Resources Information Center

    Gantt, Linda M.

    2009-01-01

    The Formal Elements Art Therapy Scale (FEATS) is a measurement system for applying numbers to global variables in two-dimensional art (drawing and painting). While it was originally developed for use with the single-picture assessment ("Draw a person picking an apple from a tree" [PPAT]), researchers can also apply many of the 14 scales of the…

  2. Need a University Adopt a Formal Environmental Management System?: Progress without an EMS at a Small University

    ERIC Educational Resources Information Center

    Spellerberg, Ian F.; Buchan, Graeme D.; Englefield, Russell

    2004-01-01

    What system does a university need to optimise its progress to sustainability? Discusses the gradation of approaches possible for a university as it strives to improve its environmental performance. Argues that an environmental policy plus mechanisms for its implementation can be adequate, and endorsement of a single formal EMS need not be…

  3. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  4. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  5. Probabilities for time-dependent properties in classical and quantum mechanics

    NASA Astrophysics Data System (ADS)

    Losada, Marcelo; Vanni, Leonardo; Laura, Roberto

    2013-05-01

    We present a formalism which allows one to define probabilities for expressions that involve properties at different times for classical and quantum systems and we study its lattice structure. The formalism is based on the notion of time translation of properties. In the quantum case, the properties involved should satisfy compatibility conditions in order to obtain well-defined probabilities. The formalism is applied to describe the double-slit experiment.

  6. The Information Processing Role of the Informal and Quasi-Formal Support Systems among the Hispanic Elderly: Implications for the Delivery of Formal Social Services.

    ERIC Educational Resources Information Center

    Starrett, Richard A.; And Others

    The study examined relationships among factors influencing utilization of social services by Hispanic elderly, particularly factors categorized as: (1) informal, such as support groups of family, kin, neighbors, friends, and (2) quasi-formal, such as church groups. Thirty-seven variables and data selected from a 1979-80 15-state survey of 1,805…

  7. The National Institute for Health Research Leadership Programme

    PubMed Central

    Jones, Molly Morgan; Wamae, Watu; Fry, Caroline Viola; Kennie, Tom; Chataway, Joanna

    2012-01-01

    RAND Europe evaluated the National Institute for Health Research (NIHR) Leadership Programme in an effort to help the English Department of Health consider the extent to which the programme has helped to foster NIHR's aims, extract lessons for the future, and develop plans for the next phase of the leadership programme. Successful delivery of high-quality health research requires not only an effective research base, but also a system of leadership supporting it. However, research leaders are not often given the opportunity, nor do they have the time, to attend formal leadership or management training programmes. This is unfortunate because research has shown that leadership training can have a hugely beneficial effect on an organisation. Therefore, the evaluation has a particular interest in understanding the role of the programme as a science policy intervention and will use its expertise in science policy analysis to consider this element alongside other, more traditional, measures of evaluation. PMID:28083231

  8. Comprehensive analyses of core-shell InGaN/GaN single nanowire photodiodes

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Guan, N.; Piazza, V.; Kapoor, A.; Bougerol, C.; Julien, F. H.; Babichev, A. V.; Cavassilas, N.; Bescond, M.; Michelini, F.; Foldyna, M.; Gautier, E.; Durand, C.; Eymery, J.; Tchernycheva, M.

    2017-12-01

    Single nitride nanowire core/shell n-p photodetectors are fabricated and analyzed. Nanowires consisting of an n-doped GaN stem, a radial InGaN/GaN multiple quantum well system and a p-doped GaN external shell were grown by catalyst-free metal-organic vapour phase epitaxy on sapphire substrates. Single nanowires were dispersed and the core and the shell regions were contacted with a metal and an ITO deposition, respectively, defined using electron beam lithography. The single wire photodiodes present a response in the visible to UV spectral range under zero external bias. The detector operation speed has been analyzed under different bias conditions. Under zero bias, the  -3 dB cut-off frequency is ~200 Hz for small light modulations. The current generation was modeled using non-equilibrium Green function formalism, which evidenced the importance of phonon scattering for carrier extraction from the quantum wells.

  9. Learning to File: Reconfiguring Information and Information Work in the Early Twentieth Century.

    PubMed

    Robertson, Craig

    2017-01-01

    This article uses textbooks and advertisements to explore the formal and informal ways in which people were introduced to vertical filing in the early twentieth century. Through the privileging of "system" an ideal mode of paperwork emerged in which a clerk could "grasp" information simply by hand without having to understand or comprehend its content. A file clerk's hands and fingers became central to the representation and teaching of filing. In this way, filing offered an example of a distinctly modern form of information work. Filing textbooks sought to enhance dexterity as the rapid handling of paper came to represent information as something that existed in discrete units, in bits that could be easily extracted. Advertisements represented this mode of information work in its ideal form when they frequently erased the worker or reduced him or her to hands, as "instant" filing became "automatic" filing, with the filing cabinet presented as a machine.

  10. Experiences Using Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1996-01-01

    This paper describes three cases studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  11. Comparison of the iterated equation of motion approach and the density matrix formalism for the quantum Rabi model

    NASA Astrophysics Data System (ADS)

    Kalthoff, Mona; Keim, Frederik; Krull, Holger; Uhrig, Götz S.

    2017-05-01

    The density matrix formalism and the equation of motion approach are two semi-analytical methods that can be used to compute the non-equilibrium dynamics of correlated systems. While for a bilinear Hamiltonian both formalisms yield the exact result, for any non-bilinear Hamiltonian a truncation is necessary. Due to the fact that the commonly used truncation schemes differ for these two methods, the accuracy of the obtained results depends significantly on the chosen approach. In this paper, both formalisms are applied to the quantum Rabi model. This allows us to compare the approximate results and the exact dynamics of the system and enables us to discuss the accuracy of the approximations as well as the advantages and the disadvantages of both methods. It is shown to which extent the results fulfill physical requirements for the observables and which properties of the methods lead to unphysical results.

  12. An elementary tutorial on formal specification and verification using PVS

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1993-01-01

    A tutorial on the development of a formal specification and its verification using the Prototype Verification System (PVS) is presented. The tutorial presents the formal specification and verification techniques by way of specific example - an airline reservation system. The airline reservation system is modeled as a simple state machine with two basic operations. These operations are shown to preserve a state invariant using the theorem proving capabilities of PVS. The technique of validating a specification via 'putative theorem proving' is also discussed and illustrated in detail. This paper is intended for the novice and assumes only some of the basic concepts of logic. A complete description of user inputs and the PVS output is provided and thus it can be effectively used while one is sitting at a computer terminal.
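
    To make the flavour of the example concrete (a Python sketch with invented operation names, an invariant, and a random check; the actual tutorial specification and proofs are written in the PVS language, not Python):

      # Toy airline-reservation state machine: seats map to the passenger holding them.
      EMPTY = None

      def initial_state(num_seats):
          return {seat: EMPTY for seat in range(num_seats)}

      def make_reservation(state, seat, passenger):
          """Assign a seat only if it is free and the passenger holds no other seat."""
          if state.get(seat) is EMPTY and passenger not in state.values():
              return {**state, seat: passenger}
          return state

      def cancel_reservation(state, seat):
          """Free a seat unconditionally."""
          return {**state, seat: EMPTY} if seat in state else state

      def invariant(state):
          """State invariant: no passenger holds two seats."""
          held = [p for p in state.values() if p is not EMPTY]
          return len(held) == len(set(held))

      # 'Putative theorem proving' in miniature: check the invariant after random operations.
      import random
      random.seed(1)
      s = initial_state(5)
      for _ in range(100):
          seat = random.randrange(5)
          if random.choice(["reserve", "cancel"]) == "reserve":
              s = make_reservation(s, seat, f"p{random.randrange(3)}")
          else:
              s = cancel_reservation(s, seat)
          assert invariant(s), "invariant violated"
      print("invariant held over 100 random operations:", s)

    In PVS the same idea is expressed by proving, once and for all, that each operation preserves the invariant, rather than checking finitely many random runs.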

  13. Designing urban rules from emergent patterns: co-evolving paths of informal and formal urban systems - the case of Portugal

    NASA Astrophysics Data System (ADS)

    Silva, Paulo

    2018-05-01

    In many societies, informality has been a significant part of the construction of the urban fabric, both throughout a city's history and in recent urbanization processes. In the past, informality was at the origin of much of urban planning: urban planning soon adopted the correction of malfunctions in cities as one of its main missions, so the need for formalization, that is, the control of informal processes, became one of the main reasons for its emergence. As an answer to informal individual solutions, urban planning responded with standardized rules and the urge to create spaces fitting into pre-established rules instead of rules fitting into spaces. Urban planning as a discipline has gradually changed its path. The contrast between urbanization promoted under formal urban planning and informal urbanization is only one sign of the mismatch between urban planning actions and informal urbanization dynamics. Considering this tension between formal and informal dynamics, in some cases planning rules and planning processes continue to ignore informal dynamics; in other cases, planning rules are designed to integrate informality "without losing its face" through "planning games" [1]; and there is a third and less explored way, in which planning systems interact with informality and, from that interaction, learn how to improve (we consider it a process of enrichment) planning rules while promoting an upgrade of informal interventions [2]. This latter win-win situation, in which both informal and formal systems benefit from their interaction, is still rare: most of the time either only one side benefits or neither does. Nevertheless, there are signs that this interaction can produce co-dependent adaptation with positive outcomes for the urban system, in which co-evolutionary dynamics can be traced. We propose to look at the way building rules have been designed in Europe in a context considered successful in dealing with informality: Portugal. The country experienced a wave of informality associated with illegal urbanization in its main urban areas from the 1960s onwards. The process of interaction between informal and formal urban systems proved to be a success in statistical terms: slum clearance reduced informal occupations to almost zero, and informal settlements involving land tenure have been dealt with in the last two decades with a considerably positive impact on the urban fabric. Based on this, in this paper we evaluate how informal and formal systems have influenced each other and changed, over time, the shape of building and planning rules. For this we look at the planning tools created over the last forty years to formalize informal settlements in the Lisbon Metropolitan Area, to see how urban and building rules were adapted to respond to the specific needs of informal settlements, how this adaptation moved from temporary and exceptional rules to permanent ones, and, finally, how these new rules were able to "contaminate" the general planning and building codes. We intend these findings to contribute to a "healthier" relation between formal and informal urban systems, one of learning from each other rather than ignoring or controlling each other. In this way planning systems become more responsive, while informal occupations can be upgraded, rather than destroyed, with the contribution of the planning systems.

  14. User Interface Technology for Formal Specification Development

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    Formal specification development and modification are an essential component of the knowledge-based software life cycle. User interface technology is needed to empower end-users to create their own formal specifications. This paper describes the advanced user interface for AMPHION, a knowledge-based software engineering system that targets scientific subroutine libraries. AMPHION is a generic, domain-independent architecture that is specialized to an application domain through a declarative domain theory. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that provides semantic guidance in creating diagrams denoting formal specifications in an application domain. The diagrams also serve to document the specifications. Automatic deductive program synthesis ensures that end-user specifications are correctly implemented. The tables that drive AMPHION's user interface are automatically compiled from a domain theory; portions of the interface can be customized by the end-user. The user interface facilitates formal specification development by hiding syntactic details, such as logical notation. It also turns some of the barriers for end-user specification development associated with strongly typed formal languages into active sources of guidance, without restricting advanced users. The interface is especially suited for specification modification. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development.

  15. Interpreter composition issues in the formal verification of a processor-memory module

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Cohen, Gerald C.

    1994-01-01

    This report describes interpreter composition techniques suitable for the formal specification and verification of a processor-memory module using the HOL theorem proving system. The processor-memory module is a multichip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. Modeling and verification methods were developed that permit provably secure composition at the transaction-level of specification, significantly reducing the complexity of the hierarchical verification of the system.

  16. Abstract Model of the SATS Concept of Operations: Initial Results and Recommendations

    NASA Technical Reports Server (NTRS)

    Dowek, Gilles; Munoz, Cesar; Carreno, Victor A.

    2004-01-01

    An abstract mathematical model of the concept of operations for the Small Aircraft Transportation System (SATS) is presented. The Concept of Operations consists of several procedures that describe nominal operations for SATS. Several safety properties of the system are proven using formal techniques. The final goal of the verification effort is to show that under nominal operations, aircraft are safely separated. The abstract model was written and formally verified in the Prototype Verification System (PVS).

  17. Advanced orbiting systems test-bedding and protocol verification

    NASA Technical Reports Server (NTRS)

    Noles, James; De Gree, Melvin

    1989-01-01

    The Consultative Committee for Space Data Systems (CCSDS) has begun the development of a set of protocol recommendations for Advanced Orbiting Systems (AOS). The AOS validation program and formal definition of AOS protocols are reviewed, and the configuration control of the AOS formal specifications is summarized. Independent implementations of the AOS protocols by NASA and ESA are discussed, and cross-support/interoperability tests which will allow the space agencies of various countries to share AOS communication facilities are addressed.

  18. Electromagnetic δ -function sphere

    NASA Astrophysics Data System (ADS)

    Parashar, Prachi; Milton, Kimball A.; Shajesh, K. V.; Brevik, Iver

    2017-10-01

    We develop a formalism to extend our previous work on the electromagnetic δ -function plates to a spherical surface. The electric (λe) and magnetic (λg) couplings to the surface are through δ -function potentials defining the dielectric permittivity and the diamagnetic permeability, with two anisotropic coupling tensors. The formalism incorporates dispersion. The electromagnetic Green's dyadic breaks up into transverse electric and transverse magnetic parts. We derive the Casimir interaction energy between two concentric δ -function spheres in this formalism and show that it has the correct asymptotic flat-plate limit. We systematically derive expressions for the Casimir self-energy and the total stress on a spherical shell using a δ -function potential, properly regulated by temporal and spatial point splitting, which are different from the conventional temporal point splitting. In the strong-coupling limit, we recover the usual result for the perfectly conducting spherical shell but in addition there is an integrated curvature-squared divergent contribution. For finite coupling, there are additional divergent contributions; in particular, there is a familiar logarithmic divergence occurring in the third order of the uniform asymptotic expansion that renders it impossible to extract a unique finite energy except in the case of an isorefractive sphere, which translates into λg=-λe.

  19. Impact of Ambient Humidity on Child Health: A Systematic Review

    PubMed Central

    Gao, Jinghong; Sun, Yunzong; Lu, Yaogui; Li, Liping

    2014-01-01

    Background and Objectives Changes in relative humidity, along with other meteorological factors, accompany ongoing climate change and play a significant role in weather-related health outcomes, particularly among children. The purpose of this review is to improve our understanding of the relationship between ambient humidity and child health, and to propose directions for future research. Methods A comprehensive search of electronic databases (PubMed, Medline, Web of Science, ScienceDirect, OvidSP and EBSCO host) and review of reference lists, to supplement relevant studies, were conducted in March 2013. All identified records were selected based on explicit inclusion criteria. We extracted data from the included studies using a pre-designed data extraction form, and then performed a quality assessment. Various heterogeneities precluded a formal quantitative meta-analysis, therefore, evidence was compiled using descriptive summaries. Results Out of a total of 3797 identified records, 37 papers were selected for inclusion in this review. Among the 37 studies, 35% were focused on allergic diseases and 32% on respiratory system diseases. Quality assessment revealed 78% of the studies had reporting quality scores above 70%, and all findings demonstrated that ambient humidity generally plays an important role in the incidence and prevalence of climate-sensitive diseases among children. Conclusions With climate change, there is a significant impact of ambient humidity on child health, especially for climate-sensitive infectious diseases, diarrhoeal diseases, respiratory system diseases, and pediatric allergic diseases. However, some inconsistencies in the direction and magnitude of the effects are observed. PMID:25503413

  20. A Comparative Framework for Studying the Histories of the Humanities and Science.

    PubMed

    Bod, Rens

    2015-06-01

    While the humanities and the sciences have a closely connected history, there are no general histories that bring the two fields together on an equal footing. This paper argues that there is a level at which some humanistic and scientific disciplines can be brought under a common denominator and compared. This is at the level of underlying methods, especially at the level of formalisms and rule systems used by different disciplines. The essay formally compares linguistics and computer science by noting that the same grammar formalism was used in the 1950s for describing both human and programming languages. Additionally, it examines the influence of philology on molecular biology, and vice versa, by recognizing that the tree formalism and rule system used for text reconstruction was also employed in DNA genetics. It also shows that rule systems for source criticism in history are used in forensic science, evidence-based medicine, and jurisprudence. This paper thus opens up a new comparative approach within which the histories of the humanities and the sciences can be examined on a common level.
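
    To illustrate the shared-formalism point (a toy sketch only; the two context-free grammars and the generator below are invented for illustration, not taken from the paper or from any historical grammar):

      import random

      # The same context-free grammar formalism, written once, describes both a
      # natural-language fragment and a programming-language-style fragment.
      english = {
          "S": [["NP", "VP"]],
          "NP": [["the", "N"]],
          "N": [["linguist"], ["compiler"]],
          "VP": [["V", "NP"]],
          "V": [["describes"], ["parses"]],
      }
      arithmetic = {
          "E": [["E", "+", "T"], ["T"]],
          "T": [["T", "*", "F"], ["F"]],
          "F": [["(", "E", ")"], ["x"], ["1"]],
      }

      def generate(grammar, symbol, depth=0):
          """Randomly expand a symbol; symbols without productions are terminals."""
          if symbol not in grammar:
              return [symbol]
          # Force the shortest production at large depths to keep expansions finite.
          options = grammar[symbol] if depth < 4 else [min(grammar[symbol], key=len)]
          production = random.choice(options)
          return [tok for s in production for tok in generate(grammar, s, depth + 1)]

      random.seed(0)
      print(" ".join(generate(english, "S")))
      print(" ".join(generate(arithmetic, "E")))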

  1. Momentum distributions for ²H(e,e'p)

    DOE PAGES

    Ford, William P.; Jeschonnek, Sabine; Van Orden, J. W.

    2014-12-29

    [Background] A primary goal of deuteron electrodisintegration is the possibility of extracting the deuteron momentum distribution. This extraction is inherently fraught with difficulty, as the momentum distribution is not an observable and the extraction relies on theoretical models that in turn depend on other models as input. [Purpose] We present a new method for extracting the momentum distribution which takes into account a wide variety of model inputs, thus providing a theoretical uncertainty due to the various model constituents. [Method] The calculations presented here use a Bethe-Salpeter-like formalism with a wide variety of bound state wave functions, form factors, and final state interactions. We present a method to extract the momentum distributions from experimental cross sections which takes into account the theoretical uncertainty from the various model constituents entering the calculation. [Results] In order to test the extraction, pseudo-data were generated, and the extracted "experimental" distribution, which has theoretical uncertainty from the various model inputs, was compared with the theoretical distribution used to generate the pseudo-data. [Conclusions] In the examples we compared, the original distribution was typically within the error band of the extracted distribution. The input wave functions do contain some outliers, which are discussed in the text, but at least this process can provide an upper bound on the deuteron momentum distribution. Due to the reliance on the theoretical calculation to obtain this quantity, any extraction method should account for the theoretical error inherent in these calculations due to model inputs.
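
    One common way such an extraction is organized (a schematic relation stated only to fix ideas; the paper's actual procedure additionally propagates the spread over model inputs as a theoretical uncertainty band) is to scale a theoretical momentum distribution by the ratio of measured to calculated cross sections,

      \[ n_{\mathrm{exp}}(p) \simeq n_{\mathrm{th}}(p)\,\frac{\sigma_{\mathrm{exp}}(p)}{\sigma_{\mathrm{th}}(p)}, \]

    so the model dependence enters through $n_{\mathrm{th}}$ and $\sigma_{\mathrm{th}}$, which is why varying the wave functions, form factors, and final state interactions translates into an uncertainty on $n_{\mathrm{exp}}$.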

  2. Summary of the DOD Process for Developing Ouantitative Munitions Requirements

    DTIC Science & Technology

    2000-02-24

    Information extracted from the overall classified (Secret) documents is, by itself, unclassified. The remaining content of the record is fragmentary table material from Figure 3-1 (Combat Munitions Data Format): projected kits and projected consumption totals by munition type, in the context of the Non-Nuclear Consumables Annual Analysis.

  3. Terminal Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Meyers, Ronald

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED) - a special type of 'man-made' system serving specific purposes of information processing. The main objective of this work is to demonstrate that the mathematical formalism for DED can be based upon a terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.

  4. Can Schools Be Autonomous in a Centralised Educational System?: On Formal and Actual School Autonomy in the Italian Context

    ERIC Educational Resources Information Center

    Agasisti, Tommaso; Catalano, Giuseppe; Sibiano, Piergiacomo

    2013-01-01

    Purpose: The purpose of this paper is to examine the difference between formal and real school autonomy in the Italian educational system. The Italian case is characterised by low levels of school autonomy. It is interesting to consider whether heterogeneity of patterns is possible in this context. A description of this heterogeneity is provided…

  5. Approaches to formalization of the informal waste sector into municipal solid waste management systems in low- and middle-income countries: Review of barriers and success factors.

    PubMed

    Aparcana, Sandra

    2017-03-01

    The Municipal Solid Waste Management (MSWM) sector represents a major challenge for low- and middle-income countries due to significant environmental and socioeconomic issues involving rapid urbanization, their MSWM systems, and the existence of the informal waste sector. Recognizing its role, several countries have implemented various formalization measures, aiming to address the social problems linked to this sector. However, regardless of these initiatives, not all attempts at formalization have proved successful due to the existence of barriers preventing their implementation in the long term. Along with this, there is a frequent lack of knowledge or understanding regarding these barriers and the kind of measures that may enable formalization, thereby attaining a win-win situation for all the stakeholders involved. In this context, policy- and decision-makers in the public and private sectors are frequently confronted with the dilemma of finding workable approaches to formalization, adjusted to their particular MSWM contexts. Building on the review of frequently implemented approaches to formalization, including an analysis of the barriers to and enabling measures for formalization, this paper aims to address this gap by explaining to policy- and decision-makers, and to waste managers in the private sector, certain dynamics that can be observed and that should be taken into account when designing formalization strategies that are adapted to their particular socioeconomic and political-institutional context. This includes possible links between formalization approaches and barriers, the kinds of barriers that need to be removed, and enabling measures leading to successful formalization in the long term. This paper involved a literature review of common approaches to formalization, which were classified into three categories: (1) informal waste workers organized in associations or cooperatives; (2) organized in CBOs or MSEs; and (3) contracted as individual workers by the formal waste sector. This was followed by the identification and subsequent classification of measures for removing common barriers to formalization into five categories: policy/legal, institutional/organizational, technical, social, and economic/financial. The approaches to formalization, as well as the barrier categories, were validated through the assessment of twenty case studies of formalization. Building on the assessment, the paper discussed possible links between formalization approaches and barriers, the 'persistent' challenges that represent barriers to formalization, as well as key enabling factors improving the likelihood of successful formalization. Regardless of the type of approach adopted to formalization, the review identifies measures to remove barriers in all five categories, with a stronger link between approaches 1 and 2 and the existence of measures in the policy, institutional, and financial categories. Regarding persistent barriers, the review identified ones arising from the absence of measures to address a particular issue before formalization or due to specific country- or sector-related conditions, and their interaction with the MSWM context. 75% of the case studies had persistent barriers in respect of policy/legal issues, 50% of institutional/organizational, 45% of financial/economic, and 40% and 35% of social and technical issues, respectively.
This paper concludes that independently of the formalization approach, the lack of interventions or measures in any of the five categories of barriers may lead formalization initiatives to fail, as unaddressed barriers become 'persistent' after formalization is implemented. Furthermore, 'persistent barriers' may also appear due to unfavorable country-specific conditions. The success of a formalization initiative does not depend on a specific approach, but most likely on the inclusion of country-appropriate measures at the policy, economic and institutional levels. The empowerment of informal waste-workers is again confirmed as a further key success factor for their formalization. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Causal tapestries for psychology and physics.

    PubMed

    Sulis, William H

    2012-04-01

    Archetypal dynamics is a formal approach to the modeling of information flow in complex systems used to study emergence. It is grounded in the Fundamental Triad of realisation (system), interpretation (archetype) and representation (formal model). Tapestries play a fundamental role in the framework of archetypal dynamics as a formal representational system. They represent information flow by means of multi-layered, recursive, interlinked graphical structures that express both geometry (form or sign) and logic (semantics). This paper presents a detailed mathematical description of a specific tapestry model, the causal tapestry, selected for use in describing behaving systems such as appear in psychology and physics from the standpoint of Process Theory. Causal tapestries express an explicit Lorentz-invariant transient 'now' generated by means of a reality game. Observables are represented by tapestry informons while subjective or hidden components (for example intellectual and emotional processes) are incorporated into the reality game that determines the tapestry dynamics. As a specific example, we formulate a random graphical dynamical system using causal tapestries.

  7. The ADVANCE project : formal evaluation of the targeted deployment. Volume 2

    DOT National Transportation Integrated Search

    1997-01-01

    This document reports on the formal evaluation of the targeted (limited but highly focused) deployment of the Advanced Driver and Vehicle Advisory Navigation ConcEpt (ADVANCE), an in-vehicle advanced traveler information system designed to provide sh...

  8. Innovation in Library Education: Historical X-Files on Technology, People, and Change.

    ERIC Educational Resources Information Center

    Carmichael, James V., Jr.

    1998-01-01

    Discusses the history of library education and library educators. Highlights include Melvil Dewey's proposal for formal library education, the earlier apprentice system, obstacles to formal education, changes in attitudes toward patrons, accreditation, standards, and technological changes. (LRW)

  9. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software specification and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools is discussed.

  10. Formal Logic and Flowchart for Diagnosis Validity Verification and Inclusion in Clinical Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Sosa, M.; Grundel, L.; Simini, F.

    2016-04-01

    Logical reasoning has been part of medical practice since its origins. Modern Medicine has included information-intensive tools to refine diagnostics and treatment protocols. We are introducing formal logic teaching in Medical School prior to Clinical Internship, to foster medical practice. Two simple examples (Acute Myocardial Infarction and Diabetes Mellitus) are given in terms of formal logic expressions and truth tables. Flowcharts of both diagnostic processes help to understand the procedures and to validate them logically. The particularity of medical information is that it is often accompanied by "missing data", which suggests adapting formal logic to a "three-state" logic in the future. Medical Education must include formal logic to understand complex protocols and best practices, prone to mutual interactions.
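    The "three-state" adaptation suggested above can be sketched with a small Kleene-style evaluation in which every input may be true, false, or unknown. The diagnostic rule and the clinical inputs below are invented for the example and are not taken from the cited teaching material.

```python
from itertools import product

# Kleene-style three-valued logic: True, False, or None ("missing data").
def and3(a, b):
    if a is False or b is False:
        return False
    if a is None or b is None:
        return None
    return True

def or3(a, b):
    if a is True or b is True:
        return True
    if a is None or b is None:
        return None
    return False

# Toy diagnostic rule (hypothetical): AMI suspected if chest pain AND
# (ST elevation OR raised troponin). Any input may be missing (None).
def ami_suspected(chest_pain, st_elevation, troponin_raised):
    return and3(chest_pain, or3(st_elevation, troponin_raised))

# Truth table over the three states of each input.
for cp, st, tn in product([True, False, None], repeat=3):
    print(cp, st, tn, "->", ami_suspected(cp, st, tn))
```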

  11. Open systems & non-formal education

    NASA Astrophysics Data System (ADS)

    Wheeler, Gerald F.

    1988-10-01

    Professor Dib created an important structure that can be used to attach the many and various activities that fall in the category of this title. While I plan to use his structure, I will be emphasizing a different component of his spectrum and promoting a different need. Professor Dib suggested a critical need to move our teaching styles away from formal modes to non-formal modes of delivery. I suggest an equally critical need in the area of informal education. And I will propose ways to move us toward the same goal, non-formal activities. I believe we need to find ways to use the many informal education activities that occur almost automatically in our societies to move our potential learners to richer non-formal endeavors. Both needs are real; both activities are valid.

  12. Experiences Using Lightweight Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1997-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  13. Investigating Actuation Force Fight with Asynchronous and Synchronous Redundancy Management Techniques

    NASA Technical Reports Server (NTRS)

    Hall, Brendan; Driscoll, Kevin; Schweiker, Kevin; Dutertre, Bruno

    2013-01-01

    Within distributed fault-tolerant systems the term force-fight is colloquially used to describe the level of command disagreement present at redundant actuation interfaces. This report details an investigation of force-fight using three distributed system case-study architectures. Each case study architecture is abstracted and formally modeled using the Symbolic Analysis Laboratory (SAL) tool chain from the Stanford Research Institute (SRI). We use the formal SAL models to produce k-induction based proofs of a bounded actuation agreement property. We also present a mathematically derived bound of redundant actuation agreement for sine-wave stimulus. The report documents our experiences and lessons learned developing the formal models and the associated proofs.
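    The mathematically derived agreement bound for sine-wave stimulus is not reproduced in the abstract; the sketch below only illustrates the kind of check involved, comparing the worst-case simulated disagreement between two channels that sample a sine command with a fixed skew against the Lipschitz-style bound omega*delta. The stimulus, the skew, and the bound used here are assumptions for illustration, not the report's model.

```python
import numpy as np

# Two redundant channels sample the same sine command, but channel B lags
# channel A by a fixed asynchronous skew delta (seconds).
omega = 2.0 * np.pi * 1.0   # 1 Hz command
delta = 0.004               # 4 ms sampling skew
t = np.linspace(0.0, 2.0, 200_001)

cmd_a = np.sin(omega * t)
cmd_b = np.sin(omega * (t - delta))

worst_force_fight = np.max(np.abs(cmd_a - cmd_b))
analytic_bound = omega * delta   # |sin(w t) - sin(w (t - d))| <= w * d

print(f"simulated worst-case disagreement: {worst_force_fight:.6f}")
print(f"analytic bound (omega * delta):    {analytic_bound:.6f}")
assert worst_force_fight <= analytic_bound + 1e-9
```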

  14. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While the systems engineering process is a formal program management technique and is contractually binding, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system- and component-dependent. Informal reviews include technical information meetings and concurrent engineering sessions, while formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting systems and discipline analyses and integrations, and illustrates the process application in experienced aerostructural designs.

  15. Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems

    NASA Technical Reports Server (NTRS)

    Bujorianu, Marius C.; Bujorianu, Manuela L.

    2009-01-01

    In this paper, we sketch a framework for interdisciplinary modeling of space systems by proposing a holistic view. We consider different system dimensions and their interaction. Specifically, we study the interactions between computation, physics, communication, uncertainty and autonomy. The most comprehensive computational paradigm that supports a holistic perspective on autonomous space systems is given by cyber-physical systems. For these, the state of the art consists of collaborative multi-engineering efforts that call for an adequate formal foundation. To achieve this, we propose leveraging the traditional content of formal modeling through a co-engineering process.

  16. Enlisted Personnel Individualized Career System (EPICS) and Conventional Personnel System (CPS): Preliminary Comparison of Training and Ancillary Costs

    DTIC Science & Technology

    1983-04-01

    The enlisted personnel individualized career system (EPICS) program provides an alternative that strives for these advantages. It defers formal school assignment to follow sea duty...enabling the seaman to understand and adjust to the shipboard environment as well as prepare for an optimally-phased, formal, shore-based schools program

  17. Feature extraction across individual time series observations with spikes using wavelet principal component analysis.

    PubMed

    Røislien, Jo; Winje, Brita

    2013-09-20

    Clinical studies frequently include repeated measurements of individuals, often for long periods. We present a methodology for extracting common temporal features across a set of individual time series observations. In particular, the methodology explores extreme observations within the time series, such as spikes, as a possible common temporal phenomenon. Wavelet basis functions are attractive in this sense, as they are localized in both time and frequency domains simultaneously, allowing for localized feature extraction from a time-varying signal. We apply wavelet basis function decomposition of individual time series, with corresponding wavelet shrinkage to remove noise. We then extract common temporal features using linear principal component analysis on the wavelet coefficients, before inverse transformation back to the time domain for clinical interpretation. We demonstrate the methodology on a subset of a large fetal activity study aiming to identify temporal patterns in fetal movement (FM) count data in order to explore formal FM counting as a screening tool for identifying fetal compromise and thus preventing adverse birth outcomes. Copyright © 2013 John Wiley & Sons, Ltd.
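    A compact sketch of the pipeline described above (wavelet decomposition, shrinkage, linear PCA on the coefficients, inverse transform back to the time domain) might look as follows; the wavelet family, threshold rule, and synthetic spike data are illustrative choices, not those of the study.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic "individuals": noisy time series sharing an occasional spike.
n_subjects, n_time = 40, 256
signals = 0.3 * rng.standard_normal((n_subjects, n_time))
signals[:, 120:125] += rng.uniform(1.0, 3.0, size=(n_subjects, 1))  # shared spike

# 1) Wavelet decomposition plus soft-threshold shrinkage per subject.
coeff_rows, slices = [], None
for x in signals:
    coeffs = pywt.wavedec(x, "db4", level=4)
    coeffs = [pywt.threshold(c, value=0.2, mode="soft") for c in coeffs]
    arr, slices = pywt.coeffs_to_array(coeffs)
    coeff_rows.append(arr)
coeff_matrix = np.vstack(coeff_rows)

# 2) Linear PCA on the wavelet coefficients across subjects.
pca = PCA(n_components=3)
pca.fit(coeff_matrix)

# 3) Inverse-transform each principal axis back to the time domain so the
#    common temporal features can be inspected clinically.
for k, axis in enumerate(pca.components_):
    coeffs_k = pywt.array_to_coeffs(axis, slices, output_format="wavedec")
    feature_k = pywt.waverec(coeffs_k, "db4")
    print(f"component {k}: peak location near t = {np.argmax(np.abs(feature_k))}")
```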

  18. Relativistic, model-independent, multichannel 2 → 2 transition amplitudes in a finite volume

    DOE PAGES

    Briceno, Raul A.; Hansen, Maxwell T.

    2016-07-13

    We derive formalism for determining 2 + J → 2 infinite-volume transition amplitudes from finite-volume matrix elements. Specifically, we present a relativistic, model-independent relation between finite-volume matrix elements of external currents and the physically observable infinite-volume matrix elements involving two-particle asymptotic states. The result presented holds for states composed of two scalar bosons. These can be identical or non-identical and, in the latter case, can be either degenerate or non-degenerate. We further accommodate any number of strongly-coupled two-scalar channels. This formalism will, for example, allow future lattice QCD calculations of themore » $$\\rho$$-meson form factor, in which the unstable nature of the $$\\rho$$ is rigorously accommodated. In conclusion, we also discuss how this work will impact future extractions of nuclear parity and hadronic long-range matrix elements from lattice QCD.« less

  19. The evaluation of a formalized queue management system for coronary angiography waiting lists.

    PubMed

    Alter, D A; Newman, Alice M; Cohen, Eric A; Sykora, Kathy; Tu, Jack V

    2005-11-01

    Lengthy waiting lists for coronary angiography have been described in many health care systems worldwide. The extent to which formal queue management systems may improve the prioritization and survival of patients in the angiography queue is unknown. To prospectively evaluate the performance of a formal queue management system for patients awaiting coronary angiography in Ontario. The coronary angiography urgency scale, a formal queue management system developed in 1993 using a modified Delphi panel, allocates recommended maximum waiting times (RMWTs) in accordance with clinical necessity. By using a provincial clinical registry, 35,617 consecutive patients referred into the coronary angiography queue between April 1, 2001, and March 31, 2002, were prospectively tracked. Cox proportional hazards models were used to examine mortality risk across urgency after adjusting for additional clinical and comorbid factors. Good agreement was determined in urgency ratings between scores from the coronary angiography urgency scale and implicit physician judgement, which was obtained independently at the time of the index referral (weighted kappa = 0.49). The overall mortality in the queue was 0.3% (0.47%, 0.26% and 0.13% for urgent, semiurgent and elective patients, respectively). Urgency, as specified by the coronary angiography urgency scale, was the strongest predictor of death in the queue (P<0.001). However, when patients were censored according to their RMWTs, mortality was similar across different levels of urgency. Consequently, up to 18.5 deaths per 10,000 patients could have potentially been averted had patients been triaged and undergone coronary angiography within the RMWT as specified by the coronary angiography urgency scale. The incorporation of the coronary angiography urgency scale as a formal queue management system may decrease mortality in the coronary angiography queue. The authors recommend its implementation in health care systems where patients experience excessive waiting time delays for coronary angiography.
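    For readers unfamiliar with the two statistical tools named above, a minimal sketch of the agreement and survival analyses could look like the following; it assumes a tidy registry data frame whose column names and values are invented for the example, and adjusts only for age rather than the full set of covariates used in the study.

```python
import pandas as pd
from sklearn.metrics import cohen_kappa_score
from lifelines import CoxPHFitter

# Hypothetical registry extract: one row per patient in the angiography queue.
df = pd.DataFrame({
    "urgency_scale": [1, 1, 2, 2, 3, 3, 2, 1, 3, 2],   # 1=urgent .. 3=elective
    "physician_rating": [1, 2, 2, 2, 3, 3, 1, 1, 3, 2],
    "wait_days": [3, 5, 20, 35, 80, 95, 15, 4, 120, 40],
    "died_in_queue": [0, 0, 0, 1, 0, 0, 0, 0, 1, 0],
    "age": [62, 70, 55, 68, 59, 73, 66, 61, 75, 58],
})

# Agreement between the formal urgency scale and implicit physician judgement.
kappa = cohen_kappa_score(df["urgency_scale"], df["physician_rating"],
                          weights="linear")
print(f"weighted kappa = {kappa:.2f}")

# Cox proportional hazards model: mortality risk in the queue across urgency.
cph = CoxPHFitter()
cph.fit(df[["wait_days", "died_in_queue", "urgency_scale", "age"]],
        duration_col="wait_days", event_col="died_in_queue")
cph.print_summary()
```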

  20. Towards Formal Implementation of PUS Standard

    NASA Astrophysics Data System (ADS)

    Ilić, D.

    2009-05-01

    As an effort to promote the reuse of on-board and ground systems ESA developed a standard for packet telemetry and telecommand - PUS. It defines a set of standard service models with the corresponding structures of the associated telemetry and telecommand packets. Various missions then can choose to implement those standard PUS services that best conform to their specific requirements. In this paper we propose a formal development (based on the Event-B method) of reusable service patterns, which can be instantiated for concrete application. Our formal models allow us to formally express and verify specific service properties including various telecommand and telemetry packet structure validation.

  1. Multisymplectic Lagrangian and Hamiltonian Formalisms of Classical Field Theories

    NASA Astrophysics Data System (ADS)

    Román-Roy, Narciso

    2009-11-01

    This review paper is devoted to presenting the standard multisymplectic formulation for describing geometrically classical field theories, both the regular and singular cases. First, the main features of the Lagrangian formalism are revisited and, second, the Hamiltonian formalism is constructed using Hamiltonian sections. In both cases, the variational principles leading to the Euler-Lagrange and the Hamilton-De Donder-Weyl equations, respectively, are stated, and these field equations are given in different but equivalent geometrical ways in each formalism. Finally, both are unified in a new formulation (which has been developed in the last years), following the original ideas of Rusk and Skinner for mechanical systems.
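    For orientation, the Hamilton-De Donder-Weyl field equations mentioned above are commonly written, for fields $\phi^a$, polymomenta $p^\mu{}_a$, and covariant Hamiltonian $H$, as

$$ \partial_\mu \phi^a = \frac{\partial H}{\partial p^\mu{}_a}, \qquad \partial_\mu p^\mu{}_a = -\frac{\partial H}{\partial \phi^a}, $$

    which reduce to Hamilton's equations of mechanics when the spacetime index runs only over time. This is the standard textbook form, quoted here for context rather than taken from the review itself.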

  2. Fuzzylot: a novel self-organising fuzzy-neural rule-based pilot system for automated vehicles.

    PubMed

    Pasquier, M; Quek, C; Toh, M

    2001-10-01

    This paper presents part of our research work concerned with the realisation of an Intelligent Vehicle and the technologies required for its routing, navigation, and control. An automated driver prototype has been developed using a self-organising fuzzy rule-based system (POPFNN-CRI(S)) to model and subsequently emulate human driving expertise. The ability of fuzzy logic to represent vague information using linguistic variables makes it a powerful tool to develop rule-based control systems when an exact working model is not available, as is the case of any vehicle-driving task. Designing a fuzzy system, however, is a complex endeavour, due to the need to define the variables and their associated fuzzy sets, and determine a suitable rule base. Many efforts have thus been devoted to automating this process, yielding the development of learning and optimisation techniques. One of them is the family of POP-FNNs, or Pseudo-Outer Product Fuzzy Neural Networks (TVR, AARS(S), AARS(NS), CRI, Yager). These generic self-organising neural networks developed at the Intelligent Systems Laboratory (ISL/NTU) are based on formal fuzzy mathematical theory and are able to objectively extract a fuzzy rule base from training data. In this application, a driving simulator has been developed that integrates a detailed model of the car dynamics, complete with engine characteristics and environmental parameters, and an OpenGL-based 3D-simulation interface coupled with driving wheel and accelerator/brake pedals. The simulator has been used on various road scenarios to record, from a human pilot, driving data consisting of steering and speed control actions associated with road features. Specifically, the POPFNN-CRI(S) system is used to cluster the data and extract a fuzzy rule base modelling the human driving behaviour. Finally, the effectiveness of the generated rule base has been validated using the simulator in autopilot mode.
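    The kind of fuzzy rule base such a system extracts can be illustrated with a toy sketch using triangular memberships and centre-of-gravity defuzzification; the variables, fuzzy sets, and rules below are invented for the example and are not the POPFNN-CRI(S) rules learned in the study.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def steering_command(lateral_offset_m):
    """Rules: offset LEFT -> steer RIGHT, CENTRE -> STRAIGHT, RIGHT -> steer LEFT."""
    mu_left   = tri(lateral_offset_m, -2.0, -1.0, 0.0)
    mu_centre = tri(lateral_offset_m, -0.5,  0.0, 0.5)
    mu_right  = tri(lateral_offset_m,  0.0,  1.0, 2.0)

    # Consequent singletons (steering wheel angles in degrees) combined by
    # centre-of-gravity defuzzification over the fired rules.
    rules = [(mu_left, +20.0), (mu_centre, 0.0), (mu_right, -20.0)]
    total = sum(mu for mu, _ in rules)
    return sum(mu * angle for mu, angle in rules) / total if total else 0.0

for offset in (-1.2, -0.3, 0.0, 0.4, 1.5):
    print(f"offset {offset:+.1f} m -> steer {steering_command(offset):+.1f} deg")
```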

  3. Modelling of Indoor Environments Using Lindenmayer Systems

    NASA Astrophysics Data System (ADS)

    Peter, M.

    2017-09-01

    Documentation of the "as-built" state of building interiors has gained a lot of interest in recent years. Various data acquisition methods exist, e.g. the extraction from photographed evacuation plans using image processing or, most prominently, indoor mobile laser scanning. Due to clutter or data gaps as well as errors during data acquisition and processing, automatic reconstruction of CAD/BIM-like models from these data sources is not a trivial task. Thus reconstruction is often supported by general rules for perpendicularity and parallelism, which are predominant in man-made structures. Indoor environments of large, public buildings, however, often also follow higher-level rules like symmetry and repetition of e.g. room sizes and corridor widths. In the context of reconstruction of city elements (e.g. street networks) or building elements (e.g. façade layouts), formal grammars have been put to use. In this paper, we describe the use of Lindenmayer systems - which were originally developed for the computer-based modelling of plant growth - to model and reproduce the layout of indoor environments in 2D.
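    The string-rewriting mechanism behind Lindenmayer systems is easy to illustrate. The rule set below, suggestive of a corridor that repeatedly sprouts rooms, is a made-up example rather than a grammar from the paper.

```python
def lsystem(axiom, rules, iterations):
    """Apply parallel rewriting rules to every symbol, `iterations` times."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Hypothetical indoor grammar: C = corridor segment, R = room, [] = branch,
# + / - = turn left / right. Each corridor segment grows and sprouts rooms.
rules = {"C": "C[+R]C[-R]"}
layout = lsystem("C", rules, iterations=3)
print(layout)
# The resulting string would normally be handed to a turtle-graphics style
# interpreter that lays out corridor axes and room footprints in 2D.
```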

  4. A three-dimensional virtual environment for modeling mechanical cardiopulmonary interactions.

    PubMed

    Kaye, J M; Primiano, F P; Metaxas, D N

    1998-06-01

    We have developed a real-time computer system for modeling mechanical physiological behavior in an interactive, 3-D virtual environment. Such an environment can be used to facilitate exploration of cardiopulmonary physiology, particularly in situations that are difficult to reproduce clinically. We integrate 3-D deformable body dynamics with new, formal models of (scalar) cardiorespiratory physiology, associating the scalar physiological variables and parameters with the corresponding 3-D anatomy. Our framework enables us to drive a high-dimensional system (the 3-D anatomical models) from one with fewer parameters (the scalar physiological models) because of the nature of the domain and our intended application. Our approach is amenable to modeling patient-specific circumstances in two ways. First, using CT scan data, we apply semi-automatic methods for extracting and reconstructing the anatomy to use in our simulations. Second, our scalar physiological models are defined in terms of clinically measurable, patient-specific parameters. This paper describes our approach, problems we have encountered and a sample of results showing normal breathing and acute effects of pneumothoraces.

  5. Tools reference manual for a Requirements Specification Language (RSL), version 2.0

    NASA Technical Reports Server (NTRS)

    Fisher, Gene L.; Cohen, Gerald C.

    1993-01-01

    This report describes a general-purpose Requirements Specification Language, RSL. The purpose of RSL is to specify precisely the external structure of a mechanized system and to define requirements that the system must meet. A system can be comprised of a mixture of hardware, software, and human processing elements. RSL is a hybrid of features found in several popular requirements specification languages, such as SADT (Structured Analysis and Design Technique), PSL (Problem Statement Language), and RMF (Requirements Modeling Framework). While languages such as these have useful features for structuring a specification, they generally lack formality. To overcome the deficiencies of informal requirements languages, RSL has constructs for formal mathematical specification. These constructs are similar to those found in formal specification languages such as EHDM (Enhanced Hierarchical Development Methodology), Larch, and OBJ3.

  6. Formal analysis of imprecise system requirements with Event-B.

    PubMed

    Le, Hong Anh; Nakajima, Shin; Truong, Ninh Thuan

    2016-01-01

    Formal analysis of functional properties of system requirements needs precise descriptions. However, the stakeholders sometimes describe the system with ambiguous, vague or fuzzy terms, hence formal frameworks for modeling and verifying such requirements are desirable. The Fuzzy If-Then rules have been used for imprecise requirements representation, but verifying their functional properties still needs new methods. In this paper, we propose a refinement-based modeling approach for specification and verification of such requirements. First, we introduce a representation of imprecise requirements in the set theory. Then we make use of Event-B refinement providing a set of translation rules from Fuzzy If-Then rules to Event-B notations. After that, we show how to verify both safety and eventuality properties with RODIN/Event-B. Finally, we illustrate the proposed method on the example of Crane Controller.

  7. A randomized control trial comparing the visual and verbal communication methods for reducing fear and anxiety during tooth extraction.

    PubMed

    Gazal, Giath; Tola, Ahmed W; Fareed, Wamiq M; Alnazzawi, Ahmad A; Zafar, Muhammad S

    2016-04-01

    To evaluate the value of using visual information for reducing the level of dental fear and anxiety in patients undergoing tooth extraction under local anesthesia (LA). A total of 64 patients were randomly allocated to one of the study groups after reading the information sheet and signing the formal consent. If the patient was in the control group, only verbal information and routine warnings were provided. If the patient was in the study group, a tooth extraction video was shown. The level of dental fear and anxiety was rated by the patients on standard 100 mm visual analog scales (VAS), anchored at "no dental fear and anxiety" (0 mm) and "severe dental fear and anxiety" (100 mm). Evaluation of dental fear and anxiety was made pre-operatively, following the visual/verbal information, and post-extraction. There was a significant difference between the mean dental fear and anxiety scores of the two groups post-extraction (p-value < 0.05). Patients in the tooth extraction video group were more comfortable after dental extraction than those in the verbal information and routine warning group. For the tooth extraction video group there were significant decreases in dental fear and anxiety scores between the pre-operative scores and either the post-video-information or the postoperative scores (p-values < 0.05). Younger patients recorded higher dental fear and anxiety scores than older ones (P < 0.05). Dental fear and anxiety associated with dental extractions under local anesthesia can be reduced by showing a tooth extraction video to the patients preoperatively.

  8. Towards the formal specification of the requirements and design of a processor interface unit

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    Work to formally specify the requirements and design of a Processor Interface Unit (PIU), a single-chip subsystem providing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system, is described. This system, the Fault-Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. The approaches that were developed for modeling the PIU requirements and for composition of the PIU subcomponents at high levels of abstraction are described. These approaches were used to specify and verify a nontrivial subset of the PIU behavior. The PIU specification in Higher Order Logic (HOL) is documented in a companion NASA contractor report entitled 'Towards the Formal Specification of the Requirements and Design of a Processor Interface Unit - HOL Listings.' The subsequent verification approach and HOL listings are documented in the NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit' and the NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings.'

  9. 49 CFR 236.923 - Task analysis and basic requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... classroom, simulator, computer-based, hands-on, or other formally structured training and testing, except... for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements...) Based on a formal task analysis, identify the installation, maintenance, repair, modification...

  10. Transforming Aggregate Object-Oriented Formal Specifications to Code

    DTIC Science & Technology

    1999-03-01

    integration issues associated with a formal-based software transformation system, such as the source specification, the problem space architecture, design architecture... design transforms, and target software transforms. Software is critical in today's Air Force, yet its specification, design, and development

  11. 41 CFR 109-1.5204 - Review and approval of a designated contractor's personal property management system.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... overhaul; and (2) An analysis of the cost to implement the overhaul within a year versus a proposed... be based on a formal comprehensive appraisal or a series of formal appraisals of the functional...

  12. Discrete mathematics, formal methods, the Z schema and the software life cycle

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    The proper role and scope for the use of discrete mathematics and formal methods in support of engineering the security and integrity of components within deployed computer systems are discussed. It is proposed that the Z schema can be used as the specification language to capture the precise definition of system and component interfaces. This can be accomplished with an object oriented development paradigm.
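    As a generic illustration (not drawn from the cited work) of the kind of interface definition meant here, a small Z schema for a bounded message buffer can be written in horizontal form as

$$ \mathit{Buffer} \;\widehat{=}\; [\, \mathit{items} : \mathrm{seq}\,\mathbb{N};\ \mathit{cap} : \mathbb{N} \mid \#\mathit{items} \le \mathit{cap} \,] $$

    where the declaration part fixes the component's observable state and the predicate part records an invariant that every operation offered through the interface must preserve.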

  13. A comparison between state-specific and linear-response formalisms for the calculation of vertical electronic transition energy in solution with the CCSD-PCM method.

    PubMed

    Caricato, Marco

    2013-07-28

    The calculation of vertical electronic transition energies of molecular systems in solution with accurate quantum mechanical methods requires the use of approximate and yet reliable models to describe the effect of the solvent on the electronic structure of the solute. The polarizable continuum model (PCM) of solvation represents a computationally efficient way to describe this effect, especially when combined with coupled cluster (CC) methods. Two formalisms are available to compute transition energies within the PCM framework: State-Specific (SS) and Linear-Response (LR). The former provides a more complete account of the solute-solvent polarization in the excited states, while the latter is computationally very efficient (i.e., comparable to gas phase) and transition properties are well defined. In this work, I review the theory for the two formalisms within CC theory with a focus on their computational requirements, and present the first implementation of the LR-PCM formalism with the coupled cluster singles and doubles method (CCSD). Transition energies computed with LR- and SS-CCSD-PCM are presented, as well as a comparison between solvation models in the LR approach. The numerical results show that the two formalisms provide different absolute values of transition energy, but similar relative solvatochromic shifts (from nonpolar to polar solvents). The LR formalism may then be used to explore the solvent effect on multiple states and evaluate transition probabilities, while the SS formalism may be used to refine the description of specific states and for the exploration of excited state potential energy surfaces of solvated systems.

  14. An evaluation of the current state of genomic data privacy protection technology and a roadmap for the future.

    PubMed

    Malin, Bradley A

    2005-01-01

    The incorporation of genomic data into personal medical records poses many challenges to patient privacy. In response, various systems for preserving patient privacy in shared genomic data have been developed and deployed. Although these systems de-identify the data by removing explicit identifiers (e.g., name, address, or Social Security number) and incorporate sound security design principles, they suffer from a lack of formal modeling of inferences learnable from shared data. This report evaluates the extent to which current protection systems are capable of withstanding a range of re-identification methods, including genotype-phenotype inferences, location-visit patterns, family structures, and dictionary attacks. For a comparative re-identification analysis, the systems are mapped to a common formalism. Although there is variation in susceptibility, each system is deficient in its protection capacity. The author discovers patterns of protection failure and discusses several of the reasons why these systems are susceptible. The analyses and discussion within provide guideposts for the development of next-generation protection methods amenable to formal proofs.

  15. An Evaluation of the Current State of Genomic Data Privacy Protection Technology and a Roadmap for the Future

    PubMed Central

    Malin, Bradley A.

    2005-01-01

    The incorporation of genomic data into personal medical records poses many challenges to patient privacy. In response, various systems for preserving patient privacy in shared genomic data have been developed and deployed. Although these systems de-identify the data by removing explicit identifiers (e.g., name, address, or Social Security number) and incorporate sound security design principles, they suffer from a lack of formal modeling of inferences learnable from shared data. This report evaluates the extent to which current protection systems are capable of withstanding a range of re-identification methods, including genotype–phenotype inferences, location–visit patterns, family structures, and dictionary attacks. For a comparative re-identification analysis, the systems are mapped to a common formalism. Although there is variation in susceptibility, each system is deficient in its protection capacity. The author discovers patterns of protection failure and discusses several of the reasons why these systems are susceptible. The analyses and discussion within provide guideposts for the development of next-generation protection methods amenable to formal proofs. PMID:15492030

  16. Formal Methods Specification and Verification Guidebook for Software and Computer Systems. Volume 1; Planning and Technology Insertion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.

  17. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  18. Assessing voids in SNOMED CT for pediatric concepts.

    PubMed

    Wade, Geraldine; Gotlieb, Edward M; Weigle, Carl; Warren, Robert

    2008-11-06

    Reference terminologies such as SNOMED CT may have voids in their representation of concepts important to the practice of pediatrics. In this project, relevant pediatric concepts were extracted from an American Academy of Pediatrics guideline and were mapped to SNOMED CT. Concepts were identified that should be included in the standard reference terminology. A process for formally evaluating voids in reference terminologies for concepts needed in pediatric clinical decision-making is planned as a next step.

  19. Early Warning and Outbreak Detection Using Social Networking Websites: The Potential of Twitter

    NASA Astrophysics Data System (ADS)

    de Quincey, Ed; Kostkova, Patty

    Epidemic Intelligence is being used to gather information about potential disease outbreaks from both formal and, increasingly, informal sources. A potential addition to these informal sources is social networking sites such as Facebook and Twitter. In this paper we describe a method for extracting messages, called "tweets", from the Twitter website and the results of a pilot study which collected over 135,000 tweets in a week during the current Swine Flu pandemic.

  20. Learning about systems-based practice in the informal curriculum: a case study in an academic pediatric continuity clinic.

    PubMed

    Balmer, Dorene; Ruzek, Sheryl; Ludwig, Stephen; Giardino, Angelo P

    2007-01-01

    Pediatric residents learn about systems-based practice (SBP) explicitly in the formal curriculum and implicitly in the informal curriculum as they engage in practice alongside physician faculty. Recent studies describe innovative ways to address SBP in the formal curriculum, but the informal curriculum has not been explored. We examined what, and how, third-year pediatric residents learn about SBP in the informal curriculum at one continuity clinic, and considered how this learning aligns with the formal curriculum. A case study involving 10 third-year pediatric residents and 10 continuity preceptors was conducted at one continuity clinic, housed in a community-based, pediatric primary care center. Data were derived from 5 months (100 hours) of direct observation in the precepting room at the case clinic, semistructured interviews with residents (before and after observation) and with preceptors (after observation). Interview transcripts and notes from observation were inductively coded and analyzed for major themes. Two themes emerged in the informal curriculum. Residents perceived "our system," the academic health system in which they trained and practiced, as separate and distinct from the "real system," the larger, societal context of health care. Residents also understood SBP as a commitment to helping individual patients and families navigate the complexities of "our system," dealing with issues that concerned them. Residents learn important lessons about SBP in the informal curriculum in continuity clinic. These lessons may reinforce some elements of the competency-based formal curriculum for SBP, but challenge others.

  1. Applications of finite-size scaling for atomic and non-equilibrium systems

    NASA Astrophysics Data System (ADS)

    Antillon, Edwin A.

    We apply the theory of Finite-size scaling (FSS) to an atomic and a non-equilibrium system in order to extract critical parameters. In atomic systems, we look at the energy dependence on the binding charge near threshold between bound and free states, where we seek the critical nuclear charge for stability. We use different ab initio methods, such as Hartree-Fock, Density Functional Theory, and exact formulations implemented numerically with the finite-element method (FEM). Using Finite-size scaling formalism, where in this case the size of the system is related to the number of elements used in the basis expansion of the wavefunction, we predict critical parameters in the large basis limit. Results prove to be in good agreement with previous Slater-basis set calculations and demonstrate that this combined approach provides a promising first-principles approach to describe quantum phase transitions for materials and extended systems. In the second part we look at a non-equilibrium one-dimensional model known as the raise and peel model, describing a growing surface which grows locally and has non-local desorption. For specific values of adsorption (u_a) and desorption (u_d) the model shows interesting features. At u_a = u_d, the model is described by a conformal field theory (with conformal charge c = 0) and its stationary probability can be mapped to the ground state of a quantum chain and can also be related to a two-dimensional statistical model. For u_a ≥ u_d, the model shows a scale-invariant phase in the avalanche distribution. In this work we study the surface dynamics by looking at avalanche distributions using the FSS formalism and explore the effect of changing the boundary conditions of the model. The model shows the same universality for the cases with and without the wall for an odd number of tiles removed, but we find a new exponent in the presence of a wall for an even number of avalanches released. We provide a new conjecture for the probability distribution of avalanches with a wall, obtained by using exact diagonalization of small lattices and Monte-Carlo simulations.
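    The large-basis extrapolation step used in the atomic part of the work can be sketched schematically: given pseudocritical charges obtained at finite basis sizes N, fit their drift against 1/N and read off the intercept. The numbers and the simple 1/N fit form below are invented for illustration, not results from the thesis.

```python
import numpy as np

# Hypothetical pseudocritical nuclear charges lambda_N from calculations
# with increasingly large finite-element / basis-set sizes N.
N = np.array([10, 20, 40, 80, 160, 320], dtype=float)
lam_N = np.array([0.942, 0.928, 0.920, 0.916, 0.914, 0.913])

# Finite-size-scaling-style extrapolation: fit lambda_N = lambda_c + a / N
# and take the intercept as the infinite-basis critical charge.
slope, lam_c = np.polyfit(1.0 / N, lam_N, deg=1)
print(f"extrapolated critical charge lambda_c ~ {lam_c:.4f} (slope = {slope:.3f})")
```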

  2. Job Requirements and Workers' Learning: Formal Gaps, Informal Closure, Systemic Limits

    ERIC Educational Resources Information Center

    Livingstone, D. W.

    2010-01-01

    There is substantial evidence that formal educational attainments increasingly exceed the educational job requirements of the employed labour force in many advanced market economies--a phenomenon variously termed "underemployment", "underutilisation", or "overqualification". Conversely, both experiential learning and workplace case studies suggest…

  3. Aspects of Financing Non-Formal Education.

    ERIC Educational Resources Information Center

    Morales, Francisco X. Swett

    1983-01-01

    Various financing structures for nonformal education are presented, using examples from Colombia, Brazil, Costa Rica, and Ecuador. Many resources of the formal education system can be used in the planning, coordination, and execution of nonformal education. The importance of community involvement and financial backing is stressed. (JA)

  4. Hamilton-Jacobi formalism for Podolsky's electromagnetic theory on the null-plane

    NASA Astrophysics Data System (ADS)

    Bertin, M. C.; Pimentel, B. M.; Valcárcel, C. E.; Zambrano, G. E. R.

    2017-08-01

    We develop the Hamilton-Jacobi formalism for Podolsky's electromagnetic theory on the null-plane. The main goal is to build the complete set of Hamiltonian generators of the system as well as to study the canonical and gauge transformations of the theory.
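    For context, Podolsky's generalized electrodynamics is usually taken to be the higher-derivative theory with Lagrangian density

$$ \mathcal{L} = -\tfrac{1}{4} F_{\mu\nu} F^{\mu\nu} + \tfrac{a^2}{2}\, \partial_\mu F^{\mu\nu}\, \partial^\lambda F_{\lambda\nu}, $$

    where a is the Podolsky parameter; it is the higher-derivative term that makes the null-plane constraint structure nontrivial. This is the standard form of the theory, quoted here for orientation rather than reproduced from the paper.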

  5. 41 CFR 105-68.835 - Are debarment proceedings formal?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Are debarment proceedings formal? 105-68.835 Section 105-68.835 Public Contracts and Property Management Federal Property Management Regulations System (Continued) GENERAL SERVICES ADMINISTRATION Regional Offices-General Services...

  6. 41 CFR 105-68.740 - Are suspension proceedings formal?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Are suspension proceedings formal? 105-68.740 Section 105-68.740 Public Contracts and Property Management Federal Property Management Regulations System (Continued) GENERAL SERVICES ADMINISTRATION Regional Offices-General Services...

  7. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications.

    PubMed

    d'Acierno, Antonio; Esposito, Massimo; De Pietro, Giuseppe

    2013-01-01

    The diagnosis of many diseases can be often formulated as a decision problem; uncertainty affects these problems so that many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on the top of fuzzy knowledge extracted from data. We carefully refine and formalize our methodology that includes six stages, where the first three stages work with crisp rules, whereas the last three ones are employed on fuzzy models. Its strength relies on its generality and modularity since it supports the integration of alternative techniques in each of its stages. The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is deeply described and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture has been then realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. The results prove the feasibility of the whole methodology implemented in terms of the architecture proposed.

  8. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications

    PubMed Central

    2013-01-01

    Background The diagnosis of many diseases can be often formulated as a decision problem; uncertainty affects these problems so that many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on the top of fuzzy knowledge extracted from data. Methods We carefully refine and formalize our methodology that includes six stages, where the first three stages work with crisp rules, whereas the last three ones are employed on fuzzy models. Its strength relies on its generality and modularity since it supports the integration of alternative techniques in each of its stages. Results The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is deeply described and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture has been then realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. Conclusions The results prove the feasibility of the whole methodology implemented in terms of the architecture proposed. PMID:23368970

  9. Psoriasis, psoriatic arthritis, and rheumatoid arthritis: Is all inflammation the same?

    PubMed

    Coates, Laura C; FitzGerald, Oliver; Helliwell, Philip S; Paul, Carle

    2016-12-01

    To review the pathophysiology, co-morbidities, and therapeutic options for psoriasis, psoriatic arthritis and rheumatoid arthritis in order to further understand the similarities and differences in treatment paradigms in the management of each disease. New targets for individualized therapeutic decisions are also identified with the aim of improving therapeutic outcome and reducing toxicity. Using the PubMed database, we searched literature published from 2000 to 2015 using combinations of the key words "psoriasis," "psoriatic arthritis," "rheumatoid arthritis," "pathogenesis," "immunomodulation," and "treatment." This was a non-systematic review and there were no formal inclusion and exclusion criteria. Abstracts identified in the search were screened for relevance and articles considered appropriate evaluated further. References within these selected articles were also screened. Information was extracted from 198 articles for inclusion in this report. There was no formal data synthesis. Articles were reviewed and summarized according to disease area (psoriasis, psoriatic arthritis, and rheumatoid arthritis). The pathophysiology of psoriasis, psoriatic arthritis, and rheumatoid arthritis involves chronic inflammation mediated by pro-inflammatory cytokines. Dysfunction in integrated signaling pathways affecting different constituents of the immune system result in varying clinical features in the three diseases. Co-morbidities, including cardiovascular disease, malignancies, and non-alcoholic fatty liver disease are increased. Increased understanding of the immunopathogenesis allowed development of targeted treatments; however, despite a variety of potentially predictive genetic, protein and cellular biomarkers, there is still significant unmet need in these three inflammatory disorders. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  10. δ M formalism and anisotropic chaotic inflation power spectrum

    NASA Astrophysics Data System (ADS)

    Talebian-Ashkezari, A.; Ahmadi, N.

    2018-05-01

    A new analytical approach to linear perturbations in anisotropic inflation has been introduced in [A. Talebian-Ashkezari, N. Ahmadi and A.A. Abolhasani, JCAP 03 (2018) 001] under the name of δ M formalism. In this paper we apply the mentioned approach to a model of anisotropic inflation driven by a scalar field, coupled to the kinetic term of a vector field with a U(1) symmetry. The δ M formalism provides an efficient way of computing tensor-tensor, tensor-scalar as well as scalar-scalar 2-point correlations that are needed for the analysis of the observational features of an anisotropic model on the CMB. A comparison between δ M results and the tedious calculations using in-in formalism shows the aptitude of the δ M formalism in calculating accurate two point correlation functions between physical modes of the system.

  11. Using Ontologies to Formalize Services Specifications in Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Breitman, Karin Koogan; Filho, Aluizio Haendchen; Haeusler, Edward Hermann

    2004-01-01

    One key issue in multi-agent systems (MAS) is their ability to interact and exchange information autonomously across applications. To secure agent interoperability, designers must rely on a communication protocol that allows software agents to exchange meaningful information. In this paper we propose using ontologies as such a communication protocol. Ontologies capture the semantics of the operations and services provided by agents, allowing interoperability and information exchange in a MAS. Ontologies are a formal, machine-processable representation that makes it possible to capture the semantics of a domain and to derive meaningful information by way of logical inference. In our proposal we use a formal knowledge representation language (OWL) that translates into Description Logics (a subset of first-order logic), thus eliminating ambiguities and providing a solid base for machine-based inference. The main contribution of this approach is to make the requirements explicit and centralize the specification in a single document (the ontology itself), at the same time providing a formal, unambiguous representation that can be processed by automated inference machines.
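    A small sketch of the idea (not the authors' actual ontology) using rdflib to declare an OWL class for a service and an agent's capability assertion that another agent could query is shown below; the namespace, class, and property names are invented for the example.

```python
from rdflib import Graph, Namespace, Literal, RDF, RDFS
from rdflib.namespace import OWL

MAS = Namespace("http://example.org/mas#")   # hypothetical ontology namespace
g = Graph()
g.bind("mas", MAS)

# Ontology: a Service class with a subclass for weather forecasting.
g.add((MAS.Service, RDF.type, OWL.Class))
g.add((MAS.WeatherForecast, RDF.type, OWL.Class))
g.add((MAS.WeatherForecast, RDFS.subClassOf, MAS.Service))

# An agent advertises that it provides a concrete weather-forecast service.
g.add((MAS.agent42, RDF.type, MAS.Agent))
g.add((MAS.agent42, MAS.provides, MAS.forecastService1))
g.add((MAS.forecastService1, RDF.type, MAS.WeatherForecast))
g.add((MAS.forecastService1, RDFS.label, Literal("3-day city forecast")))

# Another agent can now ask, via a SPARQL query over the shared ontology,
# which agents provide any kind of Service (using subclass reasoning).
q = """
SELECT ?agent ?svc WHERE {
  ?agent mas:provides ?svc .
  ?svc a ?cls .
  ?cls rdfs:subClassOf* mas:Service .
}
"""
for agent, svc in g.query(q, initNs={"mas": MAS, "rdfs": RDFS}):
    print(agent, "provides", svc)
```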

  12. Numerical approximation abilities correlate with and predict informal but not formal mathematics abilities

    PubMed Central

    Libertus, Melissa E.; Feigenson, Lisa; Halberda, Justin

    2013-01-01

    Previous research has found a relationship between individual differences in children’s precision when nonverbally approximating quantities and their school mathematics performance. School mathematics performance emerges from both informal (e.g., counting) and formal (e.g., knowledge of mathematics facts) abilities. It remains unknown whether approximation precision relates to both of these types of mathematics abilities. In the present study we assessed the precision of numerical approximation in 85 3- to 7-year-old children four times over a span of two years. Additionally, at the last time point, we tested children’s informal and formal mathematics abilities using the Test of Early Mathematics Ability (TEMA-3; Ginsburg & Baroody, 2003). We found that children’s numerical approximation precision correlated with and predicted their informal, but not formal, mathematics abilities when controlling for age and IQ. These results add to our growing understanding of the relationship between an unlearned, non-symbolic system of quantity representation and the system of mathematical reasoning that children come to master through instruction. PMID:24076381

  13. 1998 Gordon Research Conference on Gravitational Effects on Living Systems

    NASA Technical Reports Server (NTRS)

    Evans, Michael L.

    1998-01-01

    The Gordon Research Conference (GRC) on Gravitational Effects on Living Systems was held at Colby-Sawyer College from 7/12/98 through 7/17/98. The Conference was well-attended with 94 participants. The attendees represented the spectrum of endeavor in this field, coming from academia, industry, and government laboratories, both U.S. and foreign scientists, senior researchers, young investigators, and students. In designing the formal speakers program, emphasis was placed on current unpublished research and discussion of the future target areas in this field. There was a conscious effort to stimulate lively discussion about the key issues in the field today. Time for formal presentations was limited in the interest of group discussions. In order that more scientists could communicate their most recent results, poster presentation time was scheduled. A copy of the formal schedule and speaker program and the poster program is included. In addition to these formal interactions, "free time" was scheduled to allow informal discussions. Such discussions are fostering new collaborations and joint efforts in the field.

  14. Numerical approximation abilities correlate with and predict informal but not formal mathematics abilities.

    PubMed

    Libertus, Melissa E; Feigenson, Lisa; Halberda, Justin

    2013-12-01

    Previous research has found a relationship between individual differences in children's precision when nonverbally approximating quantities and their school mathematics performance. School mathematics performance emerges from both informal (e.g., counting) and formal (e.g., knowledge of mathematics facts) abilities. It remains unknown whether approximation precision relates to both of these types of mathematics abilities. In the current study, we assessed the precision of numerical approximation in 85 3- to 7-year-old children four times over a span of 2 years. In addition, at the final time point, we tested children's informal and formal mathematics abilities using the Test of Early Mathematics Ability (TEMA-3). We found that children's numerical approximation precision correlated with and predicted their informal, but not formal, mathematics abilities when controlling for age and IQ. These results add to our growing understanding of the relationship between an unlearned nonsymbolic system of quantity representation and the system of mathematical reasoning that children come to master through instruction. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. Modeling and formal analysis of urban road traffic

    NASA Astrophysics Data System (ADS)

    Avram, Camelia; Machado, José; Aştilean, Adina

    2013-10-01

    Modern life in cities leads to complex urban road traffic, and getting from one point to another in a city can be a hard and very complex task. The use of assistance systems for helping drivers reach their desired destination is becoming common, mainly systems such as GPS-based navigation or other similar systems. The main gap of those systems is that they are not able to assist drivers when unexpected changes occur, such as accidents or other unforeseen situations. In this context, it would be desirable to have a dynamic system that informs drivers, online, about everything that is happening. The work presented here is one part of a larger project whose main goal is a dynamic system for assisting drivers under hard conditions of urban road traffic. In this paper, the intersection of four street segments is modeled and formally analyzed in order to draw some conclusions about this subject. The paper presents the model of the considered system using the timed automata formalism. The validation and verification of the road traffic model are carried out using the UPPAAL model checker.

  16. Applications of formal simulation languages in the control and monitoring subsystems of Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Lacovara, R. C.

    1990-01-01

    The notions, benefits, and drawbacks of numeric simulation are introduced. Two formal simulation languages, SIMSCRIPT and MODSIM, are introduced. The capabilities of each are discussed briefly, and then the two programs are compared. The use of simulation in the process of design engineering for the Control and Monitoring System (CMS) for Space Station Freedom is discussed. The application of the formal simulation languages to the CMS design is presented, and recommendations are made as to their use.

  17. Dynamic Forms. Part 1: Functions

    NASA Technical Reports Server (NTRS)

    Meyer, George; Smith, G. Allan

    1993-01-01

    The formalism of dynamic forms is developed as a means for organizing and systematizing the design of control systems. The formalism allows the designer to easily compute derivatives to various orders of large composite functions that occur in flight-control design. Such functions involve many function-of-a-function calls that may be nested to many levels. The component functions may be multiaxis, nonlinear, and they may include rotation transformations. A dynamic form is defined as a variable together with its time derivatives up to some fixed but arbitrary order. The variable may be a scalar, a vector, a matrix, a direction cosine matrix, Euler angles, or Euler parameters. Algorithms for standard elementary functions and operations of scalar dynamic forms are developed first. Then vector and matrix operations and transformations between parameterizations of rotations are developed at the next level in the hierarchy. Commonly occurring algorithms in control-system design, including inversion of pure feedback systems, are developed at the third level. A large-angle, three-axis attitude servo and other examples are included to illustrate the effectiveness of the developed formalism. All algorithms were implemented in FORTRAN code. Practical experience shows that the proposed formalism may significantly improve the productivity of the design and coding process.
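
    As an illustration of the core idea, the following minimal Python fragment (not the paper's FORTRAN implementation) represents a scalar dynamic form as a value together with its time derivatives up to a fixed order and propagates derivatives through a product using the general Leibniz rule; all names and numbers are chosen here for illustration.

      from math import comb

      class DynamicForm:
          """A scalar variable together with its time derivatives up to a fixed order.
          Minimal sketch only; the formalism described above also covers vectors,
          matrices and rotation parameterizations."""
          def __init__(self, derivs):
              self.d = list(derivs)            # d[k] is the k-th time derivative

          def __add__(self, other):
              return DynamicForm(a + b for a, b in zip(self.d, other.d))

          def __mul__(self, other):
              # General Leibniz rule: (fg)^(k) = sum_j C(k, j) f^(j) g^(k-j)
              n = min(len(self.d), len(other.d))
              return DynamicForm(
                  sum(comb(k, j) * self.d[j] * other.d[k - j] for j in range(k + 1))
                  for k in range(n))

      x = DynamicForm([2.0, 1.0, 0.0])         # x = 2, x' = 1, x'' = 0
      y = DynamicForm([3.0, -1.0, 4.0])        # y = 3, y' = -1, y'' = 4
      print((x * y).d)                         # derivatives of x*y: [6.0, 1.0, 6.0]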

  18. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  19. Formal design and verification of a reliable computing platform for real-time control (phase 3 results)

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Divito, Ben L.; Holloway, C. Michael

    1994-01-01

    In this paper the design and formal verification of the lower levels of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications, are presented. The RCP uses NMR-style redundancy to mask faults and internal majority voting to flush the effects of transient faults. Two new layers of the RCP hierarchy are introduced: the Minimal Voting refinement (DA_minv) of the Distributed Asynchronous (DA) model and the Local Executive (LE) Model. Both the DA_minv model and the LE model are specified formally and have been verified using the Ehdm verification system. All specifications and proofs are available electronically via the Internet using anonymous FTP or World Wide Web (WWW) access.

  20. Symbolic discrete event system specification

    NASA Technical Reports Server (NTRS)

    Zeigler, Bernard P.; Chi, Sungdo

    1992-01-01

    Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.
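
    The consistency check described above reduces, in essence, to testing whether a set of linear constraints over symbolic time parameters has any solution. The following hedged Python sketch (using SciPy's linear-programming routine with a zero objective, not the authors' implementation) shows such a feasibility test; the constraint values are illustrative.

      import numpy as np
      from scipy.optimize import linprog

      def feasible(A_ub, b_ub):
          """True if {x >= 0 : A_ub x <= b_ub} is non-empty; a zero objective
          turns the LP solver into a pure feasibility check."""
          c = np.zeros(A_ub.shape[1])
          res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                        bounds=[(0, None)] * A_ub.shape[1], method="highs")
          return res.status == 0

      # Symbolic event times t1 = a and t2 = a + b with the constraints
      # a + b <= 5 and a >= 2 (rewritten as -a <= -2).
      A = np.array([[1.0, 1.0],
                    [-1.0, 0.0]])
      b = np.array([5.0, -2.0])
      print(feasible(A, b))        # True, e.g. a = 2, b = 0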

  1. A Framework for Modeling Human-Machine Interactions

    NASA Technical Reports Server (NTRS)

    Shafto, Michael G.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    Modern automated flight-control systems employ a variety of different behaviors, or modes, for managing the flight. While developments in cockpit automation have resulted in workload reduction and economical advantages, they have also given rise to an ill-defined class of human-machine problems, sometimes referred to as 'automation surprises'. Our interest in applying formal methods for describing human-computer interaction stems from our ongoing research on cockpit automation. In this area of aeronautical human factors, there is much concern about how flight crews interact with automated flight-control systems, so that the likelihood of making errors, in particular mode-errors, is minimized and the consequences of such errors are contained. The goal of the ongoing research on formal methods in this context is: (1) to develop a framework for describing human interaction with control systems; (2) to formally categorize such automation surprises; and (3) to develop tests for identification of these categories early in the specification phase of a new human-machine system.

  2. The Formalism of Generalized Contexts and Decay Processes

    NASA Astrophysics Data System (ADS)

    Losada, Marcelo; Laura, Roberto

    2013-04-01

    The formalism of generalized contexts for quantum histories is used to investigate the possibility of considering the survival probability as the probability of a no-decay property at a given time conditional on a no-decay property at an earlier time. A negative result is found for an isolated system. The inclusion of two quantum measurement instruments at two different times makes it possible to interpret the survival probability as a conditional probability of the whole system.

  3. Formal modeling of a system of chemical reactions under uncertainty.

    PubMed

    Ghosh, Krishnendu; Schlipf, John

    2014-10-01

    We describe a novel formalism representing a system of chemical reactions, with imprecise rates of reactions and concentrations of chemicals, and describe a model reduction method, pruning, based on the chemical properties. We present two algorithms, midpoint approximation and interval approximation, for construction of efficient model abstractions with uncertainty in data. We evaluate computational feasibility by posing queries in computation tree logic (CTL) on a prototype of extracellular-signal-regulated kinase (ERK) pathway.
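
    To make the midpoint versus interval distinction concrete, here is a small hedged Python sketch propagating an uncertain rate and concentration through one explicit Euler step of a first-order decay reaction; it illustrates the general interval idea only, not the authors' pruning or abstraction algorithms, and all numbers are invented.

      def interval_step(conc, rate, dt):
          """One explicit Euler step of dA/dt = -k*A with k and A given as
          (low, high) intervals -- a crude interval-arithmetic enclosure."""
          a_lo, a_hi = conc
          k_lo, k_hi = rate
          # A decreases fastest when the largest k acts on the largest A
          return (a_lo - k_hi * a_hi * dt, a_hi - k_lo * a_lo * dt)

      def midpoint_step(conc, rate, dt):
          """Same step using only the interval midpoints."""
          a = sum(conc) / 2
          k = sum(rate) / 2
          return a - k * a * dt

      conc0 = (0.9, 1.1)      # uncertain initial concentration
      k = (0.4, 0.6)          # uncertain rate constant
      print(interval_step(conc0, k, dt=0.1))   # enclosing interval after one step
      print(midpoint_step(conc0, k, dt=0.1))   # single representative value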

  4. The quality management journey: the progress of health facilities in Australia.

    PubMed

    Carr, B J

    1994-12-01

    Many facilities in Australia have taken the Total Quality Management (TQM) step. The objective of this study was to examine the progress of adopted formal quality systems in health. Sixty per cent of organizations surveyed have adopted formal systems. Of these, Deming adherents are the most common, followed by eclectic choices. Only 35% considered the quality transition reasonably easy. No relationship between accreditation and formal quality systems was identified. The most common improvement techniques were: flow charts, histograms, and cause and effect diagrams. Quality practitioners are happy to use several tools exceptionally well rather than have many tools at their disposal. The greatest impediment to the adoption of quality was the lack of top management support. This study did not support the view that clinicians are reluctant to actively support quality initiatives. Total Quality Management is not a mature concept; however, Chief Executive Officers are assured that rewards will be realized over time.

  5. Cross-Disciplinary Network Comparison: Matchmaking Between Hairballs

    PubMed Central

    Yan, Koon-Kiu; Wang, Daifeng; Sethi, Anurag; Muir, Paul; Kitchen, Robert; Cheng, Chao; Gerstein, Mark

    2016-01-01

    Biological systems are complex. In particular, the interactions between molecular components often form dense networks that, more often than not, are criticized for being inscrutable ‘hairballs’. We argue that one way of untangling these hairballs is through cross-disciplinary network comparison—leveraging advances in other disciplines to obtain new biological insights. In some cases, such comparisons enable the direct transfer of mathematical formalism between disciplines, precisely describing the abstract associations between entities and allowing us to apply a variety of sophisticated formalisms to biology. In cases where the detailed structure of the network does not permit the transfer of complete formalisms between disciplines, comparison of mechanistic interactions in systems for which we have significant day-to-day experience can provide analogies for interpreting relatively more abstruse biological networks. Here, we illustrate how these comparisons benefit the field with a few specific examples related to network growth, organizational hierarchies, and the evolution of adaptive systems. PMID:27047991

  6. Aspects of spatial dispersion in the optical properties of a vacuum-dielectric interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, D.L.; Rimbey, P.R.

    1976-09-15

    We have examined the relationship between the polarizability for a two-phase (vacuum-dielectric) system and the use of additional boundary conditions and the like, as regards the response of systems exhibiting spatial dispersion. As a consequence we are able to derive information about induced-charge and current densities and the continuity of the field quantities across the interface. It is shown that it is not possible to resonantly excite longitudinal bulk modes with incident light in the formalism of Rimbey-Mahan. We have derived sum rules in wave-vector space on bulk polaritons in homogeneous isotropic systems. In the case of nonhomogeneous perfect crystals in which the bulk response is described by the matrix epsilon-bar (Q, Q'), we have solved formally for the surface impedance in terms of an assumed arbitrary epsilon-bar (Q, Q'), by means of an extension of the Fuchs-Kliewer formalism. (AIP)

  7. Formal Solutions for Polarized Radiative Transfer. III. Stiffness and Instability

    NASA Astrophysics Data System (ADS)

    Janett, Gioele; Paganini, Alberto

    2018-04-01

    Efficient numerical approximation of the polarized radiative transfer equation is challenging because this system of ordinary differential equations exhibits stiff behavior, which potentially results in numerical instability. This negatively impacts the accuracy of formal solvers, and small step-sizes are often necessary to retrieve physical solutions. This work presents stability analyses of formal solvers for the radiative transfer equation of polarized light, identifies instability issues, and suggests practical remedies. In particular, the assumptions and the limitations of the stability analysis of Runge–Kutta methods play a crucial role. On this basis, a suitable and pragmatic formal solver is outlined and tested. An insightful comparison to the scalar radiative transfer equation is also presented.
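
    The stiffness issue can be illustrated with the classical explicit fourth-order Runge-Kutta method applied to the Dahlquist test equation y' = λy: a step is stable only while the stability function R(z), z = λh, satisfies |R(z)| ≤ 1. The short Python check below is an illustrative aside, not the solver analyzed in the paper.

      def rk4_stability(z):
          """Stability function of the classical explicit RK4 method, obtained by
          applying it to the test equation y' = lambda*y with z = lambda*h."""
          return 1 + z + z**2 / 2 + z**3 / 6 + z**4 / 24

      for lam_h in (-0.5, -2.0, -3.0):
          r = abs(rk4_stability(lam_h))
          verdict = "stable" if r <= 1 else "unstable: step size too large"
          print(f"z = {lam_h:5.1f}  |R(z)| = {r:6.3f}  {verdict}")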

  8. The development of a model of creative space and its potential for transfer from non-formal to formal education

    NASA Astrophysics Data System (ADS)

    White, Irene; Lorenzi, Francesca

    2016-12-01

    Creativity has been emerging as a key concept in educational policies since the mid-1990s, with many Western countries restructuring their education systems to embrace innovative approaches likely to stimulate creative and critical thinking. But despite current intentions of putting more emphasis on creativity in education policies worldwide, there is still a relative dearth of viable models which capture the complexity of creativity and the conditions for its successful infusion into formal school environments. The push for creativity is in direct conflict with the results-driven/competitive performance-oriented culture which continues to dominate formal education systems. The authors of this article argue that incorporating creativity into mainstream education is a complex task and is best tackled by taking a systematic and multifaceted approach. They present a multidimensional model designed to help educators in tackling the challenges of the promotion of creativity. Their model encompasses three distinct yet interrelated dimensions of a creative space - physical, social-emotional and critical. The authors use the metaphor of space to refer to the interplay of the three identified dimensions. Drawing on confluence approaches to the theorisation of creativity, this paper exemplifies the development of a model before the background of a growing trend of systems theories. The aim of the model is to be helpful in systematising creativity by offering parameters - derived from the evaluation of an example offered by a non-formal educational environment - for the development of creative environments within mainstream secondary schools.

  9. Facility Monitoring: A Qualitative Theory for Sensor Fusion

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando

    2001-01-01

    Data fusion and sensor management approaches have largely been implemented with centralized and hierarchical architectures. Numerical and statistical methods are the most common data fusion methods found in these systems. Given the proliferation and low cost of processing power, there is now an emphasis on designing distributed and decentralized systems. These systems use analytical/quantitative techniques or qualitative reasoning methods for data fusion. Based on other work by the author, a sensor may be treated as a highly autonomous (decentralized) unit. Each highly autonomous sensor (HAS) is capable of extracting qualitative behaviours from its data. For example, it detects spikes, disturbances, noise levels, off-limit excursions, step changes, drift, and other typical measured trends. In this context, this paper describes a distributed sensor fusion paradigm and theory where each sensor in the system is a HAS. Hence, given the rich qualitative information from each HAS, a paradigm and formal definitions are given so that sensors and processes can reason and make decisions at the qualitative level. This approach to sensor fusion makes possible the implementation of intuitive (effective) methods to monitor, diagnose, and compensate processes/systems and their sensors. This paradigm facilitates a balanced distribution of intelligence (code and/or hardware) to the sensor level, the process/system level, and a higher controller level. The primary application of interest is in intelligent health management of rocket engine test stands.
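
    A minimal Python sketch of the kind of qualitative extraction such a highly autonomous sensor might perform is given below; the detection rules and thresholds are illustrative assumptions, not the author's implementation.

      import numpy as np

      def qualitative_features(signal, spike_sigma=4.0, step_jump=0.5, drift_slope=0.01):
          """Tiny sketch of qualitative behaviour extraction for a highly autonomous
          sensor (HAS); thresholds are illustrative only."""
          x = np.asarray(signal, dtype=float)
          diffs = np.diff(x)
          feats = []
          if np.any(np.abs(x - np.median(x)) > spike_sigma * np.std(x)):
              feats.append("spike")
          if np.any(np.abs(diffs) > step_jump):
              feats.append("step change")
          slope = np.polyfit(np.arange(len(x)), x, 1)[0]
          if abs(slope) > drift_slope:
              feats.append("drift")
          return feats or ["nominal"]

      t = np.arange(200)
      data = 1.0 + 0.02 * t + np.random.normal(0, 0.05, 200)   # slow drift plus noise
      print(qualitative_features(data))    # expected to report "drift"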

  10. Effective action for stochastic partial differential equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hochberg, David; Centro de Astrobiologia, INTA, Carratera Ajalvir, Km. 4, 28850 Torrejon, Madrid,; Molina-Paris, Carmen

    Stochastic partial differential equations (SPDEs) are the basic tool for modeling systems where noise is important. SPDEs are used for models of turbulence, pattern formation, and the structural development of the universe itself. It is reasonably well known that certain SPDEs can be manipulated to be equivalent to (nonquantum) field theories that nevertheless exhibit deep and important relationships with quantum field theory. In this paper we systematically extend these ideas: We set up a functional integral formalism and demonstrate how to extract all the one-loop physics for an arbitrary SPDE subject to arbitrary Gaussian noise. It is extremely important to realize that Gaussian noise does not imply that the field variables undergo Gaussian fluctuations, and that these nonquantum field theories are fully interacting. The limitation to one loop is not as serious as might be supposed: Experience with quantum field theories (QFTs) has taught us that one-loop physics is often quite adequate to give a good description of the salient issues. The limitation to one loop does, however, offer marked technical advantages: Because at one loop almost any field theory can be rendered finite using zeta function technology, we can sidestep the complications inherent in the Martin-Siggia-Rose formalism (the SPDE analog of the Becchi-Rouet-Stora-Tyutin formalism used in QFT) and instead focus attention on a minimalist approach that uses only the physical fields (this "direct approach" is the SPDE analog of canonical quantization using physical fields). After setting up the general formalism for the characteristic functional (partition function), we show how to define the effective action to all loops, and then focus on the one-loop effective action and its specialization to constant fields: the effective potential. The physical interpretation of the effective action and effective potential for SPDEs is addressed and we show that key features carry over from QFT to the case of SPDEs. An important result is that the amplitude of the two-point function governing the noise acts as the loop-counting parameter and is the analog of Planck's constant (ħ) in this SPDE context. We derive a general expression for the one-loop effective potential of an arbitrary SPDE subject to translation-invariant Gaussian noise, and compare this with the one-loop potential for QFT. (c) 1999 The American Physical Society.

  11. Organization and Finance of Non-Formal Education.

    ERIC Educational Resources Information Center

    Green, Reginald Herbold

    1979-01-01

    Discusses the importance of organization and finance in developing nonformal education programs (those outside the formal primary-secondary-tertiary system and its variants). Notes goals, six aspects of organization, and discusses the problems of financing programs: the lack of money; coordination between money and programs; implementation. (JOW)

  12. Coarse-grained hydrodynamics from correlation functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmer, Bruce

    This paper will describe a formalism for using correlation functions between different grid cells as the basis for determining coarse-grained hydrodynamic equations for modeling the behavior of mesoscopic fluid systems. Configurations from a molecular dynamics simulation are projected onto basis functions representing grid cells in a continuum hydrodynamic simulation. Equilibrium correlation functions between different grid cells are evaluated from the molecular simulation and used to determine the evolution operator for the coarse-grained hydrodynamic system. The formalism is applied to some simple hydrodynamic cases to determine the feasibility of applying this to realistic nanoscale systems.

  13. Development of a Software Safety Process and a Case Study of Its Use

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1996-01-01

    Research in the year covered by this reporting period has been primarily directed toward: continued development of mock-ups of computer screens for operator of a digital reactor control system; development of a reactor simulation to permit testing of various elements of the control system; formal specification of user interfaces; fault-tree analysis including software; evaluation of formal verification techniques; and continued development of a software documentation system. Technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers.

  14. Formal verification of automated teller machine systems using SPIN

    NASA Astrophysics Data System (ADS)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of the Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use the Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formulas. This model checker accepts models written in Process Meta Language (PROMELA), with specifications expressed as LTL formulas.
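
    As a language-neutral illustration of what a model checker does, the toy Python sketch below exhaustively explores a small ATM-like transition system and checks one safety property; the states, transitions and property are invented for illustration, and the actual verification in the paper is done on a PROMELA model with SPIN and LTL formulas.

      from collections import deque

      # Toy ATM transition system: states are (mode, pin_ok); transitions and the
      # safety property below are illustrative, not the model from the paper.
      def successors(state):
          mode, pin_ok = state
          if mode == "idle":
              yield ("card_inserted", False)
          elif mode == "card_inserted":
              yield ("pin_checked", True)     # correct PIN entered
              yield ("idle", False)           # wrong PIN or abort
          elif mode == "pin_checked":
              yield ("dispensing", pin_ok)
              yield ("idle", False)
          elif mode == "dispensing":
              yield ("idle", False)

      def safety_violated(state):
          mode, pin_ok = state
          return mode == "dispensing" and not pin_ok   # cash without verified PIN

      # Exhaustive breadth-first reachability check (what SPIN does far more
      # efficiently on PROMELA models).
      init = ("idle", False)
      seen, queue = {init}, deque([init])
      while queue:
          s = queue.popleft()
          if safety_violated(s):
              raise AssertionError(f"safety violated in state {s}")
          for nxt in successors(s):
              if nxt not in seen:
                  seen.add(nxt)
                  queue.append(nxt)
      print(f"explored {len(seen)} states, safety property holds")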

  15. Applications of a formal approach to decipher discrete genetic networks.

    PubMed

    Corblin, Fabien; Fanchon, Eric; Trilling, Laurent

    2010-07-20

    A growing demand for tools to assist the building and analysis of biological networks exists in systems biology. We argue that the use of a formal approach is relevant and applicable to address questions raised by biologists about such networks. Because the behaviour of these systems is complex, it is essential to exploit every bit of experimental information efficiently. In our approach, both the evolution rules and the partial knowledge about the structure and the behaviour of the network are formalized using a common constraint-based language. In this article our formal and declarative approach is applied to three biological applications. The software environment that we developed makes it possible to specifically address each application through a new class of biologically relevant queries. We show that we can describe easily and in a formal manner the partial knowledge about a genetic network. Moreover we show that this environment, based on a constraint algorithmic approach, offers a wide variety of functionalities, going beyond simple simulations, such as proof of consistency, model revision, prediction of properties, and search for minimal models relative to specified criteria. The formal approach proposed here deeply changes the way to proceed in the exploration of genetic and biochemical networks, first by avoiding the usual trial-and-error procedure, and second by placing the emphasis on sets of solutions, rather than a single solution arbitrarily chosen among many others. Lastly, the constraint approach promotes an integration of model and experimental data in a single framework.

  16. Integrating reasoning and clinical archetypes using OWL ontologies and SWRL rules.

    PubMed

    Lezcano, Leonardo; Sicilia, Miguel-Angel; Rodríguez-Solano, Carlos

    2011-04-01

    Semantic interoperability is essential to facilitate the computerized support for alerts, workflow management and evidence-based healthcare across heterogeneous electronic health record (EHR) systems. Clinical archetypes, which are formal definitions of specific clinical concepts defined as specializations of a generic reference (information) model, provide a mechanism to express data structures in a shared and interoperable way. However, currently available archetype languages do not provide direct support for mapping to formal ontologies and then exploiting reasoning on clinical knowledge, which are key ingredients of full semantic interoperability, as stated in the SemanticHEALTH report [1]. This paper reports on an approach to translate definitions expressed in the openEHR Archetype Definition Language (ADL) to a formal representation expressed using the Ontology Web Language (OWL). The formal representations are then integrated with rules expressed with Semantic Web Rule Language (SWRL) expressions, providing an approach to apply the SWRL rules to concrete instances of clinical data. Sharing the knowledge expressed in the form of rules is consistent with the philosophy of open sharing, encouraged by archetypes. Our approach also allows the reuse of formal knowledge, expressed through ontologies, and extends reuse to propositions of declarative knowledge, such as those encoded in clinical guidelines. This paper describes the ADL-to-OWL translation approach, describes the techniques to map archetypes to formal ontologies, and demonstrates how rules can be applied to the resulting representation. We provide examples taken from a patient safety alerting system to illustrate our approach. Copyright © 2010 Elsevier Inc. All rights reserved.
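
    A very small hedged sketch of the target side of such a translation is shown below using the Python rdflib library: an archetype-like concept is emitted as an OWL class specializing a generic reference-model class. The namespace and class names are invented for illustration, and the SWRL rule layer and the reasoning step are not reproduced here.

      from rdflib import Graph, Namespace, Literal
      from rdflib.namespace import OWL, RDF, RDFS

      # Illustrative namespace and class names (not the actual openEHR archetypes).
      EX = Namespace("http://example.org/ehr#")

      g = Graph()
      g.bind("ex", EX)

      # A tiny OWL fragment that an ADL-to-OWL translation might produce:
      # "BloodPressureObservation" as a specialization of a generic "Observation".
      g.add((EX.Observation, RDF.type, OWL.Class))
      g.add((EX.BloodPressureObservation, RDF.type, OWL.Class))
      g.add((EX.BloodPressureObservation, RDFS.subClassOf, EX.Observation))
      g.add((EX.BloodPressureObservation, RDFS.label,
             Literal("Blood pressure observation archetype")))

      print(g.serialize(format="turtle"))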

  17. Systematic study of α decay of nuclei around the Z = 82, N = 126 shell closures within the cluster-formation model and proximity potential 1977 formalism

    NASA Astrophysics Data System (ADS)

    Deng, Jun-Gang; Zhao, Jie-Cheng; Chu, Peng-Cheng; Li, Xiao-Hua

    2018-04-01

    In the present work, we systematically study the α decay preformation factors Pα within the cluster-formation model and α decay half-lives by the proximity potential 1977 formalism for nuclei around the Z = 82, N = 126 closed shells. The calculations show that the realistic Pα is linearly dependent on the product of valence protons (holes) and valence neutrons (holes) NpNn. This is consistent with our previous works [Sun et al., Phys. Rev. C 94, 024338 (2016), 10.1103/PhysRevC.94.024338; Deng et al., Phys. Rev. C 96, 024318 (2017), 10.1103/PhysRevC.96.024318], in which Pα are model dependent and extracted from the ratios of calculated α half-lives to experimental data. Combining with our previous works, we confirm that the valence proton-neutron interaction plays a key role in the α preformation for nuclei around the Z = 82, N = 126 shell closures, whether the Pα is model dependent or microscopic. In addition, our α decay half-lives calculated using the proximity potential 1977 formalism, taking Pα evaluated by the cluster-formation model, can well reproduce the experimental data and significantly reduce the errors.

  18. Dose heterogeneity correction for low-energy brachytherapy sources using dual-energy CT images

    NASA Astrophysics Data System (ADS)

    Mashouf, S.; Lechtman, E.; Lai, P.; Keller, B. M.; Karotki, A.; Beachey, D. J.; Pignol, J. P.

    2014-09-01

    Permanent seed implant brachytherapy is currently used for adjuvant radiotherapy of early stage prostate and breast cancer patients. The current standard for calculation of dose around brachytherapy sources is based on the AAPM TG-43 formalism, which generates the dose in a homogeneous water medium. Recently, AAPM TG-186 emphasized the importance of accounting for tissue heterogeneities. We have previously reported on a methodology where the absorbed dose in tissue can be obtained by multiplying the dose, calculated by the TG-43 formalism, by an inhomogeneity correction factor (ICF). In this work we make use of dual energy CT (DECT) images to extract ICF parameters. The advantage of DECT over conventional CT is that it eliminates the need for tissue segmentation as well as assignment of population based atomic compositions. DECT images of a heterogeneous phantom were acquired and the dose was calculated using both TG-43 and TG-43 × ICF formalisms. The results were compared to experimental measurements using Gafchromic films in the mid-plane of the phantom. For a seed implant configuration of 8 seeds spaced 1.5 cm apart in a cubic structure, the gamma passing score for 2%/2 mm criteria improved from 40.8% to 90.5% when ICF was applied to TG-43 dose distributions.
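
    Written out, the relation described above is simply a multiplicative correction of the homogeneous-water dose (notation chosen here, not taken from the paper):

      D_{\mathrm{tissue}}(\vec{r}) \;=\; D_{\mathrm{TG43}}(\vec{r}) \,\times\, \mathrm{ICF}(\vec{r})

    where D_TG43 is the dose computed in a homogeneous water medium and the inhomogeneity correction factor ICF is built from tissue parameters extracted from the dual-energy CT images.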

  19. Developing a Treatment Planning Software Based on TG-43U1 Formalism for Cs-137 LDR Brachytherapy.

    PubMed

    Sina, Sedigheh; Faghihi, Reza; Soleimani Meigooni, Ali; Siavashpour, Zahra; Mosleh-Shirazi, Mohammad Amin

    2013-08-01

    The old treatment planning systems (TPSs) used for intracavitary brachytherapy with the Cs-137 Selectron source utilize traditional dose calculation methods, considering each source as a point source. Using such methods introduces significant errors in dose estimation. As of 1995, TG-43 is used as the main dose calculation formalism in TPSs. The purpose of this study is to design and establish a treatment planning software for the Cs-137 Selectron brachytherapy source, based on the TG-43U1 formalism and applying the effects of the applicator and dummy spacers. The two software packages used for treatment planning of Cs-137 sources in Iran (STPS and PLATO) are based on old formalisms; the purpose of this work is therefore to establish and develop a TPS for the Selectron source based on the TG-43 formalism. In this planning system, the dosimetry parameters of each pellet at different positions inside the applicators were obtained with the MCNP4c code. The dose distribution around every combination of active and inactive pellets was then obtained by summing the doses. The accuracy of this algorithm was checked by comparing its results for particular combinations of active and inactive pellets with MC simulations. Finally, the uncertainty of the old dose calculation formalisms was investigated by comparing the results of the STPS and PLATO software with those obtained by the new algorithm. For a typical arrangement of 10 active pellets in the applicator, the percentage difference between doses obtained by the new algorithm at 1 cm distance from the tip of the applicator and those obtained by the old formalisms is about 30%, while the difference between the results of MCNP and the new algorithm is less than 5%. According to the results, the old dosimetry formalisms overestimate the dose, especially towards the applicator's tip, while the TG-43U1-based software performs the calculations more accurately.
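
    The superposition step can be sketched as follows in Python, using the one-dimensional (point-source) TG-43 dose-rate expression and summing over the active pellets; the dosimetry parameters, radial dose function and geometry below are placeholders for illustration, not Cs-137 consensus data.

      import numpy as np

      def tg43_point_dose_rate(r_cm, S_k, Lam, g_of_r, phi_an_of_r):
          """1D TG-43 point-source approximation:
             Ddot(r) = S_k * Lambda * (r0/r)^2 * g(r) * phi_an(r), with r0 = 1 cm.
          The radial dose function g and anisotropy factor phi_an are callables;
          all numbers used below are placeholders."""
          r0 = 1.0
          return S_k * Lam * (r0 / r_cm) ** 2 * g_of_r(r_cm) * phi_an_of_r(r_cm)

      # Illustrative parameters and a toy linear radial dose function.
      S_k, Lam = 10.0, 1.0
      g = lambda r: max(0.0, 1.0 - 0.005 * (r - 1.0))
      phi_an = lambda r: 1.0

      # Superpose contributions from active pellets placed along the applicator axis.
      pellet_z = np.array([0.0, 0.25, 0.5, 0.75, 1.0])   # cm, active pellets only
      point = np.array([0.0, 1.0, 2.0])                  # calculation point (cm)
      total = sum(
          tg43_point_dose_rate(np.linalg.norm(point - np.array([0.0, 0.0, z])),
                               S_k, Lam, g, phi_an)
          for z in pellet_z)
      print(f"superposed dose rate at {point}: {total:.3f} (arbitrary units)")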

  20. HIV Education in the Formal Curriculum

    ERIC Educational Resources Information Center

    Nsubuga, Yusuf K.; Bonnet, Sandrine

    2009-01-01

    The AIDS epidemic presents a complex of issues that require global answers, involving entire societies. The only sustainable solution is to include all sectors of society in a multidisciplinary collaboration, within which the formal education system plays a key role in delivering a comprehensive response to the disease at the national level.…

  1. Formal Mentoring Relationships and Attachment Theory: Implications for Human Resource Development

    ERIC Educational Resources Information Center

    Germain, Marie-Line

    2011-01-01

    An attachment theory perspective of mentoring is presented to explain the degree of functionality of a mentor-protege formal match in an organizational setting. By focusing on Bowlby's (1969/1982) behavioral system of attachment and its triarchic taxonomy of secure, avoidant, and anxious-ambivalent attachment, previous conceptualizations are…

  2. Open Learning and Formal Credentialing in Higher Education: Curriculum Models and Institutional Policies

    ERIC Educational Resources Information Center

    Reushle, Shirley, Ed.; Antonio, Amy, Ed.; Keppell, Mike, Ed.

    2016-01-01

    The discipline of education is a multi-faceted system that must constantly integrate new strategies and procedures to ensure successful learning experiences. Enhancements in education provide learners with greater opportunities for growth and advancement. "Open Learning and Formal Credentialing in Higher Education: Curriculum Models and…

  3. Education, Training and Work. Some Commonwealth Responses to Youth Unemployment.

    ERIC Educational Resources Information Center

    Commonwealth Secretariat, London (England).

    This report documents programs linking education and work in the Commonwealth of Nations. It contains four parts: "Learning about Science and Technology Outside School: Project Review" (Keith Lewin, Roger Jones); "Education and Productive Work Linkages in the Formal and Non-Formal Educational Systems of the Commonwealth…

  4. Educational Administration and the Professional Development of Teachers.

    ERIC Educational Resources Information Center

    Bassett, G. W.

    Beeby's description of the four modes of operation of school systems is couched mainly in terms of teachers' conduct in the classroom. In translating this into administrative terms, only two levels are of prime importance, formalism and meaning. Formalism implies preoccupation with maintenance of existing procedures and policies, while meaning…

  5. Experiences applying Formal Approaches in the Development of Swarm-Based Space Exploration Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher A.; Hinchey, Michael G.; Truszkowski, Walter F.; Rash, James L.

    2006-01-01

    NASA is researching advanced technologies for future exploration missions using intelligent swarms of robotic vehicles. One of these missions is the Autonomous Nano Technology Swarm (ANTS) mission that will explore the asteroid belt using 1,000 cooperative autonomous spacecraft. The emergent properties of intelligent swarms make them a potentially powerful concept, but at the same time make it more difficult to design the swarm and to ensure that the proper behaviors will emerge. NASA is investigating formal methods and techniques for verification of such missions. The advantage of using formal methods is the ability to mathematically verify the behavior of a swarm, emergent or otherwise. Using the ANTS mission as a case study, we have evaluated multiple formal methods to determine their effectiveness in modeling and ensuring desired swarm behavior. This paper discusses the results of this evaluation and proposes an integrated formal method for ensuring correct behavior of future NASA intelligent swarms.

  6. Access to timely formal dementia care in Europe: protocol of the Actifcare (ACcess to Timely Formal Care) study.

    PubMed

    Kerpershoek, Liselot; de Vugt, Marjolein; Wolfs, Claire; Jelley, Hannah; Orrell, Martin; Woods, Bob; Stephan, Astrid; Bieber, Anja; Meyer, Gabriele; Engedal, Knut; Selbaek, Geir; Handels, Ron; Wimo, Anders; Hopper, Louise; Irving, Kate; Marques, Maria; Gonçalves-Pereira, Manuel; Portolani, Elisa; Zanetti, Orazio; Verhey, Frans

    2016-08-23

    Previous findings indicate that people with dementia and their informal carers experience difficulties accessing and using formal care services due to a mismatch between needs and service use. This mismatch causes overall dissatisfaction and is a waste of the scarce financial care resources. This article presents the background and methods of the Actifcare (ACcess to Timely Formal Care) project. This is a European study aiming at best-practice development in finding timely access to formal care for community-dwelling people with dementia and their informal carers. There are five main objectives: 1) explore predisposing and enabling factors associated with the use of formal care, 2) explore the association between the use of formal care, needs and quality of life, 3) compare these across European countries, 4) understand the costs and consequences of formal care services utilization in people with unmet needs, and 5) determine the major costs and quality-of-life drivers and their relationship with formal care services across European countries. In a longitudinal cohort study conducted in eight European countries, approximately 450 people with dementia and informal carers will be assessed three times in 1 year (baseline, 6 and 12 months). In this year we will closely monitor the process of finding access to formal care. Data on service use, quality of life and needs will be collected. The results of Actifcare are expected to reveal best practices in organizing formal care. Knowledge about enabling and predisposing factors regarding access to care services, as well as its costs and consequences, can advance the state of the art in health systems research into pathways to dementia care, in order to benefit people with dementia and their informal carers.

  7. Bright and singular soliton solutions of the conformable time-fractional Klein-Gordon equations with different nonlinearities

    NASA Astrophysics Data System (ADS)

    Hosseini, Kamyar; Mayeli, Peyman; Ansari, Reza

    2018-07-01

    Finding the exact solutions of nonlinear fractional differential equations has gained considerable attention, during the past two decades. In this paper, the conformable time-fractional Klein-Gordon equations with quadratic and cubic nonlinearities are studied. Several exact soliton solutions, including the bright (non-topological) and singular soliton solutions are formally extracted by making use of the ansatz method. Results demonstrate that the method can efficiently handle the time-fractional Klein-Gordon equations with different nonlinearities.
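
    For reference, the conformable fractional derivative commonly used in such studies (introduced by Khalil et al., and assumed here to be the definition the paper follows or a close variant) is, for 0 < α ≤ 1,

      T_\alpha f(t) \;=\; \lim_{\varepsilon \to 0}
          \frac{f\bigl(t + \varepsilon\, t^{\,1-\alpha}\bigr) - f(t)}{\varepsilon},
      \qquad 0 < \alpha \le 1,

    which reduces to the ordinary derivative when α = 1 and, for differentiable f, satisfies T_α f(t) = t^(1-α) f'(t).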

  8. Spherical Harmonic Decomposition of Gravitational Waves Across Mesh Refinement Boundaries

    NASA Technical Reports Server (NTRS)

    Fiske, David R.; Baker, John; vanMeter, James R.; Centrella, Joan M.

    2005-01-01

    We evolve a linearized (Teukolsky) solution of the Einstein equations with a non-linear Einstein solver. Using this testbed, we are able to show that such gravitational waves, defined by the Weyl scalars in the Newman-Penrose formalism, propagate faithfully across mesh refinement boundaries, and use, for the first time to our knowledge, a novel algorithm due to Misner to compute spherical harmonic components of our waveforms. We show that the algorithm performs extremely well, even when the extraction sphere intersects refinement boundaries.
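
    The basic projection that any such decomposition ultimately computes is the surface integral of the waveform against a spherical harmonic. The hedged Python sketch below recovers the Y_20 coefficient of a test field by simple quadrature; it illustrates only the textbook projection integral, not Misner's mesh-based extraction algorithm.

      import numpy as np

      def Y20(theta):
          # Real spherical harmonic Y_20 = sqrt(5/(16*pi)) * (3*cos^2(theta) - 1)
          return np.sqrt(5.0 / (16.0 * np.pi)) * (3.0 * np.cos(theta) ** 2 - 1.0)

      ntheta, nphi = 200, 400
      dtheta, dphi = np.pi / ntheta, 2.0 * np.pi / nphi
      theta = (np.arange(ntheta) + 0.5) * dtheta        # midpoint samples in [0, pi]
      phi = (np.arange(nphi) + 0.5) * dphi              # midpoint samples in [0, 2*pi]
      TH, _ = np.meshgrid(theta, phi, indexing="ij")

      psi = 2.0 * Y20(TH) + 0.3          # test field: known Y_20 content plus a constant
      # c_20 = integral of psi * Y_20 * sin(theta) dtheta dphi over the sphere
      c20 = np.sum(psi * Y20(TH) * np.sin(TH)) * dtheta * dphi
      print(f"recovered Y_20 coefficient: {c20:.4f} (exact value 2.0)")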

  9. Knowledge representation and management: transforming textual information into useful knowledge.

    PubMed

    Rassinoux, A-M

    2010-01-01

    To summarize current outstanding research in the field of knowledge representation and management. Synopsis of the articles selected for the IMIA Yearbook 2010. Four interesting papers, dealing with structured knowledge, have been selected for the section knowledge representation and management. Combining the newest techniques in computational linguistics and natural language processing with the latest methods in statistical data analysis, machine learning and text mining has proved to be efficient for turning unstructured textual information into meaningful knowledge. Three of the four selected papers for the section knowledge representation and management corroborate this approach and depict various experiments conducted to extract meaningful knowledge from unstructured free texts, such as extracting cancer disease characteristics from pathology reports, or extracting protein-protein interactions from biomedical papers, as well as extracting knowledge for the support of hypothesis generation in molecular biology from the Medline literature. Finally, the last paper addresses the level of formally representing and structuring information within clinical terminologies in order to render such information easily available and shareable among the health informatics community. Delivering common powerful tools able to automatically extract meaningful information from the huge amount of electronically unstructured free texts is an essential step towards promoting sharing and reusability across applications, domains, and institutions, thus contributing to building capacities worldwide.

  10. Synthesis, extraction and electronic structure of Ce@C2n

    NASA Astrophysics Data System (ADS)

    Liu, Bing-Bing; Zou, Guang-Tian; Yang, Hai-Bin; Yu, San; Lu, Jin-Shan; Liu, Zi-Yang; Liu, Shu-Ying; Xu, Wen-Guo

    1997-11-01

    In view of the growing interest in endohedral lanthanide fullerenes, Ce, as a typical +4 oxidation state lanthanide element, has been systematically studied. The synthesis, extraction and electronic structure of Ce@C2n are investigated. Soot containing Ce@C2n was synthesized in high yield by carbonizing CeO2-containing graphite rods and back-burning the CeC2-enriched cathode deposit in a DC arc plasma apparatus. Ce@C2n, dominated by Ce@C82, can be efficiently extracted from the insoluble part of the soot after toluene Soxhlet extraction by pyridine at high temperature and high pressure in a closed vessel. About 60% Ce@C2n (2n = 82, 80, 78, 76) and 35% Ce@C82 can be enriched in the pyridine extract. This is identified by desorption electron impact mass spectrometry (DEI MS). The electronic structure of Ce@C2n is analyzed by using X-ray photoemission spectroscopy (XPS) of pyridine-free films. It is suggested that the encapsulated Ce atom is in a charge state close to +3 and is effectively protected from reaction with water and oxygen by the enclosing fullerene cage. Unlike theoretical expectation, the electronic state of Ce@C82 is formally described as Ce(3+)@C82(3-).

  11. In vitro anti-mycobacterial activity of nine medicinal plants used by ethnic groups in Sonora, Mexico.

    PubMed

    Robles-Zepeda, Ramón Enrique; Coronado-Aceves, Enrique Wenceslao; Velázquez-Contreras, Carlos Arturo; Ruiz-Bustos, Eduardo; Navarro-Navarro, Moisés; Garibay-Escobar, Adriana

    2013-11-25

    Sonoran ethnic groups (Yaquis, Mayos, Seris, Guarijíos, Pimas, Kikapúes and Pápagos) use mainly herbal-based preparations as their first line of medicinal treatment. Among the plants used are those with anti-tuberculosis properties; however, no formal research is available. Organic extracts were obtained from nine medicinal plants traditionally used by Sonoran ethnic groups to treat different kinds of diseases; three of them are mainly used to treat tuberculosis. All of the extracts were tested against Mycobacterium tuberculosis H37Rv using the Alamar Blue redox bioassay. Methanolic extracts from Ambrosia confertiflora, Ambrosia ambrosioides and Guaiacum coulteri showed minimal inhibitory concentration (MIC) values of 200, 790 and 1000 μg/mL, respectively, whereas no effect was observed with the rest of the methanolic extracts at the concentrations tested. Chloroform, dichloromethane, and ethyl acetate extracts from Ambrosia confertiflora showed a MIC of 90, 120 and 160 μg/mL, respectively. A. confertiflora and A. ambrosioides showed the best anti-mycobacterial activity in vitro. The activity of Guaiacum coulteri is consistent with its traditional use by Sonoran ethnic groups as an anti-tuberculosis agent. For these reasons, it is important to investigate a broader spectrum of medicinal plants in order to find compounds active against Mycobacterium tuberculosis.

  12. Formal specification and verification of a fault-masking and transient-recovery model for digital flight-control systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1991-01-01

    The formal specification and mechanically checked verification for a model of fault-masking and transient-recovery among the replicated computers of digital flight-control systems are presented. The verification establishes, subject to certain carefully stated assumptions, that faults among the component computers are masked so that commands sent to the actuators are the same as those that would be sent by a single computer that suffers no failures.

  13. Which Method of Assigning Bond Orders in Lewis Structures Best Reflects Experimental Data? An Analysis of the Octet Rule and Formal Charge Systems for Period 2 and 3 Nonmetallic Compounds

    ERIC Educational Resources Information Center

    See, Ronald F.

    2009-01-01

    Two systems were evaluated for drawing Lewis structures of period 2 and 3 non-metallic compounds: the octet rule and minimization of formal charge. The test set of molecules consisted of the oxides, halides, oxohalides, oxoanions, and oxoacids of B, N, O, F, Al, P, S, and Cl. Bond orders were quantified using experimental data, including bond…

  14. Sub-grid scale models for discontinuous Galerkin methods based on the Mori-Zwanzig formalism

    NASA Astrophysics Data System (ADS)

    Parish, Eric; Duraisamy, Karthik

    2017-11-01

    The optimal prediction framework of Chorin et al., which is a reformulation of the Mori-Zwanzig (M-Z) formalism of non-equilibrium statistical mechanics, provides a framework for the development of mathematically-derived closure models. The M-Z formalism provides a methodology to reformulate a high-dimensional Markovian dynamical system as a lower-dimensional, non-Markovian (non-local) system. In this lower-dimensional system, the effects of the unresolved scales on the resolved scales are non-local and appear as a convolution integral. The non-Markovian system is an exact statement of the original dynamics and is used as a starting point for model development. In this work, we investigate the development of M-Z-based closure models within the context of the Variational Multiscale Method (VMS). The method relies on a decomposition of the solution space into two orthogonal subspaces. The impact of the unresolved subspace on the resolved subspace is shown to be non-local in time and is modeled through the M-Z formalism. The models are applied to hierarchical discontinuous Galerkin discretizations. Commonalities between the M-Z closures and conventional flux schemes are explored. This work was supported in part by AFOSR under the project "LES Modeling of Non-local effects using Statistical Coarse-graining" with Dr. Jean-Luc Cambier as the technical monitor.
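
    Schematically (with notation chosen here, not taken from the paper), the reduced equation that the M-Z procedure yields for the resolved variables ũ has the form

      \frac{d\tilde{u}}{dt} \;=\; R(\tilde{u})
          \;+\; \int_0^{t} K\bigl(\tilde{u}(t-s),\, s\bigr)\, ds
          \;+\; F(t),

    where R is the Markovian term involving only the resolved scales, the convolution integral is the non-local memory term carrying the effect of the unresolved scales, and F is the orthogonal (fluctuating) term that closure models typically neglect or approximate.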

  15. Nurses' attitudes towards the reporting of violence in the emergency department.

    PubMed

    Hogarth, Kathryn M; Beattie, Jill; Morphet, Julia

    2016-05-01

    The incidence of workplace violence against nurses in emergency departments is underreported. Thus, the true nature and frequency of violent incidents remains unknown. It is therefore difficult to address the problem. To identify the attitudes, barriers and enablers of emergency nurses to the reporting of workplace violence. Using a phenomenological approach, two focus groups were conducted at a tertiary emergency department. The data were audio-recorded, transcribed verbatim and analysed using thematic analysis. Violent incidents in this emergency department were underreported. Nurses accepted violence as part of their normal working day, and therefore were less likely to report it. Violent incidents were not defined as 'violence' if no physical injury was sustained, therefore it was not reported. Nurses were also motivated to report formally in order to protect themselves from any possible future complaints made by perpetrators. The current formal reporting system was a major barrier to reporting because it was difficult and time consuming to use. Nurses reported violence using methods other than the designated reporting system. While emergency nurses do report violence, they do not use the formal reporting system. When they did use the formal reporting system they were motivated to do so in order to protect themselves. As a consequence of underreporting, the nature and extent of workplace violence remains unknown. Copyright © 2015 College of Emergency Nursing Australasia. Published by Elsevier Ltd. All rights reserved.

  16. Detecting modification of biomedical events using a deep parsing approach.

    PubMed

    Mackinlay, Andrew; Martinez, David; Baldwin, Timothy

    2012-04-30

    This work describes a system for identifying event mentions in bio-molecular research abstracts that are either speculative (e.g. analysis of IkappaBalpha phosphorylation, where it is not specified whether phosphorylation did or did not occur) or negated (e.g. inhibition of IkappaBalpha phosphorylation, where phosphorylation did not occur). The data comes from a standard dataset created for the BioNLP 2009 Shared Task. The system uses a machine-learning approach, where the features used for classification are a combination of shallow features derived from the words of the sentences and more complex features based on the semantic outputs produced by a deep parser. To detect event modification, we use a Maximum Entropy learner with features extracted from the data relative to the trigger words of the events. The shallow features are bag-of-words features based on a small sliding context window of 3-4 tokens on either side of the trigger word. The deep parser features are derived from parses produced by the English Resource Grammar and the RASP parser. The outputs of these parsers are converted into the Minimal Recursion Semantics formalism, and from this, we extract features motivated by linguistics and the data itself. All of these features are combined to create training or test data for the machine learning algorithm. Over the test data, our methods produce approximately a 4% absolute increase in F-score for detection of event modification compared to a baseline based only on the shallow bag-of-words features. Our results indicate that grammar-based techniques can enhance the accuracy of methods for detecting event modification.
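
    The shallow half of such a classifier can be sketched in a few lines of Python with scikit-learn (a maximum-entropy model is a multinomial logistic regression); the sentences, labels and window size below are toy illustrations, and the deep-parser (ERG/RASP/MRS) features described above are not reproduced.

      from sklearn.feature_extraction import DictVectorizer
      from sklearn.linear_model import LogisticRegression

      def window_features(tokens, trigger_idx, width=3):
          """Bag-of-words features from a +/- `width` token window around the event
          trigger (the shallow features only)."""
          lo, hi = max(0, trigger_idx - width), min(len(tokens), trigger_idx + width + 1)
          return {f"bow={tokens[i].lower()}": 1.0 for i in range(lo, hi) if i != trigger_idx}

      # Toy training data: (tokenized sentence, index of trigger word, label).
      examples = [
          ("inhibition of IkappaBalpha phosphorylation was observed".split(), 3, "negated"),
          ("analysis of IkappaBalpha phosphorylation remains unclear".split(), 3, "speculated"),
          ("IkappaBalpha phosphorylation was clearly detected".split(), 1, "none"),
          ("no phosphorylation of the substrate occurred".split(), 1, "negated"),
      ]
      vec = DictVectorizer()
      X = vec.fit_transform([window_features(t, i) for t, i, _ in examples])
      y = [label for _, _, label in examples]

      clf = LogisticRegression(max_iter=1000).fit(X, y)   # MaxEnt = multinomial logistic
      test = "inhibition of substrate phosphorylation".split()
      print(clf.predict(vec.transform([window_features(test, 3)])))  # likely "negated"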

  17. Adiabatic invariance with first integrals of motion

    NASA Astrophysics Data System (ADS)

    Adib, Artur B.

    2002-10-01

    The construction of a microthermodynamic formalism for isolated systems based on the concept of adiabatic invariance is an old but seldom appreciated effort in the literature, dating back at least to P. Hertz [Ann. Phys. (Leipzig) 33, 225 (1910)]. An apparently independent extension of such formalism for systems bearing additional first integrals of motion was recently proposed by Hans H. Rugh [Phys. Rev. E 64, 055101 (2001)], establishing the concept of adiabatic invariance even in such singular cases. After some remarks in connection with the formalism pioneered by Hertz, it will be suggested that such an extension can incidentally explain the success of a dynamical method for computing the entropy of classical interacting fluids, at least in some potential applications where the presence of additional first integrals cannot be ignored.

  18. Effect of formal specifications on program complexity and reliability: An experimental study

    NASA Technical Reports Server (NTRS)

    Goel, Amrit L.; Sahoo, Swarupa N.

    1990-01-01

    The results are presented of an experimental study undertaken to assess the improvement in program quality by using formal specifications. Specifications in the Z notation were developed for a simple but realistic antimissile system. These specifications were then used to develop 2 versions in C by 2 programmers. Another set of 3 versions in Ada were independently developed from informal specifications in English. A comparison of the reliability and complexity of the resulting programs suggests the advantages of using formal specifications in terms of number of errors detected and fault avoidance.

  19. Formal Analysis of Extended Well-Clear Boundaries for Unmanned Aircraft

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Narkawicz, Anthony

    2016-01-01

    This paper concerns the application of formal methods to the definition of a detect and avoid concept for unmanned aircraft systems (UAS). In particular, it illustrates how formal analysis was used to explain and correct unexpected behaviors of the logic that issues alerts when two aircraft are predicted not to be well clear from one another. As a result of this analysis, a recommendation was proposed to, and subsequently adopted by, the US standards organization that defines the minimum operational requirements for the UAS detect and avoid concept.

  20. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
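
    To make the MC/DC notion concrete: for every condition in a decision there must exist a pair of tests, differing only in that condition, that flip the decision outcome. The small Python sketch below enumerates such independence pairs for an illustrative decision a and (b or c); it is an explanatory example, not part of the coverage tooling proposed in the paper.

      from itertools import product

      def decision(a, b, c):
          return a and (b or c)

      # MC/DC: for every condition there must be a pair of test cases that differ
      # only in that condition and produce different decision outcomes.
      conditions = ["a", "b", "c"]
      pairs = {name: [] for name in conditions}
      for case in product([False, True], repeat=3):
          for i, name in enumerate(conditions):
              flipped = list(case)
              flipped[i] = not flipped[i]
              if decision(*case) != decision(*flipped):
                  pairs[name].append((case, tuple(flipped)))

      for name, found in pairs.items():
          print(f"condition {name!r}: {len(found)} independence pair(s), e.g. {found[0]}")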

  1. Achieving universal health care coverage: Current debates in Ghana on covering those outside the formal sector

    PubMed Central

    2012-01-01

    Background: Globally, extending financial protection and equitable access to health services to those outside formal sector employment is a major challenge for achieving universal coverage. While some favour contributory schemes, others have embraced tax-funded health service cover for those outside the formal sector. This paper critically examines the issue of how to cover those outside the formal sector through the lens of stakeholder views on the proposed one-time premium payment (OTPP) policy in Ghana. Discussion: Ghana in 2004 implemented a National Health Insurance Scheme (NHIS), based on a contributory model where service benefits are restricted to those who contribute (with some groups exempted from contributing), as the policy direction for moving towards universal coverage. In 2008, the OTPP system was proposed as an alternative way of ensuring coverage for those outside formal sector employment. There are divergent stakeholder views with regard to the meaning of the one-time premium and how it will be financed and sustained. Our stakeholder interviews indicate that the underlying issue being debated is whether the current contributory NHIS model for those outside the formal employment sector should be maintained or whether services for this group should be tax funded. However, the advantages and disadvantages of these alternatives are not being explored in an explicit or systematic way and are obscured by the considerable confusion about the likely design of the OTPP policy. We attempt to contribute to the broader debate about how best to fund coverage for those outside the formal sector by unpacking some of these issues and pointing to the empirical evidence needed to shed further light on appropriate funding mechanisms for universal health systems. Summary: The Ghanaian debate on OTPP relates to one of the most important challenges facing low- and middle-income countries seeking to achieve a universal health care system. It is critical that there is more extensive debate on the advantages and disadvantages of alternative funding mechanisms, supported by a solid evidence base, and with the policy objective of universal coverage providing the guiding light. PMID:23102454

  2. Formal Methods of V&V of Partial Specifications: An Experience Report

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety-critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe an experiment in the application of the method SCR to testing for consistency properties of a partial model of requirements for Fault Detection, Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem and deserves further study.

  3. The Markov process admits a consistent steady-state thermodynamic formalism

    NASA Astrophysics Data System (ADS)

    Peng, Liangrong; Zhu, Yi; Hong, Liu

    2018-01-01

    The search for a unified formulation for describing various non-equilibrium processes is a central task of modern non-equilibrium thermodynamics. In this paper, a novel steady-state thermodynamic formalism was established for general Markov processes described by the Chapman-Kolmogorov equation. Furthermore, the corresponding steady-state thermodynamic formalisms for the master equation and the Fokker-Planck equation were rigorously derived from it. To be concrete, we proved that (1) in the limit of continuous time, the steady-state thermodynamic formalism for the Chapman-Kolmogorov equation fully agrees with that for the master equation; (2) a similar one-to-one correspondence can be established rigorously between the master equation and the Fokker-Planck equation in the limit of large system size; (3) when a Markov process is restricted to one-step jumps, the steady-state thermodynamic formalism for the master equation with discrete state variables also goes over to that for the Fokker-Planck equation as the discretization step becomes smaller and smaller. Our analysis indicated that general Markov processes admit a unified and self-consistent non-equilibrium steady-state thermodynamic formalism, regardless of the underlying detailed models.
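
    For orientation, the continuous-time limit referred to above is governed by the standard master equation, and the steady-state bookkeeping rests on the usual split of the entropy change into a non-negative entropy production and an entropy flow. The sketch below uses generic textbook notation (transition rates $W_{ij}$ from state $j$ to state $i$, units with $k_B = 1$), not necessarily the paper's own notation:

      \[
      \frac{\mathrm{d}p_i}{\mathrm{d}t} = \sum_{j \neq i} \bigl( W_{ij}\,p_j - W_{ji}\,p_i \bigr),
      \qquad
      \frac{\mathrm{d}S}{\mathrm{d}t}
      = \underbrace{\tfrac{1}{2}\sum_{i,j} \bigl( W_{ij}p_j - W_{ji}p_i \bigr)\,
        \ln\frac{W_{ij}p_j}{W_{ji}p_i}}_{\text{entropy production}\;\ge\;0}
      \;-\;
      \underbrace{\tfrac{1}{2}\sum_{i,j} \bigl( W_{ij}p_j - W_{ji}p_i \bigr)\,
        \ln\frac{W_{ij}}{W_{ji}}}_{\text{entropy flow}} .
      \]

    At a non-equilibrium steady state the left-hand side vanishes and the entropy production is exactly balanced by the entropy flow.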

  4. Interacting hadron resonance gas model in the K-matrix formalism

    NASA Astrophysics Data System (ADS)

    Dash, Ashutosh; Samanta, Subhasis; Mohanty, Bedangadas

    2018-05-01

    An extension of the hadron resonance gas (HRG) model is constructed to include interactions using a relativistic virial expansion of the partition function. The noninteracting part of the expansion contains all the stable baryons and mesons, and the interacting part contains all the higher-mass resonances which decay into two stable hadrons. The virial coefficients are related to the phase shifts, which are calculated using the K-matrix formalism in the present work. We have calculated various thermodynamic quantities such as the pressure, energy density, and entropy density of the system. A comparison of thermodynamic quantities with the noninteracting HRG model, calculated using the same number of hadrons, shows that the results of the above formalism are larger. A good agreement between the equation of state calculated in the K-matrix formalism and lattice QCD simulations is observed. Specifically, the interaction measure calculated on the lattice is well described in our formalism. We have also calculated second-order fluctuations and correlations of conserved charges in the K-matrix formalism. We observe a good agreement of second-order fluctuations and the baryon-strangeness correlation with lattice data below the crossover temperature.
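
    In this family of approaches, the resonant (interacting) contribution is weighted by the energy derivative of the scattering phase shift rather than by a delta function at the nominal resonance mass. Schematically, in the Boltzmann approximation and with generic notation, a standard Beth-Uhlenbeck-type form (not an expression taken from the paper) reads:

      \[
      P_{\text{int}} \simeq \sum_{\text{channels}} \frac{T^2}{2\pi^2}\, e^{(\mu_1+\mu_2)/T}
      \int_{M_{\text{th}}}^{\infty} \mathrm{d}M \; M^2\, K_2\!\left(\frac{M}{T}\right)
      \sum_{\ell} g_\ell\, \frac{1}{\pi}\,\frac{\mathrm{d}\delta_\ell(M)}{\mathrm{d}M},
      \]

    where the phase shifts $\delta_\ell(M)$ are obtained from the K-matrix, $M_{\text{th}}$ is the two-hadron threshold, $g_\ell$ is the degeneracy of the partial wave, $\mu_1$ and $\mu_2$ are the chemical potentials of the two stable decay products, and $K_2$ is a modified Bessel function of the second kind.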

  5. General Formalism of Mass Scaling Approach for Replica-Exchange Molecular Dynamics and its Application

    NASA Astrophysics Data System (ADS)

    Nagai, Tetsuro

    2017-01-01

    Replica-exchange molecular dynamics (REMD) has demonstrated its efficiency by combining trajectories across a wide range of temperatures. As an extension of the method, the author formalizes the mass-manipulating replica-exchange molecular dynamics (MMREMD) method, which allows for arbitrary mass scaling with respect to temperature and individual particles. The formalism enables the versatile application of mass-scaling approaches to the REMD method. The key change introduced in the new formalism is the generalized rule for velocity and momentum scaling after accepted replica-exchange attempts. As an application of this general formalism, a refinement of the viscosity-REMD (V-REMD) method [P. H. Nguyen, J. Chem. Phys. 132, 144109 (2010)] is presented. Numerical results are provided for a pilot system, demonstrating the easier and better-optimized applicability of the new version of V-REMD as well as the importance of adherence to the generalized velocity-scaling rules. With the new formalism, more sound and efficient simulations can be performed.
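
    A minimal sketch of the kind of post-exchange rescaling at issue is given below, assuming the natural generalization of the usual sqrt(T_new/T_old) velocity-rescaling rule to replicas whose particle masses also differ: velocities are scaled so that a Maxwell-Boltzmann distribution at the old (temperature, mass) condition maps onto one at the new condition. The function name and the specific rule are illustrative assumptions, not the exact prescription derived in the paper.

      import numpy as np

      def rescale_velocities(v, m_old, m_new, T_old, T_new):
          """Per-particle velocity rescaling after an accepted replica exchange.

          v: (N, 3) velocities; m_old, m_new: (N,) masses before/after the swap;
          T_old, T_new: temperatures before/after the swap (consistent units).
          The factor is chosen so that (1/2) m v^2 / T is preserved per degree
          of freedom, i.e. the kinetic distribution stays canonical.
          """
          factor = np.sqrt((m_old * T_new) / (m_new * T_old))
          return v * factor[:, None]

      # Example: two particles moved from (T = 300, mass m) to (T = 350, mass 2m).
      rng = np.random.default_rng(0)
      v = rng.normal(size=(2, 3))
      print(rescale_velocities(v, np.array([1.0, 1.0]), np.array([2.0, 2.0]), 300.0, 350.0))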

  6. Tristania Sumatrana Effect On Female Mus Musculus Fertility

    NASA Astrophysics Data System (ADS)

    Syamsurizal, S.

    2018-04-01

    The use of traditional medicinal plants is generally based on empirical experience; a scientific approach is therefore necessary to bring traditional medicine into medical practice and the formal health services. Tristania sumatrana Miq. is one of the traditional medicinal plants often used as a contraceptive by women in West Sumatra, and its extract can prolong the estrous cycle in mice to eleven days. This study aimed to determine the influence of Tristania sumatrana Miq. extract treatment on the fertility of female mice. Experiments were conducted in a 5x2 randomized block design, with five dose groups (untreated control, placebo, and treatment at doses of 600, 900, and 1200 mg/kg bw) and two treatment durations (10 and 20 days). The fertility parameters studied were ovarian weight, number of Graafian follicles, number of corpora lutea, and number of live fetuses. The results show that Tristania sumatrana Miq. extract treatment causes a highly significant decrease in ovarian weight (900 mg/kg bw for 10 days), number of Graafian follicles (900 mg/kg bw for 20 days), number of corpora lutea (600 mg/kg bw for 10 days), and number of live fetuses (900 mg/kg bw for 10 days). Tristania sumatrana Miq. extract treatment can therefore lead to decreased fertility in female mice.

  7. Extension of Liouville Formalism to Postinstability Dynamics

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2003-01-01

    A mathematical formalism has been developed for predicting the postinstability motions of a dynamic system governed by a system of nonlinear equations and subject to initial conditions. Previously, there was no general method for prediction and mathematical modeling of postinstability behaviors (e.g., chaos and turbulence) in such a system. The formalism of nonlinear dynamics does not afford means to discriminate between stable and unstable motions: an additional stability analysis is necessary for such discrimination. However, an additional stability analysis does not suggest any modifications of a mathematical model that would enable the model to describe postinstability motions efficiently. The most important type of instability that necessitates a postinstability description is associated with positive Lyapunov exponents. Such an instability leads to exponential growth of small errors in initial conditions or, equivalently, exponential divergence of neighboring trajectories. The development of the present formalism was undertaken in an effort to remove positive Lyapunov exponents. The means chosen to accomplish this is coupling of the governing dynamical equations with the corresponding Liouville equation that describes the evolution of the flow of error probability. The underlying idea is to suppress the divergences of different trajectories that correspond to different initial conditions, without affecting a target trajectory, which is one that starts with prescribed initial conditions.
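
    Schematically, the coupling pairs the state equations with the Liouville equation for the probability density of the state (for example, of the error about the target trajectory). In generic notation, which is not necessarily that of the report:

      \[
      \dot{x}_i = f_i(x_1,\dots,x_n), \qquad
      \frac{\partial \rho}{\partial t}
        + \sum_{i=1}^{n} \frac{\partial}{\partial x_i}\bigl(\rho\, f_i\bigr) = 0 ,
      \]

    where $\rho(x,t)$ is transported by the same flow $f$. In the approach described above, a feedback from $\rho$ into the right-hand sides $f_i$ is introduced so that neighboring trajectories stop diverging exponentially (the positive Lyapunov exponents are removed) while the target trajectory itself is unaffected; the specific form of that feedback is the subject of the report and is not reproduced here.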

  8. Development of brain systems for nonsymbolic numerosity and the relationship to formal math academic achievement.

    PubMed

    Haist, Frank; Wazny, Jarnet H; Toomarian, Elizabeth; Adamo, Maha

    2015-02-01

    A central question in cognitive and educational neuroscience is whether brain operations supporting nonlinguistic intuitive number sense (numerosity) predict individual acquisition and academic achievement for symbolic or "formal" math knowledge. Here, we conducted a developmental functional magnetic resonance imaging (fMRI) study of nonsymbolic numerosity task performance in 44 participants, including 14 school-age children (6-12 years old), 14 adolescents (13-17 years old), and 16 adults, and compared a brain activity measure of numerosity precision to scores from the Woodcock-Johnson III Broad Math index of math academic achievement. Accuracy and reaction time from the numerosity task did not reliably predict formal math achievement. We found a significant positive developmental trend for improved numerosity precision in the parietal cortex and intraparietal sulcus specifically. Controlling for age and overall cognitive ability, we found a reliable positive relationship between individual math achievement scores and parietal lobe activity only in children. In addition, children showed robust positive relationships between math achievement and numerosity precision within ventral stream processing areas bilaterally. The pattern of results suggests a dynamic developmental trajectory for visual discrimination strategies that predict the acquisition of formal math knowledge. In adults, the efficiency of visual discrimination marked by numerosity acuity in ventral occipital-temporal cortex and hippocampus differentiated individuals with better or worse formal math achievement, respectively. Overall, these results suggest that two different brain systems for nonsymbolic numerosity acuity may contribute to individual differences in math achievement and that the contribution of these systems differs across development. © 2014 Wiley Periodicals, Inc.

  9. The mathematical bases for qualitative reasoning

    NASA Technical Reports Server (NTRS)

    Kalagnanam, Jayant; Simon, Herbert A.; Iwasaki, Yumi

    1991-01-01

    The practices of researchers in many fields who use qualitative reasoning are summarized and explained. The goal is to gain an understanding of the formal assumptions and mechanisms that underlie this kind of analysis. The explanations given are based on standard mathematical formalisms, particularly on ordinal properties, continuous differentiable functions, and the mathematics of nonlinear dynamic systems.

  10. Patterns In Contingencies: The Interlocking of Formal and Informal Political Institutions in Contemporary Argentina

    ERIC Educational Resources Information Center

    Llamazares, Ivan

    2005-01-01

    This article explores how the interlocking of formal and informal political institutions has affected the dynamics and performance of Argentine democracy. Key institutional features of the Argentine political system have been a competitive form of federalism, loosely structured political parties that are not ideologically unified,…

  11. Recognition of Prior Learning: The Participants' Perspective

    ERIC Educational Resources Information Center

    Miguel, Marta C.; Ornelas, José H.; Maroco, João P.

    2016-01-01

    The current narrative on lifelong learning goes beyond formal education and training, including learning at work, in the family and in the community. Recognition of prior learning is a process of evaluation of those skills and knowledge acquired through life experience, allowing them to be formally recognized by the qualification systems. It is a…

  12. Non-Formal Alternatives to Schooling: A Glossary of Educational Methods.

    ERIC Educational Resources Information Center

    Massachusetts Univ., Amherst. Center for International Education.

    This document describes activities in the field of nonformal education as an aid to educators as they develop programs to meet individual student needs. Advantages of nonformal education include that it is need-oriented, less expensive than formal systems, flexible, involves peer teaching, and does not encourage elitist feelings among students.…

  13. The Personnel Effectiveness Grid (PEG): A New Tool for Estimating Personnel Department Effectiveness

    ERIC Educational Resources Information Center

    Petersen, Donald J.; Malone, Robert L.

    1975-01-01

    Examines the difficulties inherent in attempting a formal personnel evaluation system, the major formal methods currently used for evaluating personnel department accountabilities, some parameters that should be part of a valid evaluation program, and a model for conducting the evaluation. (Available from Office of Publications, Graduate School of…

  14. A Formal Construction of Term Classes. Technical Report No. TR73-18.

    ERIC Educational Resources Information Center

    Yu, Clement T.

    The computational complexity of a formal process for the construction of term classes for information retrieval is examined. While the process is proven to be difficult computationally, heuristic methods are applied. Experimental results are obtained to illustrate the maximum possible improvement in system performance of retrieval using the formal…

  15. Framing the Adoption of Serious Games in Formal Education

    ERIC Educational Resources Information Center

    Arnab, Sylvester; Berta, Riccardo; Earp, Jeffrey; de Freitas, Sara; Popescu, Maria; Romero, Margarida; Stanescu, Ioana; Usart, Mireia

    2012-01-01

    Nowadays formal education systems are under increasing pressure to respond and adapt to rapid technological innovation and associated changes in the way we work and live. As well as accommodation of technology in its ever-diversifying forms, there is a fundamental need to enhance learning processes through evolution in pedagogical approaches, so…

  16. Developing Formal Object-oriented Requirements Specifications: A Model, Tool and Technique.

    ERIC Educational Resources Information Center

    Jackson, Robert B.; And Others

    1995-01-01

    Presents a formal object-oriented specification model (OSS) for computer software system development that is supported by a tool that automatically generates a prototype from an object-oriented analysis model (OSA) instance, lets the user examine the prototype, and permits the user to refine the OSA model instance to generate a requirements…

  17. Building ESD in Latin America

    ERIC Educational Resources Information Center

    Journal of Education for Sustainable Development, 2007

    2007-01-01

    To encourage efforts for furthering the UN DESD agenda in Latin America, a meeting titled "Building Education for Sustainable Development" was held in Costa Rica from 31 October to 2 November 2006. Plenary sessions were interspersed with working groups to look at how ESD can be integrated in formal and non-formal education systems, and…

  18. Peer Review of a Formal Verification/Design Proof Methodology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance-proving tools were reviewed. The technical issues related to proof methodologies were examined and summarized.

  19. Time-dependent boundary conditions for hyperbolic systems. II

    NASA Technical Reports Server (NTRS)

    Thompson, Kevin W.

    1990-01-01

    A general boundary condition formalism is developed for all types of boundary conditions to which hyperbolic systems are subject; the formalism makes possible a 'cookbook' approach to boundary conditions, by means of which novel boundary 'recipes' may be derived and previously devised ones may be consulted as required. Numerous useful conditions are derived for such CFD problems as subsonic and supersonic inflows and outflows, nonreflecting boundaries, force-free boundaries, constant pressure boundaries, and constant mass flux. Attention is given to the computation and integration of time derivatives.
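
    For orientation, the characteristic wave amplitudes that such boundary "recipes" manipulate take the following schematic form for the one-dimensional Euler equations (generic notation in the style of this family of characteristic boundary-condition methods, not necessarily the exact variables of the paper):

      \[
      \mathcal{L}_1 = (u - c)\left(\frac{\partial p}{\partial x} - \rho c\,\frac{\partial u}{\partial x}\right), \quad
      \mathcal{L}_2 = u\left(c^2\,\frac{\partial \rho}{\partial x} - \frac{\partial p}{\partial x}\right), \quad
      \mathcal{L}_3 = (u + c)\left(\frac{\partial p}{\partial x} + \rho c\,\frac{\partial u}{\partial x}\right),
      \]

    where $u$ is the velocity, $c$ the sound speed, $\rho$ the density, and $p$ the pressure. Outgoing amplitudes are evaluated from one-sided differences in the interior, and the boundary recipe prescribes the incoming ones: for example, a nonreflecting subsonic outflow sets the single incoming amplitude (here $\mathcal{L}_1$, traveling at speed $u - c$) to zero, while constant-pressure or constant-mass-flux boundaries replace that prescription with the corresponding relation among the amplitudes.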

  20. Time-dependent boundary conditions for hyperbolic systems. II

    NASA Astrophysics Data System (ADS)

    Thompson, Kevin W.

    1990-08-01

    A general boundary condition formalism is developed for all types of boundary conditions to which hyperbolic systems are subject; the formalism makes possible a 'cookbook' approach to boundary conditions, by means of which novel boundary 'recipes' may be derived and previously devised ones may be consulted as required. Numerous useful conditions are derived for such CFD problems as subsonic and supersonic inflows and outflows, nonreflecting boundaries, force-free boundaries, constant pressure boundaries, and constant mass flux. Attention is given to the computation and integration of time derivatives.
