Sample records for temporal logic specifications

  1. Specifying real-time systems with interval logic

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1988-01-01

    Pure temporal logic makes no reference to time. An interval temporal logic, and an extension of that logic that includes real-time constraints, are described. The application of the logic is demonstrated by giving a specification of the well-known lift (elevator) example, and it is shown how interval logic can be extended to include a notion of process. It is also described how the specification language and verification environment of EHDM could be enhanced to support this logic, and a specification of the alternating bit protocol in this extended version of the EHDM specification language is given.

  2. UML activity diagrams in requirements specification of logic controllers

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał

    2015-12-01

    Logic controller specification can be prepared using various techniques. One of them is the widely understood and user-friendly UML language and its activity diagrams. Using formal methods during the design phase increases the assurance that the implemented system meets the project requirements. In our approach we use the model checking technique to formally verify a specification against user-defined behavioral requirements. The properties are usually defined as temporal logic formulas. In the paper we propose to use UML activity diagrams in requirements definition and then to formalize them as temporal logic formulas. As a result, UML activity diagrams can be used both for logic controller specification and for requirements definition, which simplifies the specification and verification process.
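
    As a minimal illustration (the requirement and signal names here are hypothetical, not taken from the paper), a behavioral requirement read off an activity diagram, such as "every start request eventually activates the motor", becomes a temporal logic formula of the familiar response pattern:

```latex
% Hypothetical controller requirement as an LTL formula:
% globally, a start request implies the motor is eventually on.
\mathbf{G}\,\bigl(\mathit{start\_request} \rightarrow \mathbf{F}\,\mathit{motor\_on}\bigr)
```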

  3. Graded Alternating-Time Temporal Logic

    NASA Astrophysics Data System (ADS)

    Faella, Marco; Napoli, Margherita; Parente, Mimmo

    Graded modalities enrich the universal and existential quantifiers with the capability to express the concepts of at least k and all but k, for a non-negative integer k. Recently, temporal logics such as the μ-calculus and Computation Tree Logic (Ctl) augmented with graded modalities have received attention from the scientific community, from both a theoretical and an applicative perspective. Both the μ-calculus and Ctl naturally apply as specification languages for closed systems: in this paper, we add graded modalities to the Alternating-time Temporal Logic (Atl) introduced by Alur et al., to study how these modalities may affect specification languages for open systems.
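
    As an illustration of the idea (in standard graded-CTL notation, which may differ from the paper's own graded-ATL syntax), the graded existential path quantifier counts how many witnessing paths exist:

```latex
% Illustrative graded-CTL formula (an assumption, not from this paper):
% "there exist more than k distinct paths along which a fault
% eventually occurs".
\mathsf{E}^{>k}\,\mathbf{F}\,\mathit{fault}
```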

  4. An Overview of the Runtime Verification Tool Java PathExplorer

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We present an overview of the Java PathExplorer runtime verification tool, in short referred to as JPAX. JPAX can monitor the execution of a Java program and check that it conforms to a set of user-provided properties formulated in temporal logic. JPAX can in addition analyze the program for concurrency errors such as deadlocks and data races. The concurrency analysis requires no user-provided specification. The tool facilitates automated instrumentation of a program's bytecode, which when executed will emit an event stream, the execution trace, to an observer. The observer dispatches the incoming event stream to a set of observer processes, each performing a specialized analysis, such as the temporal logic verification, the deadlock analysis and the data race analysis. Temporal logic specifications can be formulated by the user in the Maude rewriting logic; Maude is a high-speed rewriting system for equational logic, here extended with executable temporal logic. The Maude rewriting engine is then activated as an event-driven monitoring process. Alternatively, temporal specifications can be translated into efficient automata, which check the event stream. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety critical systems.
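
    The observer idea described above can be sketched as follows (a minimal illustration, not JPAX itself; the event names and the property are hypothetical). An instrumented program emits an event stream, and a finite-state monitor checks each event against the safety requirement "no 'use' event may occur after a 'close' event":

```python
# Minimal observer sketch: a finite-state monitor consuming an event
# stream one event at a time. The property and event names are
# hypothetical, chosen only to illustrate the architecture.

def make_monitor():
    closed = False                      # monitor state
    def step(event):
        nonlocal closed
        if event == "close":
            closed = True
        if event == "use" and closed:   # property violated
            return "violation"
        return "ok"
    return step

monitor = make_monitor()
trace = ["open", "use", "use", "close", "use"]  # sample execution trace
verdicts = [monitor(e) for e in trace]
print(verdicts[-1])                     # -> violation
```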

  5. Temporal Specification and Verification of Real-Time Systems.

    DTIC Science & Technology

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  6. Making Temporal Logic Calculational: A Tool for Unification and Discovery

    NASA Astrophysics Data System (ADS)

    Boute, Raymond

    In temporal logic, calculational proofs beyond simple cases are often seen as challenging. The situation is reversed by making temporal logic calculational, yielding shorter and clearer proofs than traditional ones, and serving as a (mental) tool for unification and discovery. A side-effect of unifying theories is easier access by practitioners. The starting point is a simple generic (software tool independent) Functional Temporal Calculus (FTC). Specific temporal logics are then captured via endosemantic functions. This concept reflects tacit conventions throughout mathematics and, once identified, is general and useful. FTC also yields a reasoning style that helps discovering theorems by calculation rather than just proving given facts. This is illustrated by deriving various theorems, most related to liveness issues in TLA+, and finding strengthenings of known results. Educational issues are addressed in passing.

  7. Runtime Analysis of Linear Temporal Logic Specifications

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Havelund, Klaus

    2001-01-01

    This report presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Büchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
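
    The finite-trace setting that the translation targets can be made concrete with a small sketch (an illustration under our own encoding, not the paper's algorithm): LTL operators evaluated directly over a finite trace, represented here as a list of sets of atomic propositions:

```python
# Illustrative finite-trace LTL evaluator. Formulas are nested tuples;
# a trace is a list of sets of atomic propositions. The encoding and
# example property are our own, not the paper's.

def holds(f, trace, i=0):
    op = f[0]
    if op == "ap":                       # atomic proposition
        return i < len(trace) and f[1] in trace[i]
    if op == "not":
        return not holds(f[1], trace, i)
    if op == "and":
        return holds(f[1], trace, i) and holds(f[2], trace, i)
    if op == "X":                        # next (strong: needs a next state)
        return i + 1 < len(trace) and holds(f[1], trace, i + 1)
    if op == "F":                        # eventually (within the trace)
        return any(holds(f[1], trace, j) for j in range(i, len(trace)))
    if op == "G":                        # always (over the finite suffix)
        return all(holds(f[1], trace, j) for j in range(i, len(trace)))
    if op == "U":                        # until
        return any(holds(f[2], trace, j) and
                   all(holds(f[1], trace, k) for k in range(i, j))
                   for j in range(i, len(trace)))
    raise ValueError(op)

# G(req -> F ack): every request is eventually acknowledged.
req_ack = ("G", ("not", ("and", ("ap", "req"),
                         ("not", ("F", ("ap", "ack"))))))
trace = [{"req"}, set(), {"ack"}, {"req"}, {"ack"}]
print(holds(req_ack, trace))   # -> True
```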

  8. Automatic Methods and Tools for the Verification of Real Time Systems

    DTIC Science & Technology

    1997-07-31

    real-time systems. This was accomplished by extending techniques, based on automata theory and temporal logic, that have been successful for the verification of time-independent reactive systems. As system specification language for embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous environment variables. As requirements specification languages, we introduced temporal logics with clock variables for expressing timing constraints.

  9. Fuzzy branching temporal logic.

    PubMed

    Moon, Seong-ick; Lee, Kwang H; Lee, Doheon

    2004-04-01

    Intelligent systems require a systematic way to represent and handle temporal information containing uncertainty. In particular, a logical framework is needed that can represent uncertain temporal information and its relationships with logical formulae. Fuzzy linear temporal logic (FLTL), a generalization of propositional linear temporal logic (PLTL) with fuzzy temporal events and fuzzy temporal states defined on a linear time model, was previously proposed for this purpose. However, many systems are best represented by branching time models in which each state can have more than one possible future path. In this paper, fuzzy branching temporal logic (FBTL) is proposed to address this problem. FBTL adopts and generalizes computation tree logic (CTL*), which is a classical branching temporal logic. The temporal model of FBTL is capable of representing fuzzy temporal events and fuzzy temporal states, and the order relation among them is represented as a directed graph. The utility of FBTL is demonstrated using a fuzzy job shop scheduling problem as an example.

  10. Temporal logics meet telerobotics

    NASA Technical Reports Server (NTRS)

    Rutten, Eric; Marce, Lionel

    1989-01-01

    The specificity of telerobotics is the presence of a human operator, so decision assistance tools are necessary for the operator, especially in hostile environments. Execution hazards arise from a degraded ability to recover quickly and efficiently from unexpected dangerous situations; it is therefore important to be able, among other things, to simulate the possible consequences of a plan before its actual execution, in order to detect such problematic situations. Hence the idea of providing the operator with a simulator enabling him to verify the temporal and logical coherence of his plans. To this end, the power of logical formalisms is used for representation and deduction purposes. Starting from the class of situations to be represented, a STRIPS (the STanford Research Institute Problem Solver)-like formalism and its underlying logic are adapted to the simulation of plans of actions in time. The choice of a temporal logic makes it possible to build a world representation on which the effects of plans, grouping actions into control structures, are transcribed by the simulation, resulting in a verdict and information about the plan's coherence.

  11. Specification and Verification of Web Applications in Rewriting Logic

    NASA Astrophysics Data System (ADS)

    Alpuente, María; Ballis, Demis; Romero, Daniel

    This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communicating protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.

  12. Temporal-logic analysis of microglial phenotypic conversion with exposure to amyloid-β.

    PubMed

    Anastasio, Thomas J

    2015-02-01

    Alzheimer Disease (AD) remains a leading killer with no adequate treatment. Ongoing research increasingly implicates the brain's immune system as a critical contributor to AD pathogenesis, but the complexity of the immune contribution poses a barrier to understanding. Here I use temporal logic to analyze a computational specification of the immune component of AD. Temporal logic is an extension of logic to propositions expressed in terms of time. It has traditionally been used to analyze computational specifications of complex engineered systems, but applications to complex biological systems are now appearing. The inflammatory component of AD involves the responses of microglia to the peptide amyloid-β (Aβ), which is an inflammatory stimulus and a likely causative AD agent. Temporal-logic analysis of the model provides explanations for the puzzling findings that Aβ induces an anti-inflammatory as well as a pro-inflammatory response, and that Aβ is phagocytized by microglia in young but not in old animals. To potentially explain the first puzzle, the model suggests that interferon-γ acts as an "autocrine bridge" over which an Aβ-induced increase in pro-inflammatory cytokines leads to an increase in anti-inflammatory mediators also. To potentially explain the second puzzle, the model identifies a potential instability in signaling via insulin-like growth factor 1 that could explain the failure of old microglia to phagocytize Aβ. The model predicts that augmentation of insulin-like growth factor 1 signaling, and activation of protein kinase C in particular, could move old microglia from a neurotoxic back toward a more neuroprotective and phagocytic phenotype.
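
    For instance (an illustrative formula, not one from the paper's actual specification), the first finding could be phrased as a branching-time property of the model:

```latex
% Hypothetical CTL-style property: whenever A-beta is present, the
% model eventually exhibits both pro- and anti-inflammatory mediators.
\mathbf{AG}\,\bigl(\mathit{A\beta} \rightarrow
  \mathbf{AF}\,(\mathit{proInflam} \wedge \mathit{antiInflam})\bigr)
```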

  13. An interval logic for higher-level temporal reasoning

    NASA Technical Reports Server (NTRS)

    Schwartz, R. L.; Melliar-Smith, P. M.; Vogt, F. H.; Plaisted, D. A.

    1983-01-01

    Prior work explored temporal logics, based on classical modal logics, as a framework for specifying and reasoning about concurrent programs, distributed systems, and communications protocols, and reported on efforts using temporal reasoning primitives to express very high level abstract requirements that a program or system is to satisfy. Based on experience with those primitives, this report describes an Interval Logic that is more suitable for expressing such higher level temporal properties. The report provides a formal semantics for the Interval Logic, and several examples of its use. A description of decision procedures for the logic is also included.

  14. EAGLE can do Efficient LTL Monitoring

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2003-01-01

    We briefly present a rule-based framework, called EAGLE, that has been shown to be capable of defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time logics, interval logics, forms of quantified temporal logics, and so on. In this paper we show how EAGLE can do linear temporal logic (LTL) monitoring in an efficient way. We give an upper bound on the space and time complexity of this monitoring.

  15. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA's 2011 Mars mission, MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.

  16. Sensitivity and specificity of memory and naming tests for identifying left temporal-lobe epilepsy.

    PubMed

    Umfleet, Laura Glass; Janecek, Julie K; Quasney, Erin; Sabsevitz, David S; Ryan, Joseph J; Binder, Jeffrey R; Swanson, Sara J

    2015-01-01

    The sensitivity and specificity of the Selective Reminding Test (SRT) Delayed Recall, Wechsler Memory Scale (WMS) Logical Memory, the Boston Naming Test (BNT), and two nonverbal memory measures for detecting lateralized dysfunction in association with side of seizure focus were examined in a sample of 143 patients with left or right temporal-lobe epilepsy (TLE). Scores on the SRT and BNT were statistically significantly lower in the left TLE group compared with the right TLE group, whereas no group differences emerged on the Logical Memory subtest. No significant group differences were found with nonverbal memory measures. When the SRT and BNT were both entered as predictors in a logistic regression, the BNT, although significant, added minimal value to the model beyond the variance accounted for by the SRT Delayed Recall. Both variables emerged as significant predictors of side of seizure focus when entered into separate regressions. Sensitivity and specificity of the SRT and BNT ranged from 56% to 65%. The WMS Logical Memory and nonverbal memory measures were not significant predictors of the side of seizure focus.

  17. Temporal transcriptional logic of dynamic regulatory networks underlying nitrogen signaling and use in plants.

    PubMed

    Varala, Kranthi; Marshall-Colón, Amy; Cirrone, Jacopo; Brooks, Matthew D; Pasquino, Angelo V; Léran, Sophie; Mittal, Shipra; Rock, Tara M; Edwards, Molly B; Kim, Grace J; Ruffel, Sandrine; McCombie, W Richard; Shasha, Dennis; Coruzzi, Gloria M

    2018-06-19

    This study exploits time, the relatively unexplored fourth dimension of gene regulatory networks (GRNs), to learn the temporal transcriptional logic underlying dynamic nitrogen (N) signaling in plants. Our "just-in-time" analysis of time-series transcriptome data uncovered a temporal cascade of cis elements underlying dynamic N signaling. To infer transcription factor (TF)-target edges in a GRN, we applied a time-based machine learning method to 2,174 dynamic N-responsive genes. We experimentally determined a network precision cutoff, using TF-regulated genome-wide targets of three TF hubs (CRF4, SNZ, and CDF1), used to "prune" the network to 155 TFs and 608 targets. This network precision was reconfirmed using genome-wide TF-target regulation data for four additional TFs (TGA1, HHO5/6, and PHL1) not used in network pruning. These higher-confidence edges in the GRN were further filtered by independent TF-target binding data, used to calculate a TF "N-specificity" index. This refined GRN identifies the temporal relationship of known/validated regulators of N signaling (NLP7/8, TGA1/4, NAC4, HRS1, and LBD37/38/39) and 146 additional regulators. Six TFs (CRF4, SNZ, CDF1, HHO5/6, and PHL1) validated herein regulate a significant number of genes in the dynamic N response, targeting 54% of N-uptake/assimilation pathway genes. Phenotypically, inducible overexpression of CRF4 in planta regulates genes resulting in altered biomass, root development, and ¹⁵NO₃⁻ uptake, specifically under low-N conditions. This dynamic N-signaling GRN now provides the temporal "transcriptional logic" for 155 candidate TFs to improve nitrogen use efficiency with potential agricultural applications. Broadly, these time-based approaches can uncover the temporal transcriptional logic for any biological response system in biology, agriculture, or medicine. Copyright © 2018 the Author(s). Published by PNAS.

  18. Automata-Based Verification of Temporal Properties on Running Programs

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Havelund, Klaus; Lan, Sonie (Technical Monitor)

    2001-01-01

    This paper presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Büchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.

  19. The Temporal Logic of the Tower Chief System

    NASA Technical Reports Server (NTRS)

    Hazelton, Lyman R., Jr.

    1990-01-01

    The purpose is to describe the logic used in the reasoning scheme employed in the Tower Chief system, a runway configuration management system. First, a review of classical logic is given. Defensible logics, truth maintenance, default logic, temporally dependent propositions, and resource allocation and planning are discussed.

  20. Application of temporal LNC logic in artificial intelligence

    NASA Astrophysics Data System (ADS)

    Adamek, Marek; Mulawka, Jan

    2016-09-01

    This paper presents a temporal logic inference engine developed at our university. It is an attempt to demonstrate the implementation and practical application of the temporal logic LNC developed at Cardinal Stefan Wyszynski University in Warsaw. The paper describes the fundamentals of LNC logic and the architecture and implementation of the inference engine. The practical application is shown by providing a solution, in terms of LNC logic, for the Missionaries and Cannibals problem, popular in Artificial Intelligence. Both the problem formulation and the inference engine are described in detail.

  1. Rule-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2003-01-01

    We present a rule-based framework for defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time logics, interval logics, forms of quantified temporal logics, and so on. Our logic, EAGLE, is implemented as a Java library and involves novel techniques for rule definition, manipulation and execution. Monitoring is done on a state-by-state basis, without storing the execution trace.
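
    State-by-state monitoring without storing the trace can be sketched with the standard formula-progression (rewriting) technique (a simplified illustration; EAGLE's actual rule mechanism is richer): each new state rewrites the pending formula into the obligation that the rest of the trace must satisfy:

```python
# Formula-progression sketch: LTL monitoring one state at a time.
# The encoding and property are our own illustration, not EAGLE's.

TRUE, FALSE = ("true",), ("false",)

def progress(f, state):
    """Rewrite formula f against one state; the result is the
    obligation the remainder of the trace must satisfy."""
    op = f[0]
    if op in ("true", "false"):
        return f
    if op == "ap":
        return TRUE if f[1] in state else FALSE
    if op == "not":
        g = progress(f[1], state)
        return FALSE if g == TRUE else TRUE if g == FALSE else ("not", g)
    if op == "and":
        a, b = progress(f[1], state), progress(f[2], state)
        if FALSE in (a, b): return FALSE
        if a == TRUE: return b
        if b == TRUE: return a
        return ("and", a, b)
    if op == "or":
        a, b = progress(f[1], state), progress(f[2], state)
        if TRUE in (a, b): return TRUE
        if a == FALSE: return b
        if b == FALSE: return a
        return ("or", a, b)
    if op == "X":                      # defer to the next state
        return f[1]
    if op == "G":                      # G p  ==  p and X G p
        return progress(("and", f[1], ("X", f)), state)
    if op == "F":                      # F p  ==  p or X F p
        return progress(("or", f[1], ("X", f)), state)
    raise ValueError(op)

# Monitor G(not err): the obligation survives until an 'err' state.
f = ("G", ("not", ("ap", "err")))
for state in [{"ok"}, {"ok"}, {"err"}]:
    f = progress(f, state)
print(f == FALSE)   # -> True: the property was violated
```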

  2. Temporal logics and real time expert systems.

    PubMed

    Blom, J A

    1996-10-01

    This paper introduces temporal logics. Due to the eternal compromise between expressive adequacy and reasoning efficiency that must be decided upon in any application, full (first-order logic or modal logic based) temporal logics are frequently not suitable. This is especially true in real-time expert systems, where a fixed (and usually small) response time must be guaranteed. One such expert system, Fagan's VM, is reviewed, and a delineation is given of how to formally describe and reason with time in medical protocols. It is shown that Petri net theory is a useful tool to check the correctness of formalised protocols.

  3. [Processes of logical thought in a case of cerebral vascular lesion].

    PubMed

    Blanco Menéndez, R; Aguado Balsas, A M

    Reasoning and logical thought processes have traditionally been attributed to frontal lobe function or, on the other hand, have been considered diffuse functions of the brain. Today, however, there is sufficient evidence of dissociations in thought processes, depending on the logical structure of the experimental tasks and involving different areas of the brain, frontal as well as post-rolandic. The aim was to study possible dissociations between thought structures corresponding to categorical and relational logic, on the one hand, and propositional logic on the other. The case of a brain-injured patient with vascular etiology, localized in left fronto-parietal cortex, is presented. A specific battery of reasoning tests was administered. A differential performance on some experimental reasoning tasks was found, depending on such logical conceptual structures. The possibility of establishing dissociations among certain logical thought and intellectual functions depending on the localization of the brain lesion (frontal versus temporal) is discussed.

  4. Jeagle: a JAVA Runtime Verification Tool

    NASA Technical Reports Server (NTRS)

    DAmorim, Marcelo; Havelund, Klaus

    2005-01-01

    We introduce the temporal logic Jeagle and its supporting tool for runtime verification of Java programs. A monitor for a Jeagle formula checks whether a finite trace of program events satisfies the formula. Jeagle is a programming-oriented extension of the powerful rule-based Eagle logic, which has been shown to be capable of defining and implementing a range of finite trace monitoring logics, including future and past time temporal logic, real-time and metric temporal logics, interval logics, forms of quantified temporal logics, and so on. Monitoring is achieved on a state-by-state basis, avoiding any need to store the input trace. Jeagle extends Eagle with constructs for capturing parameterized program events such as method calls and method returns. Parameters can be the objects that methods are called upon, arguments to methods, and return values. Jeagle allows one to refer to these in formulas. The tool performs automated program instrumentation using AspectJ. We show the transformational semantics of Jeagle.

  5. Program Monitoring with LTL in EAGLE

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2004-01-01

    We briefly present a rule-based framework called EAGLE, shown to be capable of defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time and metric temporal logics (MTL), interval logics, forms of quantified temporal logics, and so on. In this paper we focus on a linear temporal logic (LTL) specialization of EAGLE. For an initial formula of size m, we establish upper bounds of O(m^2 * 2^m * log m) and O(m^4 * 2^(2m) * log^2 m) for the space and time complexity, respectively, of single-step evaluation over an input trace. This is close to the lower bound O(2^sqrt(m)) known for future-time LTL. EAGLE has been successfully used, in both LTL and metric LTL forms, to test a real-time controller of an experimental NASA planetary rover.

  6. Neuronal cell fate specification by the molecular convergence of different spatio-temporal cues on a common initiator terminal selector gene

    PubMed Central

    Stratmann, Johannes

    2017-01-01

    The extensive genetic regulatory flows underlying specification of different neuronal subtypes are not well understood at the molecular level. The Nplp1 neuropeptide neurons in the developing Drosophila nerve cord belong to two sub-classes: Tv1 and dAp neurons, generated by two distinct progenitors. Nplp1 neurons are specified by spatial cues (the Hox homeotic network and the GATA factor grn) and temporal cues (the hb -> Kr -> Pdm -> cas -> grh temporal cascade). These spatio-temporal cues combine into two distinct codes, one for Tv1 and one for dAp neurons, that activate a common terminal selector feedforward cascade of col -> ap/eya -> dimm -> Nplp1. Here, we molecularly decode the specification of Nplp1 neurons, and find that the cis-regulatory organization of col functions as an integratory node for the different spatio-temporal combinatorial codes. These findings may provide a logical framework for addressing spatio-temporal control of neuronal sub-type specification in other systems. PMID:28414802

  7. Model Checking Temporal Logic Formulas Using Sticker Automata

    PubMed Central

    Feng, Changwei; Wu, Huanmei

    2017-01-01

    As an important complex problem, the temporal logic model checking problem is still far from being fully resolved under the circumstance of DNA computing, especially for Computation Tree Logic (CTL), Interval Temporal Logic (ITL), and Projection Temporal Logic (PTL), because there is still a lack of approaches for DNA model checking. To address this challenge, a model checking method is proposed for checking the basic formulas in the above three temporal logic types with DNA molecules. First, one type of single-stranded DNA molecule is employed to encode the Finite State Automaton (FSA) model of the given basic formula so that a sticker automaton is obtained. On the other hand, other single-stranded DNA molecules are employed to encode the given system model so that the input strings of the sticker automaton are obtained. Next, a series of biochemical reactions are conducted between the above two types of single-stranded DNA molecules. It can then be decided whether the system satisfies the formula or not. As a result, we have developed a DNA-based approach for checking all the basic formulas of CTL, ITL, and PTL. The simulated results demonstrate the effectiveness of the new method. PMID:29119114
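
    The decision at the core of the encoding, automaton acceptance of an input string, can be sketched in software (a hypothetical example of our own, not the DNA procedure itself): an FSA for the basic formula "F p" over finite words, where each letter records whether p holds ('1') or not ('0'):

```python
# Acceptance check for a deterministic finite automaton; the automaton
# below recognizes words in which 'p' (letter '1') occurs at least once,
# i.e. the finite-trace reading of "F p". Names are illustrative.

def accepts(transitions, start, accepting, word):
    state = start
    for letter in word:
        state = transitions[(state, letter)]
    return state in accepting

# States: q0 = "p not yet seen", q1 = "p seen" (absorbing).
fsa = {("q0", "0"): "q0", ("q0", "1"): "q1",
       ("q1", "0"): "q1", ("q1", "1"): "q1"}

print(accepts(fsa, "q0", {"q1"}, "0010"))   # -> True  (p occurs)
print(accepts(fsa, "q0", {"q1"}, "0000"))   # -> False (p never occurs)
```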

  8. Temporal abstraction and inductive logic programming for arrhythmia recognition from electrocardiograms.

    PubMed

    Carrault, G; Cordier, M-O; Quiniou, R; Wang, F

    2003-07-01

    This paper proposes a novel approach to cardiac arrhythmia recognition from electrocardiograms (ECGs). ECGs record the electrical activity of the heart and are used to diagnose many heart disorders. The numerical ECG is first temporally abstracted into a series of time-stamped events. Temporal abstraction makes use of artificial neural networks to extract interesting waves and their features from the input signals. A temporal reasoner called a chronicle recogniser processes such series in order to discover temporal patterns, called chronicles, which can be related to cardiac arrhythmias. Generally, it is difficult to elicit an accurate set of chronicles from a doctor. Thus, we propose to learn automatically, from symbolic ECG examples, the chronicles discriminating the arrhythmias belonging to some specific subset. Since temporal relationships are of major importance, inductive logic programming (ILP) is the tool of choice as it enables first-order relational learning. The approach has been evaluated on real ECGs taken from the MIT-BIH database. The performance of the different modules as well as the efficiency of the whole system are presented. The results are rather good and demonstrate that integrating numerical techniques for low-level perception and symbolic techniques for high-level classification is very valuable.

  9. Monitoring Java Programs with Java PathExplorer

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2001-01-01

    We present recent work on the development of Java PathExplorer (JPAX), a tool for monitoring the execution of Java programs. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety critical systems. The tool facilitates automated instrumentation of a program's bytecode, which will then emit events to an observer during its execution. The observer checks the events against user-provided high level requirement specifications, for example temporal logic formulae, and against lower level error detection procedures, for example concurrency-related algorithms such as deadlock and data race detection. High level requirement specifications, together with their underlying logics, are defined in the Maude rewriting logic, and can then either be directly checked using the Maude rewriting engine, or be first translated to efficient data structures and then checked in Java.

  10. Analyzing Phylogenetic Trees with Timed and Probabilistic Model Checking: The Lactose Persistence Case Study.

    PubMed

    Requeno, José Ignacio; Colom, José Manuel

    2014-12-01

    Model checking is a generic verification technique that allows the phylogeneticist to focus on models and specifications instead of on implementation issues. Phylogenetic trees are considered as transition systems over which we interrogate phylogenetic questions written as formulas of temporal logic. Nonetheless, standard logics become insufficient for certain practices of phylogenetic analysis since they do not allow the inclusion of explicit time and probabilities. The aim of this paper is to extend the application of model checking techniques beyond qualitative phylogenetic properties and adapt the existing logical extensions and tools to the field of phylogeny. The introduction of time and probabilities in phylogenetic specifications is motivated by the study of a real example: the analysis of the ratio of lactose intolerance in some populations and the date of appearance of this phenotype.
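The tree-as-transition-system view can be illustrated on a toy scale (a Python sketch, not the model-checking tool chain used in the paper): a branching-time property such as "every lineage descending from this ancestor eventually acquires the lactase-persistence trait LP" reduces to a recursive check over the tree, essentially CTL's AF operator.

```python
# Toy phylogenetic check (illustrative names; 'LP' = lactase persistence).
# tree maps node -> (traits, children); the property holds iff every
# root-to-leaf path from `node` passes through a node carrying `trait`.

def all_paths_eventually(tree, node, trait):
    traits, children = tree[node]
    if trait in traits:
        return True          # trait fixed on this path
    if not children:
        return False         # leaf reached without the trait
    return all(all_paths_eventually(tree, c, trait) for c in children)

tree = {
    "anc":  (set(), ["eu", "asia"]),
    "eu":   ({"LP"}, []),
    "asia": (set(), ["han"]),
    "han":  ({"LP"}, []),
}
```

Timed and probabilistic extensions, as the paper argues, then amount to annotating the transitions with dates and probabilities rather than changing this basic recursion.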

  11. Analyzing phylogenetic trees with timed and probabilistic model checking: the lactose persistence case study.

    PubMed

    Requeno, José Ignacio; Colom, José Manuel

    2014-10-23

    Model checking is a generic verification technique that allows the phylogeneticist to focus on models and specifications instead of on implementation issues. Phylogenetic trees are considered as transition systems over which we interrogate phylogenetic questions written as formulas of temporal logic. Nonetheless, standard logics become insufficient for certain practices of phylogenetic analysis since they do not allow the inclusion of explicit time and probabilities. The aim of this paper is to extend the application of model checking techniques beyond qualitative phylogenetic properties and adapt the existing logical extensions and tools to the field of phylogeny. The introduction of time and probabilities in phylogenetic specifications is motivated by the study of a real example: the analysis of the ratio of lactose intolerance in some populations and the date of appearance of this phenotype.

  12. A Linguistic Truth-Valued Temporal Reasoning Formalism and Its Implementation

    NASA Astrophysics Data System (ADS)

    Lu, Zhirui; Liu, Jun; Augusto, Juan C.; Wang, Hui

    Temporality and uncertainty are important features of many real world systems. Solving problems in such systems requires the use of formal mechanisms such as logic systems, statistical methods or other reasoning and decision-making methods. In this paper, we propose a linguistic truth-valued temporal reasoning formalism to enable the management of both features concurrently, using a linguistic truth-valued logic and a temporal logic. We also provide a backward reasoning algorithm which allows the answering of user queries. A simple but realistic scenario in a smart home application is used to illustrate our work.

  13. UTP and Temporal Logic Model Checking

    NASA Astrophysics Data System (ADS)

    Anderson, Hugh; Ciobanu, Gabriel; Freitas, Leo

    In this paper we give an additional perspective to the formal verification of programs through temporal logic model checking, which uses Hoare and He's Unifying Theories of Programming (UTP). Our perspective emphasizes the use of UTP designs, an alphabetised relational calculus expressed as a pre/post condition pair of relations, to verify state or temporal assertions about programs. The temporal model checking relation is derived from a satisfaction relation between the model and its properties. The contribution of this paper is that it shows a UTP perspective on temporal logic model checking. The approach includes the notion of efficiency found in traditional model checkers, which reduce the state explosion problem through the use of efficient data structures.

  14. EAGLE Monitors by Collecting Facts and Generating Obligations

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2003-01-01

    We present a rule-based framework, called EAGLE, that has been shown to be capable of defining and implementing a range of finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time and metric temporal logics, interval logics, forms of quantified temporal logics, and so on. A monitor for an EAGLE formula checks if a finite trace of states satisfies the given formula. We present, in detail, an algorithm for the synthesis of monitors for EAGLE. The algorithm is implemented as a Java application and involves novel techniques for rule definition, manipulation and execution. Monitoring is achieved on a state-by-state basis, avoiding any need to store the input trace of states. Our initial experiments have been successful, as EAGLE detected a previously unknown bug while testing a planetary rover controller.
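One way to picture "collecting facts and generating obligations" (a simplified Python sketch with invented event names, not EAGLE's actual rule syntax) is a metric-temporal rule such as "every 'req' is answered by an 'ack' within 3 steps": each 'req' creates an obligation carrying a deadline, which is later discharged or expires.

```python
# Obligation-based monitoring sketch (illustrative, not EAGLE itself).
# Property: every 'req' must be followed by an 'ack' within `window` steps.

def monitor(trace, window=3):
    obligations = []                       # remaining steps per pending 'req'
    for event in trace:
        obligations = [d - 1 for d in obligations]
        if any(d < 0 for d in obligations):
            return False                   # an obligation expired unmet
        if event == "ack":
            obligations = []               # discharge all pending obligations
        elif event == "req":
            obligations.append(window)
    return not obligations                 # nothing may remain pending at the end

ok = monitor(["req", "x", "ack", "req", "x", "x", "ack"])
bad = monitor(["req", "x", "x", "x", "x"])
```

As in the paper's approach, the verdict is maintained state by state; only the set of live obligations is kept, never the trace itself.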

  15. A Strategy for Efficiently Verifying Requirements Specifications Using Composition and Invariants

    DTIC Science & Technology

    2003-09-05

    Colle-sur-Loup, France, Oct. 1984. Springer-Verlag. [34] J. Ramish. Empirical studies of compositional abstraction. Technical report, Naval Research... global to modular temporal reasoning about programs. In K. R. Apt, editor, Proc. NATO Adv. Study Inst. on Logics and Models of Concurrent Systems, La

  16. Nonvolatile reconfigurable sequential logic in a HfO2 resistive random access memory array.

    PubMed

    Zhou, Ya-Xiong; Li, Yi; Su, Yu-Ting; Wang, Zhuo-Rui; Shih, Ling-Yi; Chang, Ting-Chang; Chang, Kuan-Chang; Long, Shi-Bing; Sze, Simon M; Miao, Xiang-Shui

    2017-05-25

    Resistive random access memory (RRAM) based reconfigurable logic provides a temporal programmable dimension to realize Boolean logic functions and is regarded as a promising route to build non-von Neumann computing architectures. In this work, a reconfigurable operation method is proposed to perform nonvolatile sequential logic in a HfO2-based RRAM array. Eight kinds of Boolean logic functions can be implemented within the same hardware fabrics. During the logic computing processes, the RRAM devices in an array are flexibly configured in a bipolar or complementary structure. The validity was demonstrated by experimentally implemented NAND and XOR logic functions and a theoretically designed 1-bit full adder. With the trade-off between temporal and spatial computing complexity, our method makes better use of limited computing resources, and thus provides an attractive scheme for the construction of logic-in-memory systems.
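The claim that many Boolean functions fit in one fabric rests on having a functionally complete primitive. As a generic stateful-logic illustration (not the paper's specific HfO2 array scheme), material implication, IMP(a, b) = (NOT a) OR b, is computable in one step on resistive devices, and NAND, which is universal, follows from two IMP steps on a working device initialized to 0:

```python
# Generic stateful-logic sketch: NAND from two material-implication steps.
# (Illustrative only; the paper's bipolar/complementary scheme differs.)

def IMP(a, b):
    return (not a) or b

def nand(p, q):
    s = False          # initialize the working device to logic 0
    s = IMP(p, s)      # s becomes NOT p
    s = IMP(q, s)      # s becomes (NOT q) OR (NOT p) = NAND(p, q)
    return s

table = {(p, q): nand(p, q) for p in (False, True) for q in (False, True)}
```

Since NAND is functionally complete, any of the eight two-input Boolean functions the abstract mentions can in principle be composed from such steps, trading sequential (temporal) steps for spatial device count.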

  17. Model checking for linear temporal logic: An efficient implementation

    NASA Technical Reports Server (NTRS)

    Sherman, Rivi; Pnueli, Amir

    1990-01-01

    This report provides evidence to support the claim that model checking for linear temporal logic (LTL) is practically efficient. Two implementations of a linear temporal logic model checker are described. One is based on transforming the model checking problem into a satisfiability problem; the other checks an LTL formula for a finite model by computing the cross-product of the finite state transition graph of the program with a structure containing all possible models for the property. An experiment was performed with a set of mutual exclusion algorithms, testing safety and liveness under fairness for these algorithms.
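The cross-product construction can be illustrated on a toy scale (a Python sketch with an invented locking example, not the paper's implementation): a safety property is compiled into a small monitor automaton, and the product of the program's state graph with the monitor is searched for an error state.

```python
from collections import deque

# Property: "no second 'lock' before an 'unlock'", as a 3-state monitor;
# reaching monitor state 'bad' means the property is violated.
def monitor(m, action):
    if m == "free":
        return "held" if action == "lock" else "free"
    return "bad" if action == "lock" else "free"

def violates(trans, init_state):
    """Breadth-first search of the product graph for a 'bad' state."""
    seen, queue = set(), deque([(init_state, "free")])
    while queue:
        s, m = queue.popleft()
        if m == "bad":
            return True
        if (s, m) in seen:
            continue
        seen.add((s, m))
        for action, s2 in trans.get(s, []):
            queue.append((s2, monitor(m, action)))
    return False

buggy = {0: [("lock", 1)], 1: [("lock", 1), ("unlock", 0)]}   # can double-lock
safe = {0: [("lock", 1)], 1: [("unlock", 0)]}
```

Liveness under fairness, as tested in the report, needs a search for accepting cycles rather than bad states, but the product construction is the same.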

  18. Testing Linear Temporal Logic Formulae on Finite Execution Traces

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Norvig, Peter (Technical Monitor)

    2001-01-01

    We present an algorithm for efficiently testing Linear Temporal Logic (LTL) formulae on finite execution traces. The standard models of LTL are infinite traces, reflecting the behavior of reactive and concurrent systems which conceptually may be continuously alive. In most past applications of LTL, theorem provers and model checkers have been used to formally prove that down-scaled models satisfy such LTL specifications. Our goal is instead to use LTL for up-scaled testing of real software applications. Such tests correspond to analyzing the conformance of finite traces against LTL formulae. We first describe what it means for a finite trace to satisfy an LTL property. We then suggest an optimized algorithm based on transforming LTL formulae. The work is done using the Maude rewriting system, which turns out to provide a perfect notation and an efficient rewriting engine for performing these experiments.
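The formula-transforming idea can be sketched for a small positive fragment of LTL (a Python illustration, not the Maude implementation): consuming one state rewrites the formula into the obligation that remains on the rest of the trace, and at the end of the trace the residual obligation is evaluated under finite-trace semantics.

```python
# LTL rewriting on finite traces, positive fragment only (illustrative).
# Formulas are tuples: ("ap", p), ("and", f, g), ("or", f, g),
# ("next", f), ("ev", f), ("alw", f); states are sets of atoms.

def AND(a, b):
    if a is False or b is False: return False
    if a is True: return b
    if b is True: return a
    return ("and", a, b)

def OR(a, b):
    if a is True or b is True: return True
    if a is False: return b
    if b is False: return a
    return ("or", a, b)

def step(f, state):
    """Rewrite f into the obligation on the remaining trace."""
    if f is True or f is False:
        return f
    op = f[0]
    if op == "ap":   return f[1] in state
    if op == "and":  return AND(step(f[1], state), step(f[2], state))
    if op == "or":   return OR(step(f[1], state), step(f[2], state))
    if op == "next": return f[1]
    if op == "ev":   return OR(step(f[1], state), f)
    if op == "alw":  return AND(step(f[1], state), f)

def final(f):
    """Value of the residual obligation when the trace ends."""
    if f is True or f is False:
        return f
    op = f[0]
    if op == "and": return final(f[1]) and final(f[2])
    if op == "or":  return final(f[1]) or final(f[2])
    return op == "alw"   # 'alw' holds vacuously; 'ev'/'next'/'ap' do not

def holds(f, trace):
    for state in trace:
        f = step(f, state)
    return final(f)
```

Because only the current formula is kept, memory use is independent of trace length, which is the point the paper optimizes.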

  19. Unique and shared validity of the "Wechsler logical memory test", the "California verbal learning test", and the "verbal learning and memory test" in patients with epilepsy.

    PubMed

    Helmstaedter, Christoph; Wietzke, Jennifer; Lutz, Martin T

    2009-12-01

    This study was set up to evaluate the construct validity of three verbal memory tests in epilepsy patients. Sixty-one consecutively evaluated patients with temporal lobe epilepsy (TLE) or extra-temporal epilepsy (E-TLE) underwent testing with the verbal learning and memory test (VLMT, the German equivalent of the Rey auditory verbal learning test, RAVLT); the California verbal learning test (CVLT); the logical memory and digit span subtests of the Wechsler memory scale, revised (WMS-R); and testing of intelligence, attention, speech and executive functions. Factor analysis of the memory tests resulted in test-specific rather than test-overspanning factors. Parameters of the CVLT and WMS-R, and to a much lesser degree of the VLMT, were highly correlated with attention, language function and vocabulary. Delayed recall measures of logical memory and the VLMT differentiated TLE from E-TLE. Learning and memory scores of all three tests differentiated mesial temporal sclerosis from other pathologies. A lateralization of the epilepsy was possible only for a subsample of 15 patients with mesial TLE. Although the three tests provide overlapping indicators for a temporal lobe epilepsy or a mesial pathology, they can hardly be used interchangeably. The tests make different demands on semantic processing and memory organization, and they appear differentially sensitive to performance in non-memory domains. The tests' capability to lateralize appears to be poor. The findings encourage further discussion of the dependency of memory outcomes on test selection.

  20. Using Temporal Logic to Specify and Verify Cryptographic Protocols (Progress Report)

    DTIC Science & Technology

    1995-01-01

    know, Meadows'... Supported by grant HKUST 608/94E from the Hong Kong Research Grants Council. 1 Introduction. We have started work on a project to apply temporal logic to reason about cryptographic protocols. Some of the goals of the project are as follows. 1. Allow the user to state and prove that the penetrator cannot use logical or algebraic techniques (e.g., we are disregarding

  1. Methodology for the specification of communication activities within the framework of a multi-layered architecture: Toward the definition of a knowledge base

    NASA Astrophysics Data System (ADS)

    Amyay, Omar

    A method defined in terms of synthesis and verification steps is presented. The specification of the services and protocols of communication within a multilayered architecture of the Open Systems Interconnection (OSI) type is an essential issue for the design of computer networks. The aim is to obtain an operational specification of the protocol service couple of a given layer. Planning synthesis and verification steps constitute a specification trajectory. The latter is based on the progressive integration of the 'initial data' constraints and verification of the specification originating from each synthesis step, through validity constraints that characterize an admissible solution. Two types of trajectories are proposed according to the style of the initial specification of the service protocol couple: operational type and service supplier viewpoint; knowledge property oriented type and service viewpoint. Synthesis and verification activities were developed and formalized in terms of labeled transition systems, temporal logic and epistemic logic. The originality of the second specification trajectory and the use of the epistemic logic are shown. An 'artificial intelligence' approach enables a conceptual model to be defined for a knowledge base system for implementing the method proposed. It is structured in three levels of representation of the knowledge relating to the domain, the reasoning characterizing synthesis and verification activities and the planning of the steps of a specification trajectory.

  2. Closing the Gap Between Specification and Programming: VDM++ and SCALA

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2011-01-01

    We argue that a modern programming language such as Scala offers a level of succinctness, which makes it suitable for program and systems specification as well as for high-level programming. We illustrate this by comparing the language with the Vdm++ specification language. The comparison also identifies areas where Scala perhaps could be improved, inspired by Vdm++. We furthermore illustrate Scala's potential as a specification language by augmenting it with a combination of parameterized state machines and temporal logic, defined as a library, thereby forming an expressive but simple runtime verification framework.

  3. A molecular logic gate.

    PubMed

    Kompa, K L; Levine, R D

    2001-01-16

    We propose a scheme for molecule-based information processing by combining well-studied spectroscopic techniques and recent results from chemical dynamics. Specifically it is discussed how optical transitions in single molecules can be used to rapidly perform classical (Boolean) logical operations. In the proposed way, a restricted number of states in a single molecule can act as a logical gate equivalent to at least two switches. It is argued that the four-level scheme can also be used to produce gain, because it allows an inversion, and not only a switching ability. The proposed scheme is quantum mechanical in that it takes advantage of the discrete nature of the energy levels but, we here discuss the temporal evolution, with the use of the populations only. On a longer time range we suggest that the same scheme could be extended to perform quantum logic, and a tentative suggestion, based on an available experiment, is discussed. We believe that the pumping can provide a partial proof of principle, although this and similar experiments were not interpreted thus far in our terms.

  4. Enhancing molecular logic through modulation of temporal and spatial constraints with quantum dot-based systems that use fluorescent (Förster) resonance energy transfer

    NASA Astrophysics Data System (ADS)

    Claussen, Jonathan C.; Algar, W. Russ; Hildebrandt, Niko; Susumu, Kimihiro; Ancona, Mario G.; Medintz, Igor L.

    2013-10-01

    Luminescent semiconductor nanocrystals or quantum dots (QDs) contain favorable photonic properties (e.g., resistance to photobleaching, size-tunable PL, and large effective Stokes shifts) that make them well-suited for fluorescence (Förster) resonance energy transfer (FRET) based applications including monitoring proteolytic activity, elucidating the effects of nanoparticles-mediated drug delivery, and analyzing the spatial and temporal dynamics of cellular biochemical processes. Herein, we demonstrate how unique considerations of temporal and spatial constraints can be used in conjunction with QD-FRET systems to open up new avenues of scientific discovery in information processing and molecular logic circuitry. For example, by conjugating both long lifetime luminescent terbium(III) complexes (Tb) and fluorescent dyes (A647) to a single QD, we can create multiple FRET lanes that change temporally as the QD acts as both an acceptor and donor at distinct time intervals. Such temporal FRET modulation creates multi-step FRET cascades that produce a wealth of unique photoluminescence (PL) spectra that are well-suited for the construction of a photonic alphabet and photonic logic circuits. These research advances in bio-based molecular logic open the door to future applications including multiplexed biosensing and drug delivery for disease diagnostics and treatment.

  5. Space, Time, History: The Reassertion of Space in Social Theory

    ERIC Educational Resources Information Center

    Peters, Michael A.; Kessl, Fabian

    2009-01-01

    The reassertion of space is discussed as an analytical awareness of the past obsession with temporal logics. Theorists now understand that social sciences discourses were shaped by a preoccupation with the temporal scales and logics of development considered as natural processes. The spatial turn in social theory is often seen to be a process of…

  6. Formal verification of automated teller machine systems using SPIN

    NASA Astrophysics Data System (ADS)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of an Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use the Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formulas. This model checker accepts models written in the Process Meta Language (PROMELA), with specifications given as LTL formulas.

  7. A molecular logic gate

    PubMed Central

    Kompa, K. L.; Levine, R. D.

    2001-01-01

    We propose a scheme for molecule-based information processing by combining well-studied spectroscopic techniques and recent results from chemical dynamics. Specifically it is discussed how optical transitions in single molecules can be used to rapidly perform classical (Boolean) logical operations. In the proposed way, a restricted number of states in a single molecule can act as a logical gate equivalent to at least two switches. It is argued that the four-level scheme can also be used to produce gain, because it allows an inversion, and not only a switching ability. The proposed scheme is quantum mechanical in that it takes advantage of the discrete nature of the energy levels but, we here discuss the temporal evolution, with the use of the populations only. On a longer time range we suggest that the same scheme could be extended to perform quantum logic, and a tentative suggestion, based on an available experiment, is discussed. We believe that the pumping can provide a partial proof of principle, although this and similar experiments were not interpreted thus far in our terms. PMID:11209046

  8. Using Mobile TLA as a Logic for Dynamic I/O Automata

    NASA Astrophysics Data System (ADS)

    Kapus, Tatjana

    Input/Output (I/O) automata and the Temporal Logic of Actions (TLA) are two well-known techniques for the specification and verification of concurrent systems. Over the past few years, they have been extended to the so-called dynamic I/O automata and, respectively, Mobile TLA (MTLA) in order to be more appropriate for mobile agent systems. Dynamic I/O automata is just a mathematical model, whereas MTLA is a logic with a formally defined language. In this paper, therefore, we investigate how MTLA could be used as a formal language for the specification of dynamic I/O automata. We do this by writing an MTLA specification of a travel agent system which has been specified semi-formally in the literature on that model. In this specification, we deal with always existing agents as well as with an initially unknown number of dynamically created agents, with mobile and non-mobile agents, with I/O-automata-style communication, and with the changing communication capabilities of mobile agents. We have previously written a TLA specification of this system. This paper shows that an MTLA specification of such a system can be more elegant and faithful to the dynamic I/O automata definition because the agent existence and location can be expressed directly by using agent and location names instead of special variables as in TLA. It also shows how the reuse of names for dynamically created and destroyed agents within the dynamic I/O automata framework can be specified in MTLA.

  9. An efficient temporal logic for robotic task planning

    NASA Technical Reports Server (NTRS)

    Becker, Jeffrey M.

    1989-01-01

    Computations required for temporal reasoning can be prohibitively expensive if fully general representations are used. Overly simple representations, such as totally ordered sequence of time points, are inadequate for use in a nonlinear task planning system. A middle ground is identified which is general enough to support a capable nonlinear task planner, but specialized enough that the system can support online task planning in real time. A Temporal Logic System (TLS) was developed during the Intelligent Task Automation (ITA) project to support robotic task planning. TLS is also used within the ITA system to support plan execution, monitoring, and exception handling.
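The "middle ground" a nonlinear planner needs typically keeps only a partial order on plan steps rather than a total one. As a hedged illustration (not TLS itself), checking that such an ordering is admissible reduces to verifying that the before-constraints contain no cycle:

```python
# Partial-order consistency check (illustrative sketch, not TLS).
# before: set of (a, b) pairs meaning step a must precede step b;
# the ordering is consistent iff the constraint graph is acyclic.

def consistent(steps, before):
    succ = {s: [] for s in steps}
    for a, b in before:
        succ[a].append(b)
    state = {}  # step -> "active" (on current DFS path) or "done"

    def dfs(n):
        if state.get(n) == "active":
            return False          # back edge: cyclic ordering constraint
        if state.get(n) == "done":
            return True
        state[n] = "active"
        ok = all(dfs(m) for m in succ[n])
        state[n] = "done"
        return ok

    return all(dfs(s) for s in steps)
```

Because the check is linear in the number of steps and constraints, this kind of representation is what makes online task planning in real time plausible, as the abstract argues.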

  10. YIP Formal Synthesis of Software-Based Control Protocols for Fractionated,Composable Autonomous Systems

    DTIC Science & Technology

    2016-07-08

    Systems Using Automata Theory and Barrier Certificates. We developed a sound but incomplete method for the computational verification of specifications... method merges ideas from automata-based model checking with those from control theory, including so-called barrier certificates and optimization-based... "Automata theory meets barrier certificates: Temporal logic verification of nonlinear systems," IEEE Transactions on Automatic Control, 2015. [J2] R

  11. On a concept of computer game implementation based on a temporal logic

    NASA Astrophysics Data System (ADS)

    Szymańska, Emilia; Adamek, Marek J.; Mulawka, Jan J.

    2017-08-01

    Time is a concept which underlies all of contemporary civilization. Therefore, it was necessary to create mathematical tools that allow complex time dependencies to be described in a precise way. One such tool is temporal logic. Its definition, description and characteristics are presented in this publication. The authors then discuss the usefulness of this tool in the context of creating storylines in computer games, such as those of the RPG genre.

  12. Syntax, Concepts, and Logic in the Temporal Dynamics of Language Comprehension: Evidence from Event-Related Potentials

    ERIC Educational Resources Information Center

    Steinhauer, Karsten; Drury, John E.; Portner, Paul; Walenski, Matthew; Ullman, Michael T.

    2010-01-01

    Logic has been intertwined with the study of language and meaning since antiquity, and such connections persist in present day research in linguistic theory (formal semantics) and cognitive psychology (e.g., studies of human reasoning). However, few studies in cognitive neuroscience have addressed logical dimensions of sentence-level language…

  13. Reasoning About Digital Circuits.

    DTIC Science & Technology

    1983-07-01

    The dissertation will later examine the logic's formal syntax and semantics in great depth. Below are a few English-language statements and... function have a fixed point. Temporal logic as a programming language: temporal logic can be used directly as a programming language. For example, the... for a separate "assertion language." For example, the formula S[(I+- );(I + i -- I) (I+2- I) states that if the variable I twice increases by 1 in an

  14. Developing Formal Correctness Properties from Natural Language Requirements

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.

    2006-01-01

    This viewgraph presentation reviews the rationale of the program to transform natural language specifications into formal notation: specifically, to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, specification languages and their associated tools have a high learning curve, and increased schedule and budget pressure on projects reduces training opportunities for engineers; and (4) formulation of correctness properties for system models can be a difficult problem. This is relevant to NASA in that it would simplify the development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments, technology transfer potential, and next steps.

  15. Reasoning about real-time systems with temporal interval logic constraints on multi-state automata

    NASA Technical Reports Server (NTRS)

    Gabrielian, Armen

    1991-01-01

    Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to formally model a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises instead of merely depending on what is true in a system.

  16. Design, Specification, and Synthesis of Aircraft Electric Power Systems Control Logic

    NASA Astrophysics Data System (ADS)

    Xu, Huan

    Cyber-physical systems integrate computation, networking, and physical processes. Substantial research challenges exist in the design and verification of such large-scale, distributed sensing, actuation, and control systems. Rapidly improving technology and recent advances in control theory, networked systems, and computer science give us the opportunity to drastically improve our approach to integrated flow of information and cooperative behavior. Current systems rely on text-based specifications and manual design. Using new technology advances, we can create easier, more efficient, and cheaper ways of developing these control systems. This thesis will focus on design considerations for system topologies, ways to formally and automatically specify requirements, and methods to synthesize reactive control protocols, all within the context of an aircraft electric power system as a representative application area. This thesis consists of three complementary parts: synthesis, specification, and design. The first section focuses on the synthesis of central and distributed reactive controllers for an aircraft electric power system. This approach incorporates methodologies from computer science and control. The resulting controllers are correct by construction with respect to system requirements, which are formulated using the specification language of linear temporal logic (LTL). The second section addresses how to formally specify requirements and introduces a domain-specific language for electric power systems. A software tool automatically converts high-level requirements into LTL and synthesizes a controller. The final sections focus on design space exploration. A design methodology is proposed that uses mixed-integer linear programming to obtain candidate topologies, which are then used to synthesize controllers. The discrete-time control logic is then verified in real-time by two methods: hardware and simulation.
Finally, the problem of partial observability and dynamic state estimation is explored. Given a set placement of sensors on an electric power system, measurements from these sensors can be used in conjunction with control logic to infer the state of the system.
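The closing state-estimation problem has a simple core that can be sketched (a hypothetical mini-example, not the thesis's method): with known contactor positions, a bus is powered iff some path of closed contactors connects it to a live generator, which is a graph-reachability question.

```python
# Toy power-network state inference (names and topology invented here).
# contactors: dict (node_a, node_b) -> True if the contactor is closed.

def powered_buses(generators, contactors):
    live, frontier = set(generators), list(generators)
    while frontier:
        b = frontier.pop()
        for (x, y), closed in contactors.items():
            if closed:
                for a, c in ((x, y), (y, x)):   # contactors conduct both ways
                    if a == b and c not in live:
                        live.add(c)
                        frontier.append(c)
    return live

net = {("gen1", "bus1"): True, ("bus1", "bus2"): False, ("gen2", "bus2"): True}
```

Under partial observability the estimator must additionally infer which contactor positions are consistent with the sensor readings, but each candidate configuration is evaluated by exactly this kind of reachability computation.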

  17. [The comparative analysis of changes of short pieces of EEG at perception of music on the basis of the event-related synchronization/desynchronization and wavelet-synchrony].

    PubMed

    Oknina, L B; Kuptsova, S V; Romanov, A S; Masherov, E L; Kuznetsova, O A; Sharova, E V

    2012-01-01

    The goal of the present pilot study is an analysis of changes in short EEG segments, recorded from 32 sites, during the perception of musical melodies by healthy subjects, depending on logical (recognition) and emotional (pleasant versus unpleasant) evaluations of the melody. To this end, event-related synchronization/desynchronization and wavelet synchrony of EEG responses were compared in 31 healthy subjects aged 18 to 60 years. It is shown that during logical evaluation of music, melody recognition is accompanied by event-related desynchronization in the left fronto-parietal-temporal area. During emotional evaluation of a melody, event-related synchronization in the left fronto-temporal area is typical for pleasant melodies, desynchronization in the temporal area for unpleasant ones, and desynchronization in the occipital area for melodies that do not evoke an emotional response. Analysis of the wavelet synchrony of the EEG, which characterizes reactive changes in the interaction of cortical zones, revealed that the most distinct topographical differences concern the type of processing of the heard music: logical (recognized versus not recognized) or emotional (pleasant versus unpleasant). Whereas emotional evaluation was marked by changes in interhemispheric connections between associative cortical zones (central, frontal, temporal), logical evaluation was marked by changes in inter- and intrahemispheric connections of the projection zones of the auditory analyzer (temporal area). It is supposed that the observed event-related synchronization/desynchronization most likely reflects an activation component of the evaluation of musical fragments, whereas the wavelet analysis provides guidance on the character of the processing of the musical stimulus.

  18. Data Automata in Scala

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2014-01-01

    The field of runtime verification has during the last decade seen a multitude of systems for monitoring event sequences (traces) emitted by a running system. The objective is to ensure correctness of a system by checking its execution traces against formal specifications representing requirements. A special challenge is data parameterized events, where monitors have to keep track of the combination of control states as well as data constraints, relating events and the data they carry across time points. This poses a challenge with respect to the efficiency of monitors, as well as the expressiveness of logics. Data automata are a form of automaton in which states are parameterized with data, supporting monitoring of data parameterized events. We describe the full details of a very simple API in the Scala programming language, an internal DSL (Domain-Specific Language), implementing data automata. The small implementation suggests a design pattern. Data automata allow transition conditions to refer to other states than the source state, and allow target states of transitions to be inlined, offering a temporal logic flavored notation. An embedding of a logic in a high-level language like Scala in addition allows monitors to be programmed using all of Scala's language constructs, offering the full flexibility of a programming language. The framework is demonstrated on an XML processing scenario previously addressed in related work.
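The paper's DSL is written in Scala; as a rough analogue of the data-automaton idea in Python (with invented event names, not the paper's API), a monitor keeps one piece of state per data value, here per file id, so that related events carrying the same data are connected across time:

```python
# Data-parameterized monitoring sketch (illustrative, not the Scala DSL).
# Property per file id: no double open, no close without open, and every
# opened file is eventually closed by the end of the trace.

class OpenCloseMonitor:
    def __init__(self):
        self.open_files = set()   # data-parameterized state: ids now open
        self.errors = []

    def event(self, name, fid):
        if name == "open":
            if fid in self.open_files:
                self.errors.append(f"double open: {fid}")
            self.open_files.add(fid)
        elif name == "close":
            if fid not in self.open_files:
                self.errors.append(f"close without open: {fid}")
            self.open_files.discard(fid)

    def end(self):
        """Report ids still open when the trace ends."""
        self.errors += [f"never closed: {fid}" for fid in sorted(self.open_files)]
        return self.errors
```

Embedding the monitor in a host language, as the paper argues for Scala, means arbitrary code can run on each event, at the cost of giving up a purely declarative specification.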

  19. Efficiency of the strong satisfiability checking procedure for reactive system specifications

    NASA Astrophysics Data System (ADS)

    Shimakawa, Masaya; Hagihara, Shigeki; Yonezaki, Naoki

    2018-04-01

    Reactive systems are those that interact with their environment. To develop reactive systems without defects, it is important to describe behavior specifications in a formal language, such as linear temporal logic, and to verify the specification. Specifically, it is important to check whether specifications satisfy the property called realizability. In previous studies, we proposed the concept of strong satisfiability as a necessary condition for realizability. Although strong satisfiability is only a necessary condition for realizability, many practical unrealizable specifications are also strongly unsatisfiable. Moreover, we have previously established the theoretical complexity of the strong satisfiability problem. In the current study, we investigate the practical efficiency of the strong satisfiability checking procedure and demonstrate that strong satisfiability can be checked more efficiently than realizability.

  20. Impact of Temporal Masking of Flip-Flop Upsets on Soft Error Rates of Sequential Circuits

    NASA Astrophysics Data System (ADS)

    Chen, R. M.; Mahatme, N. N.; Diggins, Z. J.; Wang, L.; Zhang, E. X.; Chen, Y. P.; Liu, Y. N.; Narasimham, B.; Witulski, A. F.; Bhuva, B. L.; Fleetwood, D. M.

    2017-08-01

    Reductions in single-event (SE) upset (SEU) rates for sequential circuits due to temporal masking effects are evaluated. The impacts of supply voltage, combinational-logic delay, flip-flop (FF) SEU performance, and particle linear energy transfer (LET) values are analyzed for SE cross sections of sequential circuits. Alpha particles and heavy ions with different LET values are used to characterize the circuits fabricated at the 40-nm bulk CMOS technology node. Experimental results show that increasing the delay of the logic circuit present between FFs and decreasing the supply voltage are two effective ways of reducing SE error rates for sequential circuits for particles with low LET values due to temporal masking. SEU-hardened FFs benefit less from temporal masking than conventional FFs. Circuit hardening implications for SEU-hardened and unhardened FFs are discussed.
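The derating the abstract measures can be pictured with a first-order latching-window model (an illustrative back-of-the-envelope model of temporal masking with made-up numbers; it is not the paper's analysis):

```python
def latching_probability(pulse_width, latching_window, clock_period):
    """A transient arriving at a random phase of the clock is captured only
    if it overlaps the flip-flop's latching window, so in this first-order
    model the capture probability scales as
    (pulse width + window) / clock period, capped at 1."""
    return min(1.0, (pulse_width + latching_window) / clock_period)

# Faster clocks leave less room for temporal masking (values in ns,
# purely illustrative):
p_fast = latching_probability(0.05, 0.01, 0.5)
p_slow = latching_probability(0.05, 0.01, 1.0)
print(p_fast, p_slow)  # the faster clock captures twice the fraction
```

The model also reflects the abstract's low-LET observation: short (low-LET) pulses sit well below the cap, where the masking fraction responds linearly to added logic delay and clock period.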

  1. Synthesizing Dynamic Programming Algorithms from Linear Temporal Logic Formulae

    NASA Technical Reports Server (NTRS)

    Rosu, Grigore; Havelund, Klaus

    2001-01-01

    The problem of testing a linear temporal logic (LTL) formula on a finite execution trace of events, generated by an executing program, occurs naturally in runtime analysis of software. We present an algorithm which takes an LTL formula and generates an efficient dynamic programming algorithm. The generated algorithm tests whether the LTL formula is satisfied by a finite trace of events given as input. The generated algorithm runs in linear time, with a constant factor that depends on the size of the LTL formula. The memory required is constant, also depending on the size of the formula.
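The dynamic-programming idea can be illustrated with a small finite-trace LTL evaluator that walks the trace backwards, keeping only the subformula values at the position just visited (a Python sketch of the technique; the tuple encoding and the finite-trace conventions such as a strong Next are our assumptions, not the paper's generated code):

```python
def subformulas(f):
    """Distinct subformulas of f, children listed before parents."""
    out = []
    def walk(g):
        if g in out:
            return
        if g[0] != "prop":               # ("prop", name) is a leaf
            for child in g[1:]:
                walk(child)
        out.append(g)
    walk(f)
    return out

def holds(formula, trace):
    """True iff `formula` holds at position 0 of a finite, nonempty trace.

    One backward pass; memory is constant in the trace length, linear in
    the formula size. Each trace element is the set of true propositions."""
    subs = subformulas(formula)
    nxt = {}                             # subformula values at position i+1
    for i in range(len(trace) - 1, -1, -1):
        state, cur = trace[i], {}
        last = (i == len(trace) - 1)
        for f in subs:
            op = f[0]
            if op == "prop":
                cur[f] = f[1] in state
            elif op == "not":
                cur[f] = not cur[f[1]]
            elif op == "and":
                cur[f] = cur[f[1]] and cur[f[2]]
            elif op == "or":
                cur[f] = cur[f[1]] or cur[f[2]]
            elif op == "next":           # strong next: false at the last state
                cur[f] = (not last) and nxt[f[1]]
            elif op == "until":          # f1 U f2 on finite traces
                cur[f] = cur[f[2]] or (cur[f[1]] and not last and nxt[f])
            elif op == "eventually":
                cur[f] = cur[f[1]] or (not last and nxt[f])
            elif op == "always":
                cur[f] = cur[f[1]] and (last or nxt[f])
        nxt = cur
    return nxt[formula]

f = ("until", ("prop", "p"), ("prop", "q"))
print(holds(f, [{"p"}, {"p"}, {"q"}]))   # True: q eventually discharges the until
print(holds(f, [{"p"}, {"p"}]))          # False: q never occurs
```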

  2. Design and verification of distributed logic controllers with application of Petri nets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał

    2015-12-31

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.

  3. Design and verification of distributed logic controllers with application of Petri nets

    NASA Astrophysics Data System (ADS)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika

    2015-12-01

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.

  4. Advances and challenges in logical modeling of cell cycle regulation: perspective for multi-scale, integrative yeast cell models

    PubMed Central

    Todd, Robert G.; van der Zee, Lucas

    2016-01-01

    The eukaryotic cell cycle is robustly designed, with interacting molecules organized within a definite topology that ensures the temporal precision of its phase transitions. Its underlying dynamics are regulated by molecular switches, for which remarkable insights have been provided by genetic and molecular biology efforts. In a number of cases, this information has been made predictive through computational models. These models have allowed for the identification of novel molecular mechanisms, later validated experimentally. Logical modeling represents one of the youngest approaches to addressing cell cycle regulation. We summarize the advances that this type of modeling has achieved in reproducing and predicting cell cycle dynamics. Furthermore, we present the challenge that this type of modeling is now ready to tackle: its integration with intracellular networks, and its formalisms, to understand the crosstalk underlying system-level properties, the ultimate aim of multi-scale models. Specifically, we discuss and illustrate how such an integration may be realized by integrating a minimal logical model of the cell cycle with a metabolic network. PMID:27993914

  5. Cognitive changes in people with temporal lobe epilepsy over a 13-year period.

    PubMed

    Mameniškienė, Rūta; Rimšienė, Justė; Puronaitė, Roma

    2016-10-01

    The aims of our study were to evaluate cognitive decline in people with temporal lobe epilepsy over a period of 13 years and to determine which clinical and treatment characteristics may have been associated with it. Thirty-three individuals with temporal lobe epilepsy underwent the same neuropsychological assessment of verbal and nonverbal memory, attention, and executive functions, using the same cognitive test battery as the one used 13 years earlier. Long-term verbal and nonverbal memory was tested four weeks later. Results were compared with those obtained 13 years earlier. There was no significant change in verbal and verbal-logical memory tests; however, nonverbal memory worsened significantly. Long-term verbal memory declined for 21.9% of participants, long-term verbal-logical memory for 34.4%, and long-term nonverbal memory for 56.3%. Worsening of working verbal and verbal-logical memory was associated with longer epilepsy duration and lower levels of patient education; worsening of verbal delayed recall and long-term verbal-logical memory was associated with higher seizure frequency. Decline in long-term nonverbal memory was significantly associated with a longer duration of epilepsy. Worsening of reaction and attention inversely correlated with symptoms of depression. Over the 13-year period, cognitive functions did not change significantly. Good seizure control and reduced symptoms of depression in this sample of people with temporal lobe epilepsy were associated with better cognitive functioning. The predictors of change in cognitive functions may be complex and require further study. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. TraceContract: A Scala DSL for Trace Analysis

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Havelund, Klaus

    2011-01-01

    In this paper we describe TRACECONTRACT, an API for trace analysis, implemented in the SCALA programming language. We argue that for certain forms of trace analysis the best weapon is a high level programming language augmented with constructs for temporal reasoning. A trace is a sequence of events, which may for example be generated by a running program, instrumented appropriately to generate events. The API supports writing properties in a notation that combines an advanced form of data parameterized state machines with temporal logic. The implementation utilizes SCALA's support for defining internal Domain Specific Languages (DSLs). Furthermore SCALA's combination of object oriented and functional programming features, including partial functions and pattern matching, makes it an ideal host language for such an API.

  7. Fuzzy Temporal Logic Based Railway Passenger Flow Forecast Model

    PubMed Central

    Dou, Fei; Jia, Limin; Wang, Li; Xu, Jie; Huang, Yakun

    2014-01-01

    Passenger flow forecasting is of essential importance to the organization of railway transportation and is one of the most important bases for decision-making on transportation patterns and train operation planning. High-speed railway passenger flow features quasi-periodic variations over short time spans and complex nonlinear fluctuations owing to many influencing factors. In this study, a fuzzy temporal logic based passenger flow forecast model (FTLPFFM) is presented, based on fuzzy logic relationship recognition techniques, that predicts short-term passenger flow for high-speed railway with significantly improved forecast accuracy. An applied case using real-world data illustrates the precision and accuracy of FTLPFFM. For this applied case, the proposed model performs better than the k-nearest neighbor (KNN) and autoregressive integrated moving average (ARIMA) models. PMID:25431586

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nishimura, Takahiro, E-mail: t-nishimura@ist.osaka-u.ac.jp; Fujii, Ryo; Ogura, Yusuke

    Molecular logic circuits represent a promising technology for observation and manipulation of biological systems at the molecular level. However, the implementation of molecular logic circuits for temporal and programmable operation remains challenging. In this paper, we demonstrate an optically controllable logic circuit that uses fluorescence resonance energy transfer (FRET) for signaling. The FRET-based signaling process is modulated by both molecular and optical inputs. Based on the distance dependence of FRET, the FRET pathways required to execute molecular logic operations are formed on a DNA nanostructure as a circuit based on its molecular inputs. In addition, the FRET pathways on the DNA nanostructure are controlled optically, using photoswitching fluorescent molecules to instruct the execution of the desired operation and the related timings. The behavior of the circuit can thus be controlled using external optical signals. As an example, a molecular logic circuit capable of executing two different logic operations was studied. The circuit contains functional DNAs and a DNA scaffold to construct two FRET routes for executing Input 1 AND Input 2 and Input 1 AND NOT Input 3 operations on molecular inputs. The circuit produced the correct outputs with all possible combinations of the inputs by following the light signals. Moreover, the operation execution timings were controlled based on light irradiation and the circuit responded to time-dependent inputs. The experimental results demonstrate that the circuit changes the output for the required operations following the input of temporal light signals.
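The two operations the circuit executes can be written down as ordinary Boolean functions, with the optical signal modeled as a selector between the two FRET routes (an illustrative sketch; the `route` names are ours, not the paper's):

```python
def protocircuit(route, in1, in2, in3):
    """Evaluate the operation enabled by the light-selected FRET route."""
    if route == "AND":
        return in1 and in2            # Input 1 AND Input 2
    if route == "ANDNOT":
        return in1 and not in3        # Input 1 AND NOT Input 3
    raise ValueError(route)

print(protocircuit("AND", True, True, False))     # True
print(protocircuit("ANDNOT", True, False, True))  # False
```

Time-dependent behavior then amounts to re-evaluating with a different `route` as the light signal switches, which is the programmability the abstract describes.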

  9. Computer-aided biochemical programming of synthetic microreactors as diagnostic devices.

    PubMed

    Courbet, Alexis; Amar, Patrick; Fages, François; Renard, Eric; Molina, Franck

    2018-04-26

    Biological systems have evolved efficient sensing and decision-making mechanisms to maximize fitness in changing molecular environments. Synthetic biologists have exploited these capabilities to engineer control on information and energy processing in living cells. While engineered organisms pose important technological and ethical challenges, de novo assembly of non-living biomolecular devices could offer promising avenues toward various real-world applications. However, assembling biochemical parts into functional information processing systems has remained challenging due to extensive multidimensional parameter spaces that must be sampled comprehensively in order to identify robust, specification compliant molecular implementations. We introduce a systematic methodology based on automated computational design and microfluidics enabling the programming of synthetic cell-like microreactors embedding biochemical logic circuits, or protosensors, to perform accurate biosensing and biocomputing operations in vitro according to temporal logic specifications. We show that proof-of-concept protosensors integrating diagnostic algorithms detect specific patterns of biomarkers in human clinical samples. Protosensors may enable novel approaches to medicine and represent a step toward autonomous micromachines capable of precise interfacing of human physiology or other complex biological environments, ecosystems, or industrial bioprocesses. © 2018 The Authors. Published under the terms of the CC BY 4.0 license.

  10. Clinical utility of the Wechsler Memory Scale - Fourth Edition (WMS-IV) in patients with intractable temporal lobe epilepsy.

    PubMed

    Bouman, Zita; Elhorst, Didi; Hendriks, Marc P H; Kessels, Roy P C; Aldenkamp, Albert P

    2016-02-01

    The Wechsler Memory Scale (WMS) is one of the most widely used test batteries to assess memory functions in patients with brain dysfunctions of different etiologies. This study examined the clinical validation of the Dutch Wechsler Memory Scale - Fourth Edition (WMS-IV-NL) in patients with temporal lobe epilepsy (TLE). The sample consisted of 75 patients with intractable TLE, who were eligible for epilepsy surgery, and 77 demographically matched healthy controls. All participants were examined with the WMS-IV-NL. Patients with TLE performed significantly worse than healthy controls on all WMS-IV-NL indices and subtests (p<.01), with the exception of the Visual Working Memory Index including its contributing subtests, as well as the subtests Logical Memory I, Verbal Paired Associates I, and Designs II. In addition, patients with mesiotemporal abnormalities performed significantly worse than patients with lateral temporal abnormalities on the subtests Logical Memory I and Designs II and all the indices (p<.05), with the exception of the Auditory Memory Index and Visual Working Memory Index. Patients with either a left or a right temporal focus performed equally on all WMS-IV-NL indices and subtests (F(15, 50)=.70, p=.78), as well as the Auditory-Visual discrepancy score (t(64)=-1.40, p=.17). The WMS-IV-NL is capable of detecting memory problems in patients with TLE, indicating that it is a sufficiently valid memory battery. Furthermore, the findings support previous research showing that the WMS-IV has limited value in identifying material-specific memory deficits in presurgical patients with TLE. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Biophotonic logic devices based on quantum dots and temporally-staggered Förster energy transfer relays

    NASA Astrophysics Data System (ADS)

    Claussen, Jonathan C.; Algar, W. Russ; Hildebrandt, Niko; Susumu, Kimihiro; Ancona, Mario G.; Medintz, Igor L.

    2013-11-01

    Integrating photonic inputs/outputs into unimolecular logic devices can provide significantly increased functional complexity and the ability to expand the repertoire of available operations. Here, we build upon a system previously utilized for biosensing to assemble and prototype several increasingly sophisticated biophotonic logic devices that function based upon multistep Förster resonance energy transfer (FRET) relays. The core system combines a central semiconductor quantum dot (QD) nanoplatform with a long-lifetime Tb complex FRET donor and a near-IR organic fluorophore acceptor; the latter acts as two unique inputs for the QD-based device. The Tb complex allows for a form of temporal memory by providing unique access to a time-delayed modality as an alternate output which significantly increases the inherent computing options. Altering the device by controlling the configuration parameters with biologically based self-assembly provides input control while monitoring changes in emission output of all participants, in both a spectral and temporal-dependent manner, gives rise to two input, single output Boolean Logic operations including OR, AND, INHIBIT, XOR, NOR, NAND, along with the possibility of gate transitions. Incorporation of an enzymatic cleavage step provides for a set-reset function that can be implemented repeatedly with the same building blocks and is demonstrated with single input, single output YES and NOT gates. Potential applications for these devices are discussed in the context of their constituent parts and the richness of available signal.

  12. Biophotonic logic devices based on quantum dots and temporally-staggered Förster energy transfer relays.

    PubMed

    Claussen, Jonathan C; Algar, W Russ; Hildebrandt, Niko; Susumu, Kimihiro; Ancona, Mario G; Medintz, Igor L

    2013-12-21

    Integrating photonic inputs/outputs into unimolecular logic devices can provide significantly increased functional complexity and the ability to expand the repertoire of available operations. Here, we build upon a system previously utilized for biosensing to assemble and prototype several increasingly sophisticated biophotonic logic devices that function based upon multistep Förster resonance energy transfer (FRET) relays. The core system combines a central semiconductor quantum dot (QD) nanoplatform with a long-lifetime Tb complex FRET donor and a near-IR organic fluorophore acceptor; the latter acts as two unique inputs for the QD-based device. The Tb complex allows for a form of temporal memory by providing unique access to a time-delayed modality as an alternate output which significantly increases the inherent computing options. Altering the device by controlling the configuration parameters with biologically based self-assembly provides input control while monitoring changes in emission output of all participants, in both a spectral and temporal-dependent manner, gives rise to two input, single output Boolean Logic operations including OR, AND, INHIBIT, XOR, NOR, NAND, along with the possibility of gate transitions. Incorporation of an enzymatic cleavage step provides for a set-reset function that can be implemented repeatedly with the same building blocks and is demonstrated with single input, single output YES and NOT gates. Potential applications for these devices are discussed in the context of their constituent parts and the richness of available signal.

  13. C code generation from Petri-net-based logic controller specification

    NASA Astrophysics Data System (ADS)

    Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei

    2017-08-01

    The article focuses on the programming of logic controllers. It is important that the program code of a logic controller executes flawlessly, in accordance with the primary specification. In the presented approach, we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal transformation rules ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
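A rule-based logical model of this kind can be pictured as a control-interpreted Petri net reduced to rules of the form "if the preset places are marked and the input guard holds, move the tokens"; each such rule maps directly onto a C `if` block. The sketch below simulates one synchronous step in Python (the two-state net and signal names are illustrative, not the paper's case study):

```python
def step(marking, inputs, transitions):
    """One synchronous step of a rule-based net model: enabledness is judged
    against the current marking, then all enabled transitions fire together.

    marking: set of marked places; inputs: {signal: bool};
    transitions: list of (preset, guard_signals, postset)."""
    enabled = [t for t in transitions
               if t[0] <= marking and all(inputs.get(s, False) for s in t[1])]
    m = set(marking)
    for pre, _guard, post in enabled:
        m = (m - pre) | post
    return m

# Two-state controller: `start` moves the token idle -> run, `stop` back.
transitions = [
    (frozenset({"idle"}), ("start",), frozenset({"run"})),
    (frozenset({"run"}),  ("stop",),  frozenset({"idle"})),
]
print(step({"idle"}, {"start": True}, transitions))  # {'run'}
```

A code generator in the paper's style would emit each `(pre, guard, post)` triple as a C conditional over state bits, which is what keeps the implementation aligned with the model that was model checked.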

  14. Logic as Marr's Computational Level: Four Case Studies.

    PubMed

    Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter

    2015-04-01

    We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition. Copyright © 2014 Cognitive Science Society, Inc.

  15. Automatic Review of Abstract State Machines by Meta Property Verification

    NASA Technical Reports Server (NTRS)

    Arcaini, Paolo; Gargantini, Angelo; Riccobene, Elvinia

    2010-01-01

    A model review is a validation technique aimed at determining whether a model is of sufficient quality; it allows defects to be identified early in system development, reducing the cost of fixing them. In this paper we propose a technique to perform automatic review of Abstract State Machine (ASM) formal specifications. We first identify a family of typical vulnerabilities and defects that a developer can introduce during the modeling activity using ASMs, and we express such faults as violations of meta-properties that guarantee certain quality attributes of the specification. These meta-properties are then mapped to temporal logic formulas and model checked for their violation. As a proof of concept, we also report the results of applying this ASM review process to several specifications.

  16. Knowledge-guided fuzzy logic modeling to infer cellular signaling networks from proteomic data

    PubMed Central

    Liu, Hui; Zhang, Fan; Mishra, Shital Kumar; Zhou, Shuigeng; Zheng, Jie

    2016-01-01

    Modeling of signaling pathways is crucial for understanding and predicting cellular responses to drug treatments. However, canonical signaling pathways curated from literature are seldom context-specific and thus can hardly predict cell type-specific response to external perturbations; purely data-driven methods also have drawbacks such as limited biological interpretability. Therefore, hybrid methods that can integrate prior knowledge and real data for network inference are highly desirable. In this paper, we propose a knowledge-guided fuzzy logic network model to infer signaling pathways by exploiting both prior knowledge and time-series data. In particular, the dynamic time warping algorithm is employed to measure the goodness of fit between experimental and predicted data, so that our method can model temporally-ordered experimental observations. We evaluated the proposed method on a synthetic dataset and two real phosphoproteomic datasets. The experimental results demonstrate that our model can uncover drug-induced alterations in signaling pathways in cancer cells. Compared with existing hybrid models, our method can model feedback loops so that the dynamical mechanisms of signaling networks can be uncovered from time-series data. By calibrating generic models of signaling pathways against real data, our method supports precise predictions of context-specific anticancer drug effects, which is an important step towards precision medicine. PMID:27774993
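Dynamic time warping itself is standard; a minimal implementation of the distance used as the goodness-of-fit measure between predicted and observed time series might look like this (textbook algorithm, not the authors' code):

```python
def dtw(a, b):
    """Dynamic time warping distance between two numeric sequences:
    cost of the cheapest monotone alignment, allowing either sequence
    to 'stretch' by repeating elements."""
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch b
                                 cost[i][j - 1],      # stretch a
                                 cost[i - 1][j - 1])  # advance both
    return cost[n][m]

print(dtw([0, 1, 2], [0, 1, 1, 2]))  # 0.0: the warp absorbs the repeated 1
```

This is why DTW suits temporally-ordered phosphoproteomic observations: it scores the shape of a predicted trajectory against measurements even when responses are shifted or stretched in time.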

  17. Visually defining and querying consistent multi-granular clinical temporal abstractions.

    PubMed

    Combi, Carlo; Oliboni, Barbara

    2012-02-01

    The main goal of this work is to propose a framework for the visual specification and query of consistent multi-granular clinical temporal abstractions. We focus on the issue of querying patient clinical information by visually defining and composing temporal abstractions, i.e., high level patterns derived from several time-stamped raw data. In particular, we focus on the visual specification of consistent temporal abstractions with different granularities and on the visual composition of different temporal abstractions for querying clinical databases. Temporal abstractions on clinical data provide a concise and high-level description of temporal raw data, and a suitable way to support decision making. Granularities define partitions on the time line and allow one to represent time and, thus, temporal clinical information at different levels of detail, according to the requirements coming from the represented clinical domain. The visual representation of temporal information has been considered for several years in clinical domains. Proposed visualization techniques must be easy and quick to understand, and could benefit from visual metaphors that do not lead to ambiguous interpretations. Recently, physical metaphors such as strips, springs, weights, and wires have been proposed and evaluated on clinical users for the specification of temporal clinical abstractions. Visual approaches to boolean queries have been considered in recent years and have confirmed that visual support for the specification of complex boolean queries is both an important and difficult research topic. We propose and describe a visual language for the definition of temporal abstractions based on a set of intuitive metaphors (striped wall, plastered wall, brick wall), allowing the clinician to use different granularities.
A new algorithm, underlying the visual language, allows the physician to specify only consistent abstractions, i.e., abstractions not containing contradictory conditions on the component abstractions. Moreover, we propose a visual query language where different temporal abstractions can be composed to build complex queries: temporal abstractions are visually connected through the usual logical connectives AND, OR, and NOT. The proposed visual language allows one to simply define temporal abstractions by using intuitive metaphors, and to specify temporal intervals related to abstractions by using different temporal granularities. The physician can interact with the designed and implemented tool by point-and-click selections, and can visually compose queries involving several temporal abstractions. The evaluation of the proposed granularity-related metaphors consisted of two parts: (i) solving 30 interpretation exercises by choosing the correct interpretation of a given screenshot representing a possible scenario, and (ii) solving a complex exercise by visually specifying through the interface a scenario described only in natural language. The exercises were done by 13 subjects. The percentages of correct answers to the interpretation exercises differed slightly across the considered metaphors (54.4--striped wall, 73.3--plastered wall, 61--brick wall, and 61--no wall), but post hoc statistical analysis on the means confirmed that the differences were not statistically significant. The results of the user satisfaction questionnaire on the proposed granularity-related metaphors showed no preference for any one of them.
The evaluation of the proposed logical notation also consisted of two parts: (i) solving five interpretation exercises, each based on a screenshot representing a possible scenario and three different possible interpretations, of which only one was correct, and (ii) solving five exercises by visually defining through the interface a scenario described only in natural language. The exercises had increasing difficulty. The evaluation involved a total of 31 subjects. Results of this evaluation phase confirmed the soundness of the proposed solution, even in comparison with a well-known proposal based on a tabular query form (the only significant difference being that our proposal requires more time for the training phase: 21 min versus 14 min). In this work we have considered the issue of visually composing and querying temporal clinical patient data. In this context we have proposed a visual framework for the specification of consistent temporal abstractions with different granularities and for the visual composition of different temporal abstractions to build (possibly) complex queries on clinical databases. A new algorithm has been proposed to check the consistency of the specified granular abstraction. The evaluation of the proposed metaphors and interfaces, and the comparison of the visual query language with a well-known visual method for boolean queries, confirmed the soundness of the overall system; moreover, pros and cons and possible improvements emerged from the comparison of the different visual metaphors and solutions. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. COMPOSE: Using temporal patterns for interpreting wearable sensor data with computer interpretable guidelines.

    PubMed

    Urovi, V; Jimenez-Del-Toro, O; Dubosson, F; Ruiz Torres, A; Schumacher, M I

    2017-02-01

    This paper describes a novel temporal-logic-based framework for reasoning with continuous data collected from wearable sensors. The work is motivated by the Metabolic Syndrome, a cluster of conditions linked to obesity and an unhealthy lifestyle. We assume that, by interpreting the physiological parameters of continuous monitoring, we can identify which patients have a higher risk of Metabolic Syndrome. We define temporal patterns for reasoning with continuous data and specify the coordination mechanisms for combining different sets of clinical guidelines that relate to this condition. The proposed solution is tested with data provided by twenty subjects who used sensors for four days of continuous monitoring. The results are compared to the gold standard. The novelty of the framework lies in extending a temporal logic formalism, namely the Event Calculus, with temporal patterns. These patterns help specify the rules for reasoning with continuous data and combine new knowledge into one consistent outcome that is tailored to the patient's profile. The overall approach opens new possibilities for delivering patient-tailored interventions and educational material before patients present the symptoms of the disease. Copyright © 2016 Elsevier Ltd. All rights reserved.
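The Event Calculus core being extended can be sketched in a few lines (a minimal discrete-time version; the metabolic-flavored fluent and event names are illustrative, not the paper's knowledge base):

```python
def holds_at(fluent, t, history, effects):
    """Minimal Event Calculus: a fluent holds at time t iff the most recent
    effective event before t initiated it, and persists by inertia otherwise.

    history: list of (time, event) pairs;
    effects: {event: ("initiates" | "terminates", fluent)}."""
    state = False
    for time, event in sorted(history):
        if time >= t:
            break
        kind, f = effects.get(event, (None, None))
        if f == fluent:
            state = (kind == "initiates")
    return state

# Illustrative fluent: a post-meal period, initiated by a meal event and
# terminated once glucose returns to baseline.
effects = {"meal": ("initiates", "postprandial"),
           "glucose_back_to_baseline": ("terminates", "postprandial")}
history = [(10, "meal"), (130, "glucose_back_to_baseline")]
print(holds_at("postprandial", 60, history, effects))   # True
print(holds_at("postprandial", 200, history, effects))  # False
```

The paper's temporal patterns sit on top of such a base: rules over continuous sensor streams decide when events like the ones above are deemed to have occurred.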

  19. Runtime verification of embedded real-time systems.

    PubMed

    Reinbacher, Thomas; Függer, Matthias; Brauer, Jörg

    We present a runtime verification framework that allows on-line monitoring of past-time Metric Temporal Logic (ptMTL) specifications in a discrete-time setting. We design observer algorithms for the time-bounded modalities of ptMTL that take advantage of the highly parallel nature of hardware designs. The algorithms can be translated into efficient hardware blocks, which are designed for reconfigurability and thus facilitate applications of the framework in both the prototyping and the post-deployment phases of embedded real-time systems. We provide formal correctness proofs for all presented observer algorithms and analyze their time and space complexity. For example, for the most general operator considered, the time-bounded Since operator, we obtain a time complexity that is doubly logarithmic both in the point in time at which the operator is executed and in the operator's time bounds. This result is promising with respect to a self-contained, non-interfering monitoring approach that evaluates real-time specifications in parallel to the system under test. We implement our framework on a Field Programmable Gate Array platform and use extensive simulation and logic synthesis runs to assess the benefits of the approach in terms of resource usage and operating frequency.
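The time-bounded Since operator has a simple reference semantics that any observer must agree with; a naive quadratic check (ours, for illustration; the paper's observers compute the same predicate in doubly logarithmic time) is:

```python
def since(phi, psi, lo, hi):
    """Value of (phi Since_[lo,hi] psi) at each discrete step n.

    phi, psi: per-step boolean lists. The formula holds at n iff some step m
    with n - hi <= m <= n - lo satisfied psi, and phi held at every step
    after m up to and including n."""
    out = []
    for n in range(len(phi)):
        ok = False
        for m in range(max(0, n - hi), n - lo + 1):
            if psi[m] and all(phi[k] for k in range(m + 1, n + 1)):
                ok = True
                break
        out.append(ok)
    return out

phi = [True, True, True, True]
psi = [False, True, False, False]
print(since(phi, psi, 0, 2))  # [False, True, True, True]
```

The psi-event at step 1 keeps the formula true for steps 1 through 3 (it falls out of the `[0, 2]` window only after step 3), provided phi holds throughout; a gap in phi cuts the chain immediately.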

  20. Monitoring Programs Using Rewriting

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Lan, Sonie (Technical Monitor)

    2001-01-01

    We present a rewriting algorithm for efficiently testing future time Linear Temporal Logic (LTL) formulae on finite execution traces. The standard models of LTL are infinite traces, reflecting the behavior of reactive and concurrent systems which conceptually may be continuously alive. In most past applications of LTL, theorem provers and model checkers have been used to formally prove that down-scaled models satisfy such LTL specifications. Our goal is instead to use LTL for up-scaled testing of real software applications, corresponding to analyzing the conformance of finite traces against LTL formulae. We first describe what it means for a finite trace to satisfy an LTL property and then suggest an optimized algorithm based on transforming LTL formulae. We use the Maude rewriting logic, which turns out to be a good notation, supported by an efficient rewriting engine, for performing these experiments. The work constitutes part of the Java PathExplorer (JPAX) project, the purpose of which is to develop a flexible tool for monitoring Java program executions.
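
    The formula-transformation idea can be sketched as follows: each event rewrites the formula into the obligation that must hold on the remaining trace, and the residual formula is evaluated when the trace ends. This is an illustrative Python rendering under a finite-trace semantics of our choosing, not the paper's actual Maude rules; the data representation and helper names are ours:

    ```python
    # Formulas as nested tuples: ('ap', name), ('true',), ('false',),
    # ('not', f), ('and', f, g), ('or', f, g), ('next', f),
    # ('always', f), ('eventually', f), ('until', f, g).

    def neg(f):
        if f == ('true',):  return ('false',)
        if f == ('false',): return ('true',)
        return ('not', f)

    def conj(f, g):
        if ('false',) in (f, g): return ('false',)
        if f == ('true',): return g
        if g == ('true',): return f
        return ('and', f, g)

    def disj(f, g):
        if ('true',) in (f, g): return ('true',)
        if f == ('false',): return g
        if g == ('false',): return f
        return ('or', f, g)

    def step(f, ev):
        """Rewrite f against the current event ev (a set of true atoms),
        returning the obligation on the remaining trace."""
        t = f[0]
        if t == 'ap':         return ('true',) if f[1] in ev else ('false',)
        if t in ('true', 'false'): return f
        if t == 'not':        return neg(step(f[1], ev))
        if t == 'and':        return conj(step(f[1], ev), step(f[2], ev))
        if t == 'or':         return disj(step(f[1], ev), step(f[2], ev))
        if t == 'next':       return f[1]
        if t == 'always':     return conj(step(f[1], ev), f)        # []f = f /\ X []f
        if t == 'eventually': return disj(step(f[1], ev), f)        # <>f = f \/ X <>f
        if t == 'until':      return disj(step(f[2], ev),
                                          conj(step(f[1], ev), f))  # f U g = g \/ (f /\ X(f U g))

    def final(f):
        """Truth value of a residual obligation once the trace has ended."""
        t = f[0]
        if t == 'true':   return True
        if t == 'not':    return not final(f[1])
        if t == 'and':    return final(f[1]) and final(f[2])
        if t == 'or':     return final(f[1]) or final(f[2])
        if t == 'always': return True   # vacuously holds on the empty remainder
        return False                    # false, ap, next, eventually, until

    def holds(f, trace):
        for ev in trace:
            f = step(f, ev)
        return final(f)
    ```

    Under this convention, `always` defaults to true and `eventually` to false at the end of the trace, which is one common way to give LTL a finite-trace meaning.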

  1. Expanded all-optical programmable logic array based on multi-input/output canonical logic units.

    PubMed

    Lei, Lei; Dong, Jianji; Zou, Bingrong; Wu, Zhao; Dong, Wenchan; Zhang, Xinliang

    2014-04-21

    We present an expanded all-optical programmable logic array (O-PLA) using multi-input and multi-output canonical logic units (CLUs) generation. Based on four-wave mixing (FWM) in highly nonlinear fiber (HNLF), two-input and three-input CLUs are simultaneously achieved in five different channels with an operation speed of 40 Gb/s. Clear temporal waveforms and wide open eye diagrams are successfully observed. The effectiveness of the scheme is validated by extinction ratio and optical signal-to-noise ratio measurements. The computing capacity, defined as the total amount of logic functions achieved by the O-PLA, is discussed in detail. For a three-input O-PLA, the computing capacity of the expanded CLUs-PLA is more than two times as large as that of the standard CLUs-PLA, and this multiple will increase to more than three and a half as the idlers are individually independent.

  2. A Multi-Encoding Approach for LTL Symbolic Satisfiability Checking

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Vardi, Moshe Y.

    2011-01-01

    Formal behavioral specifications written early in the system-design process and communicated across all design phases have been shown to increase the efficiency, consistency, and quality of the system under development. To prevent introducing design or verification errors, it is crucial to test specifications for satisfiability. Our focus here is on specifications expressed in linear temporal logic (LTL). We introduce a novel encoding of symbolic transition-based Büchi automata and a novel, "sloppy," transition encoding, both of which result in improved scalability. We also define novel BDD variable orders based on tree decomposition of formula parse trees. We describe and extensively test a new multi-encoding approach utilizing these novel encoding techniques to create 30 encoding variations. We show that our novel encodings translate to significant, sometimes exponential, improvement over the current standard encoding for symbolic LTL satisfiability checking.

  3. Evaluation of properties over phylogenetic trees using stochastic logics.

    PubMed

    Requeno, José Ignacio; Colom, José Manuel

    2016-06-14

    Model checking has recently been introduced as an integrated framework for extracting information from phylogenetic trees using temporal logics as a querying language, an extension of modal logics that imposes restrictions on a boolean formula along a path of events. The phylogenetic tree is considered a transition system modeling evolution as a sequence of genomic mutations (we understand mutation as the different ways that DNA can be changed), while this kind of logic is suitable for traversing it in a strict and exhaustive way. Given a biological property that we wish to inspect over the phylogeny, the verifier returns true if the specification is satisfied, or a counterexample that falsifies it. However, this approach has only been considered over qualitative aspects of the phylogeny. In this paper, we address the limitations of the previous framework by including and handling quantitative information such as explicit time or probability. To this end, we apply current probabilistic continuous-time extensions of model checking to phylogenetics. We reinterpret a catalog of qualitative properties in a numerical way, and we also present new properties that could not be analyzed before. For instance, we obtain the likelihood of a tree topology according to a mutation model. As a case study, we analyze several phylogenies in order to obtain the maximum likelihood with the model checking tool PRISM. In addition, we have adapted the software to optimize the computation of maximum likelihoods. We have shown that probabilistic model checking is a competitive framework for describing and analyzing quantitative properties over phylogenetic trees. This formalism adds soundness and readability to the definition of models and specifications. Besides, the existence of model checking tools hides the underlying technology, sparing biologists the extension, upgrade, debugging, and maintenance of a software tool. A set of benchmarks justifies the feasibility of our approach.
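
    As an illustration of computing "the likelihood of a tree topology according to a mutation model", the following sketch evaluates a two-leaf tree (a cherry) under the standard Jukes-Cantor (JC69) substitution model with a uniform root distribution. This is an assumption-laden toy, not the paper's models or its PRISM encoding:

    ```python
    import math

    def jc69(t, mu=1.0):
        """Jukes-Cantor transition probabilities along a branch of
        length t at rate mu: returns (P[same base], P[specific other base])."""
        e = math.exp(-4.0 * mu * t / 3.0)
        return 0.25 + 0.75 * e, 0.25 - 0.25 * e

    def two_leaf_likelihood(x, y, t1, t2, mu=1.0):
        """Likelihood of observing nucleotides x and y at the two leaves
        of a cherry with branch lengths t1 and t2, summing over the
        unobserved root state (uniform prior over A, C, G, T)."""
        total = 0.0
        for r in 'ACGT':
            p1 = jc69(t1, mu)[0 if r == x else 1]
            p2 = jc69(t2, mu)[0 if r == y else 1]
            total += 0.25 * p1 * p2
        return total
    ```

    Summing the likelihood over all 16 possible leaf-state pairs gives exactly 1, a useful sanity check on the transition probabilities.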

  4. Linear Temporal Logic (LTL) Based Monitoring of Smart Manufacturing Systems.

    PubMed

    Heddy, Gerald; Huzaifa, Umer; Beling, Peter; Haimes, Yacov; Marvel, Jeremy; Weiss, Brian; LaViers, Amy

    2015-01-01

    The vision of Smart Manufacturing Systems (SMS) includes collaborative robots that can adapt to a range of scenarios. This vision requires a classification of multiple system behaviors, or sequences of movement, that can achieve the same high-level tasks. Likewise, this vision presents unique challenges regarding the management of environmental variables in concert with discrete, logic-based programming. Overcoming these challenges requires targeted performance and health monitoring of both the logical controller and the physical components of the robotic system. Prognostics and health management (PHM) defines a field of techniques and methods that enable condition-monitoring, diagnostics, and prognostics of physical elements, functional processes, overall systems, etc. PHM is warranted in this effort given that the controller is vulnerable to program changes, which propagate in unexpected ways, logical runtime exceptions, sensor failure, and even bit rot. The physical component's health is affected by the wear and tear experienced by machines constantly in motion. The controller's source of faults is inherently discrete, while the latter occurs in a manner that builds up continuously over time. Such a disconnect poses unique challenges for PHM. This paper presents a robotic monitoring system that captures and resolves this disconnect. This effort leverages supervisory robotic control and model checking with linear temporal logic (LTL), presenting them as a novel monitoring system for PHM. This methodology has been demonstrated in a MATLAB-based simulator for an industry inspired use-case in the context of PHM. Future work will use the methodology to develop adaptive, intelligent control strategies to evenly distribute wear on the joints of the robotic arms, maximizing the life of the system.

  5. The Effect of 3D Virtual Reality on Sequential Time Perception among Deaf and Hard-of-Hearing Children

    ERIC Educational Resources Information Center

    Eden, Sigal

    2008-01-01

    Over the years deaf and hard-of-hearing children have been reported as having difficulty with time conception and, in particular, the proper arrangement of events in a logical, temporal order. The research examined whether deaf and hard-of-hearing children perceive a temporal sequence differently under different representational modes. We compared…

  6. Three-Dimensionality as an Effective Mode of Representation for Expressing Sequential Time Perception

    ERIC Educational Resources Information Center

    Eden, Sigal; Passig, David

    2007-01-01

    The process of developing concepts of time continues from age 5 to 11 years (Zakay, 1998). This study sought the representation mode in which children could best express time concepts, especially the proper arrangement of events in a logical and temporal order. Usually, temporal order is examined and taught by 2D (2-dimensional) pictorial scripts.…

  7. Instrumentation of Java Bytecode for Runtime Analysis

    NASA Technical Reports Server (NTRS)

    Goldberg, Allen; Havelund, Klaus

    2003-01-01

    This paper describes JSpy, a system for high-level instrumentation of Java bytecode, and its use with JPaX, our system for runtime analysis of Java programs. JPaX monitors the execution of temporal logic formulas and performs predicative analysis of deadlocks and data races. JSpy's input is an instrumentation specification, which consists of a collection of rules, where a rule is a predicate/action pair. The predicate is a conjunction of syntactic constraints on a Java statement, and the action is a description of logging information to be inserted in the bytecode corresponding to the statement. JSpy is built using JTrek, an instrumentation package at a lower level of abstraction.

  8. Formal Specification of Information Systems Requirements.

    ERIC Educational Resources Information Center

    Kampfner, Roberto R.

    1985-01-01

    Presents a formal model for specification of logical requirements of computer-based information systems that incorporates structural and dynamic aspects based on two separate models: the Logical Information Processing Structure and the Logical Information Processing Network. The model's role in systems development is discussed. (MBR)

  9. Cooperation Among Theorem Provers

    NASA Technical Reports Server (NTRS)

    Waldinger, Richard J.

    1998-01-01

    This is a final report, which supports NASA's PECSEE (Persistent Cognizant Software Engineering Environment) effort and complements the Kestrel Institute project "Inference System Integration via Logic Morphism". The ultimate purpose of the project is to develop a superior logical inference mechanism by combining the diverse abilities of multiple cooperating theorem provers. In many years of research, a number of powerful theorem-proving systems have arisen with differing capabilities and strengths. Resolution theorem provers (such as Kestrel's KITP or SRI's SNARK) deal with first-order logic with equality but not the principle of mathematical induction. The Boyer-Moore theorem prover excels at proof by induction but cannot deal with full first-order logic. Both are highly automated but cannot accept user guidance easily. The PVS system (from SRI) is only automatic within decidable theories, but it has well-designed interactive capabilities; furthermore, it includes higher-order logic, not just first-order logic. The NuPRL system from Cornell University and the STeP system from Stanford University have facilities for constructive logic and temporal logic, respectively; both are interactive. It is often suggested, for example in the anonymous "QED Manifesto", that we should pool the resources of all these theorem provers into a single system, so that the strengths of one can compensate for the weaknesses of others, and so that effort will not be duplicated. However, there is no straightforward way of doing this, because each system relies on its own language and logic for its success. Thus, SNARK uses ordinary first-order logic with equality, PVS uses higher-order logic, and NuPRL uses constructive logic. The purpose of this project, and the companion project at Kestrel, has been to use the category-theoretic notion of logic morphism to combine systems with different logics and languages. Kestrel's SPECWARE system has been the vehicle for the implementation.

  10. A computer method of finding valuations forcing validity of LC formulae

    NASA Astrophysics Data System (ADS)

    Godlewski, Łukasz; Świetorzecka, Kordula; Mulawka, Jan

    2014-11-01

    The purpose of this paper is to present the computer implementation of a system known as LC temporal logic [1]. Firstly, to become familiar with some theoretical issues, a short introduction to this logic is given. Algorithms allowing a deep analysis of the formulae of LC logic are considered. In particular we discuss how to determine whether a formula is a tautology, a countertautology (i.e., a contradiction), or satisfiable. Next, we show how to find all valuations that satisfy the formula. Finally, we consider finding the histories generated by the formula and transforming these histories into a state machine. Moreover, the experiments that verify the implementation are briefly described.
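
    The exhaustive-valuation analysis described above has a simple classical-propositional core, sketched here in Python. LC's temporal machinery (histories, state machines) is beyond this illustration, and the function names are ours:

    ```python
    from itertools import product

    def classify(formula, variables):
        """Exhaustively check every valuation to decide whether `formula`
        (a function from a {var: bool} dict to bool) is a tautology,
        a countertautology (contradiction), or merely satisfiable;
        also return all satisfying valuations."""
        models, countermodels = [], []
        for vals in product([False, True], repeat=len(variables)):
            v = dict(zip(variables, vals))
            (models if formula(v) else countermodels).append(v)
        if not countermodels:
            return 'tautology', models
        if not models:
            return 'contradiction', models
        return 'satisfiable', models
    ```

    The search is exponential in the number of variables, which is why practical systems prune or structure the valuation space rather than enumerate it naively.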

  11. Beyond the visual word form area: the orthography-semantics interface in spelling and reading.

    PubMed

    Purcell, Jeremy J; Shea, Jennifer; Rapp, Brenda

    2014-01-01

    Lexical orthographic information provides the basis for recovering the meanings of words in reading and for generating correct word spellings in writing. Research has provided evidence that an area of the left ventral temporal cortex, a subregion of what is often referred to as the visual word form area (VWFA), plays a significant role specifically in lexical orthographic processing. The current investigation goes beyond this previous work by examining the neurotopography of the interface of lexical orthography with semantics. We apply a novel lesion mapping approach with three individuals with acquired dysgraphia and dyslexia who suffered lesions to left ventral temporal cortex. To map cognitive processes to their neural substrates, this lesion mapping approach applies similar logical constraints to those used in cognitive neuropsychological research. Using this approach, this investigation: (a) identifies a region anterior to the VWFA that is important in the interface of orthographic information with semantics for reading and spelling; (b) determines that, within this orthography-semantics interface region (OSIR), access to orthography from semantics (spelling) is topographically distinct from access to semantics from orthography (reading); (c) provides evidence that, within this region, there is modality-specific access to and from lexical semantics for both spoken and written modalities, in both word production and comprehension. Overall, this study contributes to our understanding of the neural architecture at the lexical orthography-semantic-phonological interface within left ventral temporal cortex.

  12. Beyond the VWFA: The orthography-semantics interface in spelling and reading

    PubMed Central

    Purcell, Jeremy J.; Shea, Jennifer; Rapp, Brenda

    2014-01-01

    Lexical orthographic information provides the basis for recovering the meanings of words in reading and for generating correct word spellings in writing. Research has provided evidence that an area of the left ventral temporal cortex, a sub-region of what is often referred to as the Visual Word Form Area (VWFA), plays a significant role specifically in lexical orthographic processing. The current investigation goes beyond this previous work by examining the neurotopography of the interface of lexical orthography with semantics. We apply a novel lesion mapping approach with three individuals with acquired dysgraphia and dyslexia who suffered lesions to left ventral temporal cortex. To map cognitive processes to their neural substrates, this lesion mapping approach applies similar logical constraints as used in cognitive neuropsychological research. Using this approach, this investigation: (1) Identifies a region anterior to the VWFA that is important in the interface of orthographic information with semantics for reading and spelling; (2) Determines that, within this Orthography-Semantics Interface Region (OSIR), access to orthography from semantics (spelling) is topographically distinct from access to semantics from orthography (reading); (3) Provides evidence that, within this region, there is modality-specific access to and from lexical semantics for both spoken and written modalities, in both word production and comprehension. Overall, this study contributes to our understanding of the neural architecture at the lexical orthography-semantic-phonological interface within left ventral temporal cortex. PMID:24833190

  13. Linear Temporal Logic (LTL) Based Monitoring of Smart Manufacturing Systems

    PubMed Central

    Heddy, Gerald; Huzaifa, Umer; Beling, Peter; Haimes, Yacov; Marvel, Jeremy; Weiss, Brian; LaViers, Amy

    2017-01-01

    The vision of Smart Manufacturing Systems (SMS) includes collaborative robots that can adapt to a range of scenarios. This vision requires a classification of multiple system behaviors, or sequences of movement, that can achieve the same high-level tasks. Likewise, this vision presents unique challenges regarding the management of environmental variables in concert with discrete, logic-based programming. Overcoming these challenges requires targeted performance and health monitoring of both the logical controller and the physical components of the robotic system. Prognostics and health management (PHM) defines a field of techniques and methods that enable condition-monitoring, diagnostics, and prognostics of physical elements, functional processes, overall systems, etc. PHM is warranted in this effort given that the controller is vulnerable to program changes, which propagate in unexpected ways, logical runtime exceptions, sensor failure, and even bit rot. The physical component’s health is affected by the wear and tear experienced by machines constantly in motion. The controller’s source of faults is inherently discrete, while the latter occurs in a manner that builds up continuously over time. Such a disconnect poses unique challenges for PHM. This paper presents a robotic monitoring system that captures and resolves this disconnect. This effort leverages supervisory robotic control and model checking with linear temporal logic (LTL), presenting them as a novel monitoring system for PHM. This methodology has been demonstrated in a MATLAB-based simulator for an industry inspired use-case in the context of PHM. Future work will use the methodology to develop adaptive, intelligent control strategies to evenly distribute wear on the joints of the robotic arms, maximizing the life of the system. PMID:28730154

  14. Experiments with Test Case Generation and Runtime Analysis

    NASA Technical Reports Server (NTRS)

    Artho, Cyrille; Drusinsky, Doron; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Rosu, Grigore; Visser, Willem; Koga, Dennis (Technical Monitor)

    2003-01-01

    Software testing is typically an ad hoc process where human testers manually write many test inputs and expected test results, perhaps automating their execution in a regression suite. This process is cumbersome and costly. This paper reports preliminary results on an approach to further automate this process. The approach consists of combining automated test case generation, based on systematically exploring the program's input domain, with runtime analysis, where execution traces are monitored and verified against temporal logic specifications, or analyzed using advanced algorithms for detecting concurrency errors such as data races and deadlocks. The approach suggests generating specifications dynamically per input instance rather than statically once-and-for-all. The paper describes experiments with variants of this approach in the context of two examples, a planetary rover controller and a spacecraft fault protection system.

  15. Cerebral Glucose Metabolism is Associated with Verbal but not Visual Memory Performance in Community-Dwelling Older Adults.

    PubMed

    Gardener, Samantha L; Sohrabi, Hamid R; Shen, Kai-Kai; Rainey-Smith, Stephanie R; Weinborn, Michael; Bates, Kristyn A; Shah, Tejal; Foster, Jonathan K; Lenzo, Nat; Salvado, Olivier; Laske, Christoph; Laws, Simon M; Taddei, Kevin; Verdile, Giuseppe; Martins, Ralph N

    2016-03-31

    Increasing evidence suggests that Alzheimer's disease (AD) sufferers show region-specific reductions in cerebral glucose metabolism, as measured by [18F]-fluoro-2-deoxyglucose positron emission tomography (18F-FDG PET). We investigated preclinical disease stage by cross-sectionally examining the association between global cognition, verbal and visual memory, and 18F-FDG PET standardized uptake value ratio (SUVR) in 43 healthy control individuals, subsequently focusing on differences between subjective memory complainers and non-memory complainers. The 18F-FDG PET regions of interest investigated include the hippocampus, amygdala, posterior cingulate, superior parietal, entorhinal cortices, frontal cortex, temporal cortex, and inferior parietal region. In the cohort as a whole, verbal logical memory immediate recall was positively associated with 18F-FDG PET SUVR in both the left hippocampus and right amygdala. There were no associations observed between global cognition, delayed recall in logical memory, or visual reproduction and 18F-FDG PET SUVR. Following stratification of the cohort into subjective memory complainers and non-complainers, verbal logical memory immediate recall was positively associated with 18F-FDG PET SUVR in the right amygdala in those with subjective memory complaints. There were no significant associations observed in non-memory complainers between 18F-FDG PET SUVR in regions of interest and cognitive performance. We observed subjective memory complaint-specific associations between 18F-FDG PET SUVR and immediate verbal memory performance in our cohort; however, we found no associations with delayed recall of verbal memory or with visual memory performance. It is argued here that the neural mechanisms underlying verbal and visual memory performance may in fact differ in their pathways, and that the characteristic reduction of 18F-FDG PET SUVR observed in this and previous studies likely reflects the pathophysiological changes in specific brain regions that occur in preclinical AD.

  16. Microfluidic Pneumatic Logic Circuits and Digital Pneumatic Microprocessors for Integrated Microfluidic Systems

    PubMed Central

    Rhee, Minsoung

    2010-01-01

    We have developed pneumatic logic circuits and microprocessors built with microfluidic channels and valves in polydimethylsiloxane (PDMS). The pneumatic logic circuits perform various combinational and sequential logic calculations with binary pneumatic signals (atmosphere and vacuum), producing cascadable outputs based on Boolean operations. A complex microprocessor is constructed from combinations of various logic circuits and receives pneumatically encoded serial commands at a single input line. The device then decodes the temporal command sequence by spatial parallelization, computes the necessary logic calculations between parallelized command bits, stores command information for signal transportation and maintenance, and finally executes the command for the target devices. Thus, such pneumatic microprocessors will function as a universal on-chip control platform to perform complex parallel operations for large-scale integrated microfluidic devices. To demonstrate the working principles, we have built 2-bit, 3-bit, 4-bit, and 8-bit microprocessors to control various target devices for applications such as four-color dye mixing and multiplexed channel fluidic control. By significantly reducing the need for external controllers, the digital pneumatic microprocessor can be used as a universal on-chip platform to autonomously manipulate microfluids in a high-throughput manner. PMID:19823730

  17. Simulation and Verification of Synchronous Set Relations in Rewriting Logic

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Munoz, Cesar A.

    2011-01-01

    This paper presents a mathematical foundation and a rewriting logic infrastructure for the execution and property verification of synchronous set relations. The mathematical foundation is given in the language of abstract set relations. The infrastructure consists of an order-sorted rewrite theory in Maude, a rewriting logic system, that enables the synchronous execution of a set relation provided by the user. By using the infrastructure, existing algorithm verification techniques already available in Maude for traditional asynchronous rewriting, such as reachability analysis and model checking, are automatically available to synchronous set rewriting. The use of the infrastructure is illustrated with an executable operational semantics of a simple synchronous language and the verification of temporal properties of a synchronous system.
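
    The idea of synchronously rewriting every element of a set, with bounded reachability analysis layered on top, can be sketched as follows. This is a Python toy with hypothetical guard/action rules, not the Maude infrastructure itself:

    ```python
    def sync_step(state, rules):
        """One synchronous step: every element of `state` is rewritten
        simultaneously by the first applicable (guard, action) rule,
        or left unchanged if no guard matches."""
        def rewrite(x):
            for guard, action in rules:
                if guard(x):
                    return action(x)
            return x
        return frozenset(rewrite(x) for x in state)

    def reachable(state, rules, depth):
        """Bounded reachability analysis: the set of states reachable
        from `state` within `depth` synchronous steps."""
        seen = {state}
        frontier = {state}
        for _ in range(depth):
            frontier = {sync_step(s, rules) for s in frontier} - seen
            seen |= frontier
        return seen
    ```

    For example, with a single rule halving every even element, the state {4, 6, 3} steps synchronously to {2, 3}, and from {4} the bounded search discovers {4}, {2}, and {1}.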

  18. Java PathExplorer: A Runtime Verification Tool

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2001-01-01

    We describe recent work on designing an environment called Java PathExplorer for monitoring the execution of Java programs. This environment facilitates the testing of execution traces against high level specifications, including temporal logic formulae. In addition, it contains algorithms for detecting classical error patterns in concurrent programs, such as deadlocks and data races. An initial prototype of the tool has been applied to the executive module of the planetary Rover K9, developed at NASA Ames. In this paper we describe the background and motivation for the development of this tool, including comments on how it relates to formal methods tools as well as to traditional testing, and we then present the tool itself.

  19. A database perspective of the transition from single-use (ancillary-based) systems to integrated models supporting clinical care and research in a MUMPS-based system.

    PubMed

    Siegel, J; Kirkland, D

    1991-01-01

    The Composite Health Care System (CHCS), a MUMPS-based hospital information system (HIS), has evolved from the Decentralized Hospital Computer Program (DHCP) installed within VA Hospitals. The authors explore the evolution of an ancillary-based system toward an integrated model with a look at its current state and possible future. The history and relationships between orders of different types tie specific patient-related data into a logical and temporal model. Diagrams demonstrate how the database structure has evolved to support clinical needs for integration. It is suggested that a fully integrated model is capable of meeting traditional HIS needs.

  20. An Innovative Infrastructure with a Universal Geo-Spatiotemporal Data Representation Supporting Cost-Effective Integration of Diverse Earth Science Data

    NASA Technical Reports Server (NTRS)

    Rilee, Michael Lee; Kuo, Kwo-Sen

    2017-01-01

    The SpatioTemporal Adaptive Resolution Encoding (STARE) is a unifying scheme encoding geospatial and temporal information for organizing data on scalable computing/storage resources, minimizing expensive data transfers. STARE provides a compact representation that turns set-logic functions into integer operations, e.g. conditional sub-setting, taking into account representative spatiotemporal resolutions of the data in the datasets. STARE geo-spatiotemporally aligns data placements of diverse data on massive parallel resources to maximize performance. Automating important scientific functions (e.g. regridding) and computational functions (e.g. data placement) allows scientists to focus on domain-specific questions instead of expending their efforts and expertise on data processing. With STARE-enabled automation, SciDB (Scientific Database) plus STARE provides a database interface, reducing costly data preparation, increasing the volume and variety of interoperable data, and easing result sharing. Using SciDB plus STARE as part of an integrated analysis infrastructure dramatically eases combining diametrically different datasets.
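
    STARE's actual encoding is a hierarchical spherical scheme, but the general idea of a spatial integer key that turns region subsetting into integer operations can be illustrated with the classic Morton (Z-order) interleaving. This is an analogy only, not STARE's algorithm:

    ```python
    def morton2(x, y, bits=16):
        """Interleave the bits of non-negative integers x and y into a
        single integer key, so that points inside a power-of-two quadtree
        cell occupy one contiguous key range (enabling integer range
        queries for spatial subsetting)."""
        key = 0
        for i in range(bits):
            key |= ((x >> i) & 1) << (2 * i)
            key |= ((y >> i) & 1) << (2 * i + 1)
        return key
    ```

    For the 4x4 grid, the sixteen keys are exactly 0..15, and each 2x2 sub-cell maps to a contiguous block of four keys, which is what lets set-logic conditions become integer comparisons.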

  1. Improving the human readability of Arden Syntax medical logic modules using a concept-oriented terminology and object-oriented programming expressions.

    PubMed

    Choi, Jeeyae; Bakken, Suzanne; Lussier, Yves A; Mendonça, Eneida A

    2006-01-01

    Medical logic modules are a procedural representation for sharing task-specific knowledge for decision support systems. Based on the premise that clinicians may perceive object-oriented expressions as easier to read than procedural rules in Arden Syntax-based medical logic modules, we developed a method for improving the readability of medical logic modules. Two approaches were applied: exploiting the concept-oriented features of the Medical Entities Dictionary and building an executable Java program to replace Arden Syntax procedural expressions. The usability evaluation showed that 66% of participants successfully mapped all Arden Syntax rules to Java methods. These findings suggest that these approaches can play an essential role in the creation of human readable medical logic modules and can potentially increase the number of clinical experts who are able to participate in the creation of medical logic modules. Although our approaches are broadly applicable, we specifically discuss the relevance to concept-oriented nursing terminologies and automated processing of task-specific nursing knowledge.

  2. An efficient temporal database design method based on EER

    NASA Astrophysics Data System (ADS)

    Liu, Zhi; Huang, Jiping; Miao, Hua

    2007-12-01

    Many existing methods of modeling temporal information are based on the logical model, which makes relational schema optimization more difficult and more complicated. In this paper, based on the conventional EER model, the authors attempt to analyse and abstract temporal information in the conceptual modelling phase according to the concrete requirements for history information. A temporal data model named BTEER is then presented. BTEER not only retains all the designing ideas and methods of EER, which gives BTEER good upward compatibility, but also effectively supports the modelling of valid time and transaction time at the same time. In addition, BTEER can be transformed to EER easily and automatically. Practice has shown that this method models temporal information well.

  3. Management of Temporal Constraints for Factory Scheduling.

    DTIC Science & Technology

    1987-06-01

    consistency of scheduling decisions were implemented in both the ISIS [Fox 84] and SOJA [LePape 85a] scheduling systems. More recent work with the...kinds of time propagation systems: the symbolic and the numeric ones. Symbolic systems combine relationships with a temporal logic a la Allen [Allen 81...maintains consistency by narrowing time windows associated with activities as decisions are made, and SOJA [LePape 85b] guarantees a schedule’s

  4. Brain Computation Is Organized via Power-of-Two-Based Permutation Logic.

    PubMed

    Xie, Kun; Fox, Grace E; Liu, Jun; Lyu, Cheng; Lee, Jason C; Kuang, Hui; Jacobs, Stephanie; Li, Meng; Liu, Tianming; Song, Sen; Tsien, Joe Z

    2016-01-01

    There is considerable scientific interest in understanding how cell assemblies-the long-presumed computational motif-are organized so that the brain can generate intelligent cognition and flexible behavior. The Theory of Connectivity proposes that the origin of intelligence is rooted in a power-of-two-based permutation logic ( N = 2 i -1), producing specific-to-general cell-assembly architecture capable of generating specific perceptions and memories, as well as generalized knowledge and flexible actions. We show that this power-of-two-based permutation logic is widely used in cortical and subcortical circuits across animal species and is conserved for the processing of a variety of cognitive modalities including appetitive, emotional and social information. However, modulatory neurons, such as dopaminergic (DA) neurons, use a simpler logic despite their distinct subtypes. Interestingly, this specific-to-general permutation logic remained largely intact although NMDA receptors-the synaptic switch for learning and memory-were deleted throughout adulthood, suggesting that the logic is developmentally pre-configured. Moreover, this computational logic is implemented in the cortex via combining a random-connectivity strategy in superficial layers 2/3 with nonrandom organizations in deep layers 5/6. This randomness of layers 2/3 cliques-which preferentially encode specific and low-combinatorial features and project inter-cortically-is ideal for maximizing cross-modality novel pattern-extraction, pattern-discrimination and pattern-categorization using sparse code, consequently explaining why it requires hippocampal offline-consolidation. In contrast, the nonrandomness in layers 5/6-which consists of few specific cliques but a higher portion of more general cliques projecting mostly to subcortical systems-is ideal for feedback-control of motivation, emotion, consciousness and behaviors. 
These observations suggest that the brain's basic computational algorithm is indeed organized by the power-of-two-based permutation logic. This simple mathematical logic can account for brain computation across the entire evolutionary spectrum, ranging from the simplest neural networks to the most complex.
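
    The clique count at the heart of the abstract is easy to state concretely: for i distinct inputs, the nonempty input combinations number N = 2^i - 1, ranging from the most specific (one input) to the most general (all inputs). A minimal Python sketch of this enumeration (illustrative only; the function name is ours, not the authors'):

```python
from itertools import combinations

def assembly_cliques(inputs):
    """All nonempty combinations of the given inputs: the N = 2**i - 1
    cell-assembly cliques predicted by the specific-to-general architecture."""
    return [c for k in range(1, len(inputs) + 1)
              for c in combinations(inputs, k)]

cliques = assembly_cliques(["sweet", "salty", "bitter", "sour"])
assert len(cliques) == 2**4 - 1           # N = 15 cliques for i = 4 inputs
assert ("sweet",) in cliques              # most specific cliques ...
assert ("sweet", "salty", "bitter", "sour") in cliques  # ... to most general
```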

  5. Brain Computation Is Organized via Power-of-Two-Based Permutation Logic

    PubMed Central

    Xie, Kun; Fox, Grace E.; Liu, Jun; Lyu, Cheng; Lee, Jason C.; Kuang, Hui; Jacobs, Stephanie; Li, Meng; Liu, Tianming; Song, Sen; Tsien, Joe Z.

    2016-01-01

    There is considerable scientific interest in understanding how cell assemblies—the long-presumed computational motif—are organized so that the brain can generate intelligent cognition and flexible behavior. The Theory of Connectivity proposes that the origin of intelligence is rooted in a power-of-two-based permutation logic (N = 2^i - 1), producing specific-to-general cell-assembly architecture capable of generating specific perceptions and memories, as well as generalized knowledge and flexible actions. We show that this power-of-two-based permutation logic is widely used in cortical and subcortical circuits across animal species and is conserved for the processing of a variety of cognitive modalities including appetitive, emotional and social information. However, modulatory neurons, such as dopaminergic (DA) neurons, use a simpler logic despite their distinct subtypes. Interestingly, this specific-to-general permutation logic remained largely intact although NMDA receptors—the synaptic switch for learning and memory—were deleted throughout adulthood, suggesting that the logic is developmentally pre-configured. Moreover, this computational logic is implemented in the cortex via combining a random-connectivity strategy in superficial layers 2/3 with nonrandom organizations in deep layers 5/6. This randomness of layers 2/3 cliques—which preferentially encode specific and low-combinatorial features and project inter-cortically—is ideal for maximizing cross-modality novel pattern-extraction, pattern-discrimination and pattern-categorization using sparse code, consequently explaining why it requires hippocampal offline-consolidation. In contrast, the nonrandomness in layers 5/6—which consists of few specific cliques but a higher portion of more general cliques projecting mostly to subcortical systems—is ideal for feedback-control of motivation, emotion, consciousness and behaviors.
These observations suggest that the brain’s basic computational algorithm is indeed organized by the power-of-two-based permutation logic. This simple mathematical logic can account for brain computation across the entire evolutionary spectrum, ranging from the simplest neural networks to the most complex. PMID:27895562

  6. Predit: A temporal predictive framework for scheduling systems

    NASA Technical Reports Server (NTRS)

    Paolucci, E.; Patriarca, E.; Sem, M.; Gini, G.

    1992-01-01

    Scheduling can be formalized as a Constraint Satisfaction Problem (CSP). Within this framework, activities belonging to a plan are interconnected via temporal constraints that account for slack among them. Temporal representation must include methods for constraint propagation and provide a logic for symbolic and numerical deductions. In this paper we describe a support framework for opportunistic reasoning in constraint-directed scheduling. In order to focus the attention of an incremental scheduler on critical problem aspects, some discrete temporal indexes are presented. They are also useful for the prediction of the degree of resource contention. The predictive method expressed through our indexes can be seen as a Knowledge Source for an opportunistic scheduler with a blackboard architecture.
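
    Propagation over such temporal slack can be illustrated with a simple temporal network (STN), where binary difference constraints are tightened into an all-pairs shortest-path matrix and an inconsistent (over-constrained) plan shows up as a negative cycle. This is a generic sketch of a standard technique, not Predit's actual implementation:

```python
import itertools

INF = float("inf")

def propagate(n, constraints):
    """All-pairs tightening of a simple temporal network (STN).
    constraints maps (i, j) -> w, meaning t_j - t_i <= w.
    Returns the minimal distance matrix, or None if inconsistent."""
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for (i, j), w in constraints.items():
        d[i][j] = min(d[i][j], w)
    for k, i, j in itertools.product(range(n), repeat=3):  # Floyd-Warshall
        if d[i][k] + d[k][j] < d[i][j]:
            d[i][j] = d[i][k] + d[k][j]
    if any(d[i][i] < 0 for i in range(n)):
        return None  # negative cycle: no consistent schedule exists
    return d

# Two activity start times, separated by at least 2 and at most 5 units:
d = propagate(2, {(0, 1): 5, (1, 0): -2})
assert d is not None and d[0][1] == 5 and d[1][0] == -2  # slack window [2, 5]
assert propagate(2, {(0, 1): 3, (1, 0): -4}) is None     # over-constrained
```

The width of the interval [-d[1][0], d[0][1]] is exactly the slack between the two activities that an incremental scheduler can monitor.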

  7. A logical foundation for representation of clinical data.

    PubMed Central

    Campbell, K E; Das, A K; Musen, M A

    1994-01-01

    OBJECTIVE: A general framework for representation of clinical data that provides a declarative semantics of terms and that allows developers to define explicitly the relationships among both terms and combinations of terms. DESIGN: Use of conceptual graphs as a standard representation of logic and of an existing standardized vocabulary, the Systematized Nomenclature of Medicine (SNOMED International), for lexical elements. Concepts such as time, anatomy, and uncertainty must be modeled explicitly in a way that allows relation of these foundational concepts to surface-level clinical descriptions in a uniform manner. RESULTS: The proposed framework was used to model a simple radiology report, which included temporal references. CONCLUSION: Formal logic provides a framework for formalizing the representation of medical concepts. Actual implementations will be required to evaluate the practicality of this approach. PMID:7719805

  8. Conceptual model of comprehensive research metrics for improved human health and environment.

    PubMed

    Engel-Cox, Jill A; Van Houten, Bennett; Phelps, Jerry; Rose, Shyanika W

    2008-05-01

    Federal, state, and private research agencies and organizations have faced increasing administrative and public demand for performance measurement. Historically, performance measurement predominantly consisted of near-term outputs measured through bibliometrics. The recent focus is on accountability for investment based on long-term outcomes. Developing measurable outcome-based metrics for research programs has been particularly challenging, because of difficulty linking research results to spatially and temporally distant outcomes. Our objective in this review is to build a logic model and associated metrics through which to measure the contribution of environmental health research programs to improvements in human health, the environment, and the economy. We used expert input and literature research on research impact assessment. With these sources, we developed a logic model that defines the components and linkages between extramural environmental health research grant programs and the outputs and outcomes related to health and social welfare, environmental quality and sustainability, economics, and quality of life. The logic model focuses on the environmental health research portfolio of the National Institute of Environmental Health Sciences (NIEHS) Division of Extramural Research and Training. The model delineates pathways for contributions by five types of institutional partners in the research process: NIEHS, other government (federal, state, and local) agencies, grantee institutions, business and industry, and community partners. The model is being applied to specific NIEHS research applications and the broader research community. We briefly discuss two examples and discuss the strengths and limits of outcome-based evaluation of research programs.

  9. Towards the formal specification of the requirements and design of a processor interface unit: HOL listings

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    This technical report contains the HOL listings of the specification of the design and major portions of the requirements for a commercially developed processor interface unit (or PIU). The PIU is an interface chip performing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system. This system, the Fault-Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. This report contains the actual HOL listings of the PIU specification as it currently exists. Section two of this report contains general-purpose HOL theories that support the PIU specification. These theories include definitions for the hardware components used in the PIU, our implementation of bit words, and our implementation of temporal logic. Section three contains the HOL listings for the PIU design specification. Aside from the PIU internal bus (I-Bus), this specification is complete. Section four contains the HOL listings for a major portion of the PIU requirements specification. Specifically, it contains most of the definition for the PIU behavior associated with memory accesses initiated by the local processor.

  10. Interpreting Abstract Interpretations in Membership Equational Logic

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd; Rosu, Grigore

    2001-01-01

    We present a logical framework in which abstract interpretations can be naturally specified and then verified. Our approach is based on membership equational logic, which extends equational logic with membership axioms asserting that a term has a certain sort. We represent an abstract interpretation as a membership equational logic specification, usually as an overloaded order-sorted signature with membership axioms. It turns out that, for any term, its least sort over this specification corresponds to its most concrete abstract value. Maude implements membership equational logic and provides mechanisms to calculate the least sort of a term efficiently. We first show how Maude can be used to prototype abstract interpretations "for free." Building on the meta-logic facilities of Maude, we further develop a tool that automatically checks an abstract interpretation against a set of user-defined properties. This can be used to select an appropriate abstract interpretation, to characterize the specified loss of information during abstraction, and to compare different abstractions with each other.
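
    The "least sort as most concrete abstract value" idea can be mimicked outside Maude with a toy sign hierarchy Zero < Nat < Int and Zero < Neg < Int. The sketch below is only an illustration of the idea in Python, not membership equational logic syntax, and the sort names are ours:

```python
# A tiny order-sorted "sign" hierarchy: Zero < Nat < Int and Zero < Neg < Int.
# The least sort of an expression plays the role of its most concrete
# abstract value, loosely mimicking least-sort computation in Maude.

def least_sort(t):
    """Least sort of an integer expression built from literals and '+'."""
    if isinstance(t, int):
        return "Zero" if t == 0 else ("Nat" if t > 0 else "Neg")
    op, a, b = t                      # t = ("+", left, right)
    sa, sb = least_sort(a), least_sort(b)
    if sa == "Zero":
        return sb                     # Zero is neutral for +
    if sb == "Zero":
        return sa
    return sa if sa == sb else "Int"  # mixed signs: only Int is safe

assert least_sort(("+", 3, ("+", 0, 5))) == "Nat"
assert least_sort(("+", 3, -4)) == "Int"
assert least_sort(("+", 0, 0)) == "Zero"
```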

  11. Study of Discussion Record Analysis Using Temporal Data Crystallization and Its Application to TV Scene Analysis

    DTIC Science & Technology

    2015-03-31

    analysis. For scene analysis, we use Temporal Data Crystallization (TDC), and for logical analysis, we use Speech Act theory and Toulmin Argumentation...utterance in the discussion record. (i) An utterance ID, and a speaker ID (ii) Speech acts (iii) Argument structure Speech act denotes...mediator is expected to use more OQs than CQs. When the speech act of an utterance is an argument, furthermore, we recognize the conclusion part

  12. A Rewriting-Based Approach to Trace Analysis

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We present a rewriting-based algorithm for efficiently evaluating future-time Linear Temporal Logic (LTL) formulae on finite execution traces online. While the standard models of LTL are infinite traces, finite traces appear naturally when testing and/or monitoring real applications that only run for limited time periods. The presented algorithm is implemented in the Maude executable specification language and essentially consists of a set of equations establishing an executable semantics of LTL using a simple formula-transforming approach. The algorithm is further improved to build automata on-the-fly from formulae, using memoization. The result is a very efficient and small Maude program that can be used to monitor program executions. We furthermore present an alternative algorithm for synthesizing provably minimal observer finite state machines (or automata) from LTL formulae, which can be used to analyze execution traces without the need for a rewriting system, and can hence be used by observers written in conventional programming languages. The presented work is part of an ambitious runtime verification and monitoring project at NASA Ames, called PATHEXPLORER, and demonstrates that rewriting can be a tractable and attractive means for experimenting with and implementing program monitoring logics.
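
    The formula-transforming idea can be illustrated by formula progression: each event rewrites the formula into the obligation left on the rest of the trace. The paper's implementation is a Maude rewriting system; the Python sketch below only mirrors the idea, with a simplified operator set and one particular finite-trace semantics (assumptions, not the paper's exact definitions):

```python
# LTL formulas as nested tuples: ("atom", p), ("not", f), ("and", f, g),
# ("or", f, g), ("next", f), ("until", f, g); True/False are literal formulas.

def step(f, event):
    """Rewrite formula f against one event (a set of atomic propositions),
    returning the obligation left on the rest of the trace."""
    if isinstance(f, bool):
        return f
    op = f[0]
    if op == "atom":
        return f[1] in event
    if op == "not":
        r = step(f[1], event)
        return (not r) if isinstance(r, bool) else ("not", r)
    if op in ("and", "or"):
        a, b = step(f[1], event), step(f[2], event)
        if op == "and":
            if a is False or b is False:
                return False
            return b if a is True else (a if b is True else ("and", a, b))
        if a is True or b is True:
            return True
        return b if a is False else (a if b is False else ("or", a, b))
    if op == "next":
        return f[1]
    # f1 U f2  ==  f2 or (f1 and X(f1 U f2))
    return step(("or", f[2], ("and", f[1], ("next", f))), event)

def final(f):
    """Evaluate the residual obligation at the end of a finite trace."""
    if isinstance(f, bool):
        return f
    op = f[0]
    if op in ("atom", "next"):
        return False            # nothing left to satisfy a strong demand
    if op == "not":
        return not final(f[1])
    if op == "and":
        return final(f[1]) and final(f[2])
    if op == "or":
        return final(f[1]) or final(f[2])
    return final(f[2])          # "until": the goal must hold by now

def holds(f, trace):
    for e in trace:
        f = step(f, e)
    return final(f)

eventually_ack = ("until", True, ("atom", "ack"))
assert holds(eventually_ack, [{"req"}, set(), {"ack"}])
assert not holds(eventually_ack, [{"req"}, set()])
assert holds(("not", eventually_ack), [{"req"}, set()])  # "never ack"
```

Memoizing `step` on (formula, event) pairs is exactly what turns this into the on-the-fly automaton construction the abstract mentions.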

  13. A framework for qualitative reasoning about solid objects

    NASA Technical Reports Server (NTRS)

    Davis, E.

    1987-01-01

    Predicting the behavior of a qualitatively described system of solid objects requires a combination of geometrical, temporal, and physical reasoning. Methods based upon formulating and solving differential equations are not adequate for robust prediction, since the behavior of a system over extended time may be much simpler than its behavior over local time. A first-order logic, in which one can state simple physical problems and derive their solution deductively, without recourse to solving the differential equations, is discussed. This logic is substantially more expressive and powerful than any previous AI representational system in this domain.

  14. The Everett-Wheeler interpretation and the open future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sudbery, Anthony

    2011-03-28

    I discuss the meaning of probability in the Everett-Wheeler interpretation of quantum mechanics, together with the problem of defining histories. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. In short, probability is degree of truth. These ideas relate to traditional naive ideas of time and chance. Indeed, I argue that Everettian quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.

  15. Development of a methodology for assessing the safety of embedded software systems

    NASA Technical Reports Server (NTRS)

    Garrett, C. J.; Guarro, S. B.; Apostolakis, G. E.

    1993-01-01

    A Dynamic Flowgraph Methodology (DFM) based on an integrated approach to modeling and analyzing the behavior of software-driven embedded systems for assessing and verifying reliability and safety is discussed. DFM is based on an extension of the Logic Flowgraph Methodology to incorporate state transition models. System models which express the logic of the system in terms of causal relationships between physical variables and temporal characteristics of software modules are analyzed to determine how a certain state can be reached. This is done by developing timed fault trees which take the form of logical combinations of static trees relating the system parameters at different points in time. The resulting information concerning the hardware and software states can be used to eliminate unsafe execution paths and identify testing criteria for safety-critical software functions.

  16. TraceContract

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Barringer, Howard

    2012-01-01

    TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events, and can, for example, be generated by a running program, instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics. These include Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a combination of data-parameterized state machines and temporal logic, which is novel. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
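
    The flavor of such a log-analysis monitor can be suggested in a few lines. This is an illustrative Python sketch only; the real TraceContract API is a Scala DSL with data-parameterized state machines and temporal operators, and all names below are invented:

```python
class LogMonitor:
    """Minimal log-analysis monitor in the spirit of a trace contract:
    every ("open", f) must eventually be matched by a ("close", f).
    Illustrative sketch only; not the TraceContract API."""
    def __init__(self):
        self.open_files = set()
        self.violations = []

    def event(self, kind, name):
        """Feed one log event into the monitor."""
        if kind == "open":
            self.open_files.add(name)
        elif kind == "close":
            if name in self.open_files:
                self.open_files.discard(name)
            else:
                self.violations.append(("close-without-open", name))

    def end(self):
        """End of trace: unmatched opens become violations."""
        self.violations += [("never-closed", n) for n in sorted(self.open_files)]
        return self.violations

m = LogMonitor()
for e in [("open", "a"), ("open", "b"), ("close", "a"), ("close", "c")]:
    m.event(*e)
assert m.end() == [("close-without-open", "c"), ("never-closed", "b")]
```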

  17. Logic for Physicists

    NASA Astrophysics Data System (ADS)

    Pereyra, Nicolas A.

    2018-06-01

    This book gives a rigorous yet 'physics-focused' introduction to mathematical logic that is geared towards natural science majors. We present the science major with a robust introduction to logic, focusing on the specific knowledge and skills that will unavoidably be needed in calculus topics and natural science topics in general, rather than taking the philosophically oriented, foundations-of-mathematics approach commonly found in mathematical logic textbooks.

  18. On Mathematical Proving

    NASA Astrophysics Data System (ADS)

    Stefaneas, Petros; Vandoulakis, Ioannis M.

    2015-12-01

    This paper outlines a logical representation of certain aspects of the process of mathematical proving that are important from the point of view of Artificial Intelligence. Our starting-point is the concept of proof-event or proving, introduced by Goguen, instead of the traditional concept of mathematical proof. The reason behind this choice is that in contrast to the traditional static concept of mathematical proof, proof-events are understood as processes, which enables their use in Artificial Intelligence in such contexts, in which problem-solving procedures and strategies are studied. We represent proof-events as problem-centered spatio-temporal processes by means of the language of the calculus of events, which captures adequately certain temporal aspects of proof-events (i.e. that they have history and form sequences of proof-events evolving in time). Further, we suggest a "loose" semantics for the proof-events, by means of Kolmogorov's calculus of problems. Finally, we present the intended interpretations of our logical model in the fields of automated theorem-proving and Web-based collective proving.

  19. "Antelope": a hybrid-logic model checker for branching-time Boolean GRN analysis

    PubMed Central

    2011-01-01

    Background In Thomas' formalism for modeling gene regulatory networks (GRNs), branching time, where a state can have more than one possible future, plays a prominent role. By representing a certain degree of unpredictability, branching time can model several important phenomena, such as (a) asynchrony, (b) incompletely specified behavior, and (c) interaction with the environment. Introducing more than one possible future for a state, however, creates a difficulty for ordinary simulators, because infinitely many paths may appear, limiting ordinary simulators to statistical conclusions. Model checkers for branching time, by contrast, are able to prove properties in the presence of infinitely many paths. Results We have developed Antelope ("Analysis of Networks through TEmporal-LOgic sPEcifications", http://turing.iimas.unam.mx:8080/AntelopeWEB/), a model checker for analyzing and constructing Boolean GRNs. Currently, software systems for Boolean GRNs use branching time almost exclusively for asynchrony. Antelope, by contrast, also uses branching time for incompletely specified behavior and environment interaction. We show the usefulness of modeling these two phenomena in the development of a Boolean GRN of the Arabidopsis thaliana root stem cell niche. There are two obstacles to a direct approach when applying model checking to Boolean GRN analysis. First, ordinary model checkers normally only verify whether or not a given set of model states has a given property. In comparison, a model checker for Boolean GRNs is preferable if it reports the set of states having a desired property. Second, for efficiency, the expressiveness of many model checkers is limited, resulting in the inability to express some interesting properties of Boolean GRNs. 
Antelope tries to overcome these two drawbacks: Apart from reporting the set of all states having a given property, our model checker can express, at the expense of efficiency, some properties that ordinary model checkers (e.g., NuSMV) cannot. This additional expressiveness is achieved by employing a logic extending the standard Computation-Tree Logic (CTL) with hybrid-logic operators. Conclusions We illustrate the advantages of Antelope when (a) modeling incomplete networks and environment interaction, (b) exhibiting the set of all states having a given property, and (c) representing Boolean GRN properties with hybrid CTL. PMID:22192526
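
    A set-reporting check of the kind described, where the checker returns the set of all states with a property rather than a yes/no answer for a given state set, boils down to fixpoint computations over the transition relation. A toy explicit-state sketch of two CTL operators (unrelated to Antelope's actual hybrid-CTL implementation; the example network is invented):

```python
def EX(states, trans, target):
    """CTL EX: states with at least one successor in target."""
    return {s for s in states if trans.get(s, set()) & target}

def EF(states, trans, target):
    """CTL EF as a least fixpoint: states from which target is reachable.
    Returned as a set of states, in the set-reporting style."""
    reach = set(target)
    while True:
        new = reach | EX(states, trans, reach)
        if new == reach:
            return reach
        reach = new

# Toy 2-gene Boolean network: states are (g1, g2) activation tuples.
states = {(0, 0), (0, 1), (1, 0), (1, 1)}
trans = {(0, 0): {(0, 1)}, (0, 1): {(1, 1)},
         (1, 1): {(1, 1)}, (1, 0): {(0, 0)}}
fixpoint = {(1, 1)}
assert EF(states, trans, fixpoint) == states  # every state reaches the fixpoint
```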

  20. UML activity diagram swimlanes in logic controller design

    NASA Astrophysics Data System (ADS)

    Grobelny, Michał; Grobelna, Iwona

    2015-12-01

    Logic controller behavior can be specified using various techniques, including UML activity diagrams and control Petri nets. Each technique has its advantages and disadvantages. Applying both specification types in one project makes it possible to benefit from each. Additional elements of UML models make it possible to divide a specification into parts considered from different points of view (logic controller, user, or system). The paper introduces the idea of using UML activity diagrams with swimlanes to increase the understandability of design models.

  1. A single-layer platform for Boolean logic and arithmetic through DNA excision in mammalian cells

    PubMed Central

    Weinberg, Benjamin H.; Hang Pham, N. T.; Caraballo, Leidy D.; Lozanoski, Thomas; Engel, Adrien; Bhatia, Swapnil; Wong, Wilson W.

    2017-01-01

    Genetic circuits engineered for mammalian cells often require extensive fine-tuning to perform their intended functions. To overcome this problem, we present a generalizable biocomputing platform that can engineer genetic circuits which function in human cells with minimal optimization. We used our Boolean Logic and Arithmetic through DNA Excision (BLADE) platform to build more than 100 multi-input-multi-output circuits. We devised a quantitative metric to evaluate the performance of the circuits in human embryonic kidney and Jurkat T cells. Of 113 circuits analysed, 109 functioned (96.5%) with the correct specified behavior without any optimization. We used our platform to build a three-input, two-output Full Adder and six-input, one-output Boolean Logic Look Up Table. We also used BLADE to design circuits with temporal small molecule-mediated inducible control and circuits that incorporate CRISPR/Cas9 to regulate endogenous mammalian genes. PMID:28346402

  2. Ascertaining and Graphically Representing the Logical Structure of Japanese Essays

    NASA Astrophysics Data System (ADS)

    Ishioka, Tsunenori

    To more accurately assess the logical structure of Japanese essays, I have devised a technique that uses end-of-sentence modality and demonstrative pronouns referencing earlier paragraphs as new indicators of structure in addition to conjunctive expressions which have hitherto often used for Japanese as well as for European languages. It is hoped that this will yield better results because conjunctive expressions are intentionally avoided in Japanese. I applied this technique to the editorial and commentary (Yoroku) columns of the Mainichi Shimbun newspaper and used it to represent the structure and development of the arguments made by these articles in the form of constellation diagrams which are used in the field of statistics. As a result, I found that this graph is useful in that it enables the overall distribution to be ascertained, and allows the temporal changes in the logical structure of the data in question to be ascertained.

  3. Declarative Programming with Temporal Constraints, in the Language CG.

    PubMed

    Negreanu, Lorina

    2015-01-01

    Specifying and interpreting temporal constraints are key elements of knowledge representation and reasoning, with applications in temporal databases, agent programming, and ambient intelligence. We present and formally characterize the language CG, which tackles this issue. In CG, users are able to develop time-dependent programs, in a flexible and straightforward manner. Such programs can, in turn, be coupled with evolving environments, thus empowering users to control the environment's evolution. CG relies on a structure for storing temporal information, together with a dedicated query mechanism. Hence, we explore the computational complexity of our query satisfaction problem. We discuss previous implementation attempts of CG and introduce a novel prototype which relies on logic programming. Finally, we address the issue of consistency and correctness of CG program execution, using the Event-B modeling approach.

  4. Software Development Technologies for Reactive, Real-Time, and Hybrid Systems: Summary of Research

    NASA Technical Reports Server (NTRS)

    Manna, Zohar

    1998-01-01

    This research is directed towards the implementation of a comprehensive deductive-algorithmic environment (toolkit) for the development and verification of high assurance reactive systems, especially concurrent, real-time, and hybrid systems. For this, we have designed and implemented the STCP (Stanford Temporal Prover) verification system. Reactive systems have an ongoing interaction with their environment, and their computations are infinite sequences of states. A large number of systems can be seen as reactive systems, including hardware, concurrent programs, network protocols, and embedded systems. Temporal logic provides a convenient language for expressing properties of reactive systems. A temporal verification methodology provides procedures for proving that a given system satisfies a given temporal property. The research covered necessary theoretical foundations as well as implementation and application issues.

  5. Reasoning About Relations

    ERIC Educational Resources Information Center

    Goodwin, Geoffrey P.; Johnson-Laird, P. N.

    2005-01-01

    Inferences about spatial, temporal, and other relations are ubiquitous. This article presents a novel model-based theory of such reasoning. The theory depends on 5 principles. (a) The structure of mental models is iconic as far as possible. (b) The logical consequences of relations emerge from models constructed from the meanings of the relations…

  6. ECO LOGIC INTERNATIONAL GAS-PHASE CHEMICAL REDUCTION PROCESS - THE THERMAL DESORPTION UNIT - APPLICATIONS ANALYSIS REPORT

    EPA Science Inventory

    ELI ECO Logic International, Inc.'s Thermal Desorption Unit (TDU) is specifically designed for use with Eco Logic's Gas Phase Chemical Reduction Process. The technology uses an externally heated bath of molten tin in a hydrogen atmosphere to desorb hazardous organic compounds fro...

  7. Describing the What and Why of Students' Difficulties in Boolean Logic

    ERIC Educational Resources Information Center

    Herman, Geoffrey L.; Loui, Michael C.; Kaczmarczyk, Lisa; Zilles, Craig

    2012-01-01

    The ability to reason with formal logic is a foundational skill for computer scientists and computer engineers that scaffolds the abilities to design, debug, and optimize. By interviewing students about their understanding of propositional logic and their ability to translate from English specifications to Boolean expressions, we characterized…

  8. 77 FR 35107 - Petition for Waiver of Compliance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-12

    ... devices. CSX requests relief from 49 CFR 236.109 as it applies to variable timers within the program logic... program logic of the operating software. However, CSX notes that some microprocessor-based equipment have.../check sum/universal control number of the existing location specific application logic to the previously...

  9. Assessing host-specificity of Escherichia coli using a supervised learning logic-regression-based analysis of single nucleotide polymorphisms in intergenic regions.

    PubMed

    Zhi, Shuai; Li, Qiaozhi; Yasui, Yutaka; Edge, Thomas; Topp, Edward; Neumann, Norman F

    2015-11-01

    Host specificity in E. coli is widely debated. Herein, we used supervised learning logic-regression-based analysis of intergenic DNA sequence variability in E. coli in an attempt to identify single nucleotide polymorphism (SNP) biomarkers of E. coli that are associated with natural selection and evolution toward host specificity. Seven-hundred and eighty strains of E. coli were isolated from 15 different animal hosts. We utilized logic regression for analyzing DNA sequence data of three intergenic regions (flanked by the genes uspC-flhDC, csgBAC-csgDEFG, and asnS-ompF) to identify genetic biomarkers that could potentially discriminate E. coli based on host sources. Across 15 different animal hosts, logic regression successfully discriminated E. coli based on animal host source with relatively high specificity (i.e., among the samples of the non-target animal host, the proportion that correctly did not have the host-specific marker pattern) and sensitivity (i.e., among the samples from a given animal host, the proportion that correctly had the host-specific marker pattern), even after fivefold cross validation. Permutation tests confirmed that for most animals, host specific intergenic biomarkers identified by logic regression in E. coli were significantly associated with animal host source. The highest level of biomarker sensitivity was observed in deer isolates, with 82% of all deer E. coli isolates displaying a unique SNP pattern that was 98% specific to deer. Fifty-three percent of human isolates displayed a unique biomarker pattern that was 98% specific to humans. Twenty-nine percent of cattle isolates displayed a unique biomarker that was 97% specific to cattle. Interestingly, even within a related host group (i.e., Family: Canidae [domestic dogs and coyotes]), highly specific SNP biomarkers (98% and 99% specificity for dog and coyotes, respectively) were observed, with 21% of dog E. 
coli isolates displaying a unique dog biomarker and 61% of coyote isolates displaying a unique coyote biomarker. Application of a supervised learning method, such as logic regression, to DNA sequence analysis at certain intergenic regions demonstrates that some E. coli strains may evolve to become host-specific. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. A Logic Programming Testbed for Inductive Thought and Specification.

    ERIC Educational Resources Information Center

    Neff, Norman D.

    This paper describes applications of logic programming technology to the teaching of the inductive method in computer science and mathematics. It discusses the nature of inductive thought and its place in those fields of inquiry, arguing that a complete logic programming system for supporting inductive inference is not only feasible but necessary.…

  11. A Scala DSL for RETE-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which at a closer look appear to be relevant for RV, although seemingly focused on slightly different application domains, such as for example business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual in arguing that such rule-based frameworks originating from AI may be suited for RV.
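
    The rule-based matching that a Rete network accelerates can be shown with a deliberately naive forward-chaining engine. Unlike Rete, the sketch below re-evaluates every rule from scratch on each new fact; it is illustrative only, with invented names, and is not LogFire's API:

```python
class RuleEngine:
    """Naive forward-chaining rule engine over a fact set -- the kind of
    condition/action matching a Rete network speeds up by caching partial
    matches (this sketch recomputes matches on every new fact)."""
    def __init__(self):
        self.facts, self.rules = set(), []

    def rule(self, fn):
        """Register a rule: a function from the fact set to derived facts."""
        self.rules.append(fn)
        return fn

    def add(self, fact):
        if fact in self.facts:
            return
        self.facts.add(fact)
        for r in list(self.rules):    # fire rules to a fixpoint
            for new in r(self.facts):
                self.add(new)

engine = RuleEngine()

@engine.rule
def close_requires_open(facts):
    # derive an error fact for any "close" without a matching "open"
    return {("error", f) for (kind, f) in facts
            if kind == "close" and ("open", f) not in facts}

engine.add(("open", "a"))
engine.add(("close", "a"))
engine.add(("close", "b"))
assert ("error", "b") in engine.facts and ("error", "a") not in engine.facts
```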

  12. Evaluation of a patient with suspected chronic demyelinating polyneuropathy.

    PubMed

    Jani-Acsadi, Agnes; Lewis, Richard A

    2013-01-01

    Demyelinating neuropathies are typically characterized by physiological slowing of conduction velocity and pathologically by segmental loss of myelin and in some instances, evidence of remyelination. Clinically, patients with demyelinating neuropathy can be seen with inherited disorders (Charcot-Marie-Tooth disease) or acquired disorders, typically immune-mediated or inflammatory. The acquired disorders can be either acute or subacute as seen in the acute inflammatory demyelinating polyneuropathy (AIDP) form of Guillain-Barré syndrome or chronic progressive or relapsing disorders such as chronic inflammatory demyelinating polyneuropathy. It is important to develop a logical approach to diagnosing these disorders. This requires an understanding of the clinical, genetic, physiological, and pathological features of these neuropathies. Clinically, important features to consider are the temporal progression, degree of symmetry, and involvement of proximal as well as distal muscles. Genetically, recognizing the different inheritance patterns and age of onset allow for a coordinated approach to determining a specific genotype. Physiologically, besides nerve conduction slowing, other physiological hallmarks of demyelination include temporal dispersion of compound motor action potentials (CMAP) on proximal stimulation, conduction block, and distal CMAP duration prolongation with certain patterns of involvement pointing to specific disorders. This chapter focuses on these various aspects of the evaluation of patients with chronic acquired demyelinating neuropathies to develop a comprehensive and thoughtful diagnostic concept. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. Coupling induced logical stochastic resonance

    NASA Astrophysics Data System (ADS)

    Aravind, Manaoj; Murali, K.; Sinha, Sudeshna

    2018-06-01

    In this work we demonstrate the following result: when two bistable sub-systems are coupled, each driven separately by an external logic input signal, the coupled system yields outputs that can be mapped to specific logic gate operations in a robust manner, within an optimal window of noise. Though each individual system receives only one logic input, the interplay of coupling, nonlinearity, and noise makes them cooperatively respond with a logic output that is a function of both inputs. Thus the emergent collective response of the system, due to the inherent coupling, in the presence of a noise floor, maps consistently to the logic output of the two inputs, a phenomenon we term coupling-induced Logical Stochastic Resonance. Lastly, we demonstrate the idea in proof-of-principle circuit experiments.
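
    The setting can be explored numerically with a toy Euler-Maruyama simulation of two coupled overdamped bistable units (all parameter values below are illustrative choices of ours, not those of the cited paper or its circuit):

```python
import random

# Two coupled bistable units, each biased by one logic input:
#   x' = x - x^3 + c*(y - x) + I1 + noise
#   y' = y - y^3 + c*(x - y) + I2 + noise
def simulate(I1, I2, c=0.4, D=0.3, dt=0.01, steps=20000, seed=1):
    rng = random.Random(seed)
    x = y = 0.0
    for _ in range(steps):
        nx = x + dt * (x - x**3 + c * (y - x) + I1) + (D * dt) ** 0.5 * rng.gauss(0, 1)
        ny = y + dt * (y - y**3 + c * (x - y) + I2) + (D * dt) ** 0.5 * rng.gauss(0, 1)
        x, y = nx, ny
    return x, y

# Logic inputs 0/1 mapped to bias levels -0.3/+0.3; the potential well the
# unit settles into (sign of the final state) encodes the logic output.
x, y = simulate(0.3, 0.3)
```

    In the optimal noise window the wells visited by x and y track a gate of both inputs; checking the full truth table would require averaging over many noise realizations, so the sketch only exposes one trajectory endpoint.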

  14. Perceptual Systems in Reading--The Prediction of a Temporal Eye-Voice Span Constant. Paper.

    ERIC Educational Resources Information Center

    Geyer, John Jacob

    A study was conducted to delineate how perception occurs during oral reading. From an analysis of classical and modern research, a heuristic model was constructed which delineated the directly interacting systems postulated as functioning during oral reading. The model as outlined was differentiated logically into three major processing…

  15. Enhancing Television Literacy Skills among Preschool Children Through an Intervention Program in the Kindergarten.

    ERIC Educational Resources Information Center

    Tidhar, Chava E.

    1996-01-01

    A study of 150 preschoolers suggests that systematic teacher mediation can enhance children's interpretive skills of television material, such as the ability to: identify fantasy in relation to special effects; bridge temporal and logical gaps; identify elements of camera work and their visual implications; and make intelligent predictions based…

  16. Future Contingents, Freedom, and Foreknowledge

    ERIC Educational Resources Information Center

    Abouzahr, Mohammed S.

    2013-01-01

    This essay is a contribution to the new trend and old tradition of analyzing theological fatalism in light of its relationship to logical fatalism. All results pertain to branching temporal systems that use the A-theory and assume presentism. The project focuses on two kinds of views about branching time. One position is true futurism, which…

  17. Radiation Hardened Structured ASIC Platform with Compensation of Delay for Temperature and Voltage Variations for Multiple Redundant Temporal Voting Latch Technology

    NASA Technical Reports Server (NTRS)

    Ardalan, Sasan (Inventor)

    2018-01-01

    The invention relates to devices and methods of maintaining the current starved delay at a constant value across variations in voltage and temperature to increase the speed of operation of the sequential logic in the radiation hardened ASIC design.

  18. Diagnostic performance of optical coherence tomography ganglion cell--inner plexiform layer thickness measurements in early glaucoma.

    PubMed

    Mwanza, Jean-Claude; Budenz, Donald L; Godfrey, David G; Neelakantan, Arvind; Sayyad, Fouad E; Chang, Robert T; Lee, Richard K

    2014-04-01

    To evaluate the glaucoma diagnostic performance of ganglion cell-inner plexiform layer (GCIPL) parameters used individually and in combination with retinal nerve fiber layer (RNFL) or optic nerve head (ONH) parameters measured with Cirrus HD-OCT (Carl Zeiss Meditec, Inc, Dublin, CA). Prospective cross-sectional study. Fifty patients with early perimetric glaucoma and 49 age-matched healthy subjects. Three peripapillary RNFL and 3 macular GCIPL scans were obtained in 1 eye of each participant. A patient was considered glaucomatous if at least 2 of the 3 RNFL or GCIPL scans had the average or at least 1 sector measurement flagged at 1% to 5% or less than 1%. The diagnostic performance was determined for each GCIPL, RNFL, and ONH parameter as well as for binary or-logic and and-logic combinations of GCIPL with RNFL or ONH parameters. Sensitivity, specificity, positive likelihood ratio (PLR), and negative likelihood ratio (NLR). Among GCIPL parameters, the minimum had the best diagnostic performance (sensitivity, 82.0%; specificity, 87.8%; PLR, 6.69; and NLR, 0.21). Inferior quadrant was the best RNFL parameter (sensitivity, 74%; specificity, 95.9%; PLR, 18.13; and NLR, 0.27), as was rim area (sensitivity, 68%; specificity, 98%; PLR, 33.3; and NLR, 0.33) among ONH parameters. The or-logic combination of minimum GCIPL and average RNFL provided the overall best diagnostic performance (sensitivity, 94%; specificity, 85.7%; PLR, 6.58; and NLR, 0.07) as compared with the best RNFL, best ONH, and best and-logic combination (minimum GCIPL and inferior quadrant RNFL; sensitivity, 64%; specificity, 100%; PLR, infinity; and NLR, 0.36). The binary or-logic combination of minimum GCIPL and average RNFL or rim area provides better diagnostic performance than and-logic combinations or the best single GCIPL, RNFL, or ONH parameter. This finding may be clinically valuable for the diagnosis of early glaucoma. Copyright © 2014 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
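
    The sensitivity/specificity trade-off between or-logic and and-logic combinations can be reproduced on invented toy data (the subjects below are fabricated for illustration only, not the study's measurements):

```python
# Each subject: (has_glaucoma, test_A_positive, test_B_positive).
subjects = [
    (True, True, True), (True, True, False), (True, False, True), (True, False, False),
    (False, False, False), (False, True, False), (False, False, False), (False, False, False),
]

def performance(combine):
    tp = sum(g and combine(a, b) for g, a, b in subjects)
    fn = sum(g and not combine(a, b) for g, a, b in subjects)
    tn = sum((not g) and not combine(a, b) for g, a, b in subjects)
    fp = sum((not g) and combine(a, b) for g, a, b in subjects)
    return tp / (tp + fn), tn / (tn + fp)   # (sensitivity, specificity)

sens_or, spec_or = performance(lambda a, b: a or b)     # or-logic: more sensitive
sens_and, spec_and = performance(lambda a, b: a and b)  # and-logic: more specific
```

    On this toy cohort the or-logic rule catches more diseased subjects at the cost of one false positive, while the and-logic rule flags only the doubly positive subject, mirroring the pattern in the abstract.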

  19. Pecan Research and Outreach in New Mexico: Logic Model Development and Change in Communication Paradigms

    ERIC Educational Resources Information Center

    Sammis, Theodore W.; Shukla, Manoj K.; Mexal, John G.; Wang, Junming; Miller, David R.

    2013-01-01

    Universities develop strategic planning documents, and as part of that planning process, logic models are developed for specific programs within the university. This article examines the long-standing pecan program at New Mexico State University and the deficiencies and successes in the evolution of its logic model. The university's agricultural…

  20. A DNA Logic Gate Automaton for Detection of Rabies and Other Lyssaviruses.

    PubMed

    Vijayakumar, Pavithra; Macdonald, Joanne

    2017-07-05

    Immediate activation of biosensors is not always desirable, particularly if activation is due to non-specific interactions. Here we demonstrate the use of deoxyribozyme-based logic gate networks arranged into visual displays to precisely control activation of biosensors, and present a prototype molecular automaton able to discriminate between seven different genotypes of Lyssaviruses, including Rabies virus. The device uses novel mixed-base logic gates to enable detection of the large diversity of Lyssavirus sequence populations, while an ANDNOT logic gate prevents non-specific activation across genotypes. The resulting device provides a user-friendly, digital-like, but molecule-powered, dot-matrix text output for unequivocal results read-out, which is highly relevant for point-of-care applications. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Axioms for Obligation and Robustness with Temporal Logic

    NASA Astrophysics Data System (ADS)

    French, Tim; McCabe-Dansted, John C.; Reynolds, Mark

    RoCTL* was proposed to model and specify the robustness of reactive systems. RoCTL* extended CTL* with the addition of Obligatory and Robustly operators, which quantify over failure-free paths and paths with one more failure, respectively. This paper gives an axiomatisation for all the operators of RoCTL* with the exception of the Until operator; this fragment can express contrary-to-duty obligations similar to those of the full RoCTL* logic. We call this formal system NORA and give a completeness proof. We also consider the fragments of the language containing only path quantifiers (but where variables are dependent on histories). We examine semantic properties and potential axiomatisations for these fragments.

  2. An architecture for designing fuzzy logic controllers using neural networks

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1991-01-01

    Described here is an architecture for designing fuzzy controllers through a hierarchical process of control rule acquisition and the use of special classes of neural network learning techniques. A new method for learning to refine a fuzzy logic controller is introduced. A reinforcement learning technique is used in conjunction with a multi-layer neural network model of a fuzzy controller. The model learns by updating its prediction of the plant's behavior and is related to Sutton's Temporal Difference (TD) method. The method proposed here has the advantage of using the control knowledge of an experienced operator and fine-tuning it through the process of learning. The approach is applied to a cart-pole balancing system.

  3. A reinforcement learning-based architecture for fuzzy logic control

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1992-01-01

    This paper introduces a new method for learning to refine a rule-based fuzzy logic controller. A reinforcement learning technique is used in conjunction with a multilayer neural network model of a fuzzy controller. The approximate reasoning based intelligent control (ARIC) architecture proposed here learns by updating its prediction of the physical system's behavior and fine-tunes a control knowledge base. Its theory is related to Sutton's temporal difference (TD) method. Because ARIC has the advantage of using the control knowledge of an experienced operator and fine-tuning it through the process of learning, it learns faster than systems that train networks from scratch. The approach is applied to a cart-pole balancing system.
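
    The Sutton-style temporal difference update that this line of work builds on can be sketched generically (states and rewards below are invented; this is the textbook TD(0) rule, not the paper's neural implementation):

```python
# Generic TD(0) value-prediction update:
#   V(s) <- V(s) + alpha * (r + gamma * V(s') - V(s))
def td0(transitions, alpha=0.1, gamma=0.95):
    """transitions: list of (state, reward, next_state) tuples."""
    V = {}
    for s, r, s2 in transitions:
        td_error = r + gamma * V.get(s2, 0.0) - V.get(s, 0.0)
        V[s] = V.get(s, 0.0) + alpha * td_error
    return V

# One observed transition: the pole tilting toward a fall earns reward -1.
V = td0([("tilting", -1.0, "fallen")])
```

    In an ARIC-style architecture this scalar prediction error is what adjusts the network weights that encode the fuzzy control rules, rather than a lookup table as here.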

  4. Programmable and multiparameter DNA-based logic platform for cancer recognition and targeted therapy.

    PubMed

    You, Mingxu; Zhu, Guizhi; Chen, Tao; Donovan, Michael J; Tan, Weihong

    2015-01-21

    The specific inventory of molecules on diseased cell surfaces (e.g., cancer cells) provides clinicians an opportunity for accurate diagnosis and intervention. With the discovery of panels of cancer markers, carrying out analyses of multiple cell-surface markers is conceivable. As a trial to accomplish this, we have recently designed a DNA-based device that is capable of performing autonomous logic-based analysis of two or three cancer cell-surface markers. Combining the specific target-recognition properties of DNA aptamers with toehold-mediated strand displacement reactions, multicellular marker-based cancer analysis can be realized based on modular AND, OR, and NOT Boolean logic gates. Specifically, we report here a general approach for assembling these modular logic gates to execute programmable and higher-order profiling of multiple coexisting cell-surface markers, including several found on cancer cells, with the capacity to report a diagnostic signal and/or deliver targeted photodynamic therapy. The success of this strategy demonstrates the potential of DNA nanotechnology in facilitating targeted disease diagnosis and effective therapy.
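
    The modular AND/OR/NOT analysis of cell-surface markers can be mimicked in silico (the marker names and the gate below are hypothetical, and the strand-displacement chemistry is abstracted away entirely). A cell is represented simply by its set of surface markers:

```python
# Recursive evaluator for nested Boolean marker gates.
def eval_gate(expr, markers):
    if isinstance(expr, str):
        return expr in markers           # an aptamer recognizing its marker
    op, *args = expr
    if op == "and":
        return all(eval_gate(a, markers) for a in args)
    if op == "or":
        return any(eval_gate(a, markers) for a in args)
    if op == "not":
        return not eval_gate(args[0], markers)
    raise ValueError(op)

# Fire (report a diagnostic signal / release therapy) only for
# m1 AND (m2 OR m3) AND NOT m4:
gate = ("and", "m1", ("or", "m2", "m3"), ("not", "m4"))
target = eval_gate(gate, {"m1", "m3"})            # True
off_target = eval_gate(gate, {"m1", "m2", "m4"})  # False: m4 vetoes activation
```

    The NOT branch plays the same protective role as the abstract's design: a marker characteristic of healthy cells vetoes activation even when the positive markers are present.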

  5. Globally-Applicable Predictive Wildfire Model--a Temporal-Spatial GIS Based Risk Analysis Using Data Driven Fuzzy Logic Functions

    NASA Astrophysics Data System (ADS)

    van den Dool, G.

    2017-11-01

    This study (van den Dool, 2017) is a proof of concept for a global predictive wildfire model, in which the temporal-spatial characteristics of wildfires are placed in a Geographical Information System (GIS) and the risk analysis is based on data-driven fuzzy logic functions. The data sources used in this model are available as global datasets, subdivided into three pilot areas: North America (California/Nevada), Europe (Spain), and Asia (Mongolia), and are downscaled to the highest resolution (3 arc-seconds). The GIS is constructed around three themes: topography, fuel availability, and climate. From the topographical data, six derived sub-themes are created and converted to a fuzzy membership based on the catchment area statistics. The fuel availability score is a composite of four data layers: land cover, wood loads, biomass, and biovolumes. As input for the climatological sub-model, reanalysed daily averaged weather-related data is used, which is accumulated to a global weekly time-window (to account for the uncertainty within the climatological model) and forms the temporal component of the model. The final product is a wildfire risk score (from 0 to 1) by week, representing the average wildfire risk in an area. To compute the potential wildfire risk, the sub-models are combined using a Multi-Criteria Approach, and the model results are validated against the area under the Receiver Operating Characteristic curve.
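
    The fuzzification-plus-multi-criteria step can be sketched in a few lines (the membership breakpoints and weights below are invented for illustration, not the study's calibrated functions):

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership: rises on [a, b], falls on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def wildfire_risk(slope_deg, fuel_score, dryness_score, w=(0.25, 0.35, 0.40)):
    # slope is fuzzified; fuel and dryness sub-models already yield [0, 1] scores
    scores = (tri(slope_deg, 0, 20, 45), fuel_score, dryness_score)
    return sum(wi * s for wi, s in zip(w, scores))  # weighted multi-criteria mix

risk = wildfire_risk(slope_deg=20, fuel_score=0.8, dryness_score=0.5)
```

    With non-negative weights summing to 1 and sub-scores in [0, 1], the combined score stays in [0, 1], matching the 0-to-1 weekly risk score described in the abstract.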

  6. "Glitch Logic" and Applications to Computing and Information Security

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian; Katkoori, Srinivas

    2009-01-01

    This paper introduces a new method of information processing in digital systems and discusses its potential benefits for computing and information security. The new method exploits glitches caused by delays in logic circuits for carrying and processing information. Glitch processing is hidden to conventional logic analyses and undetectable by traditional reverse engineering techniques. It enables the creation of new logic design methods that allow for an additional controllable "glitch logic" processing layer embedded into conventional synchronous digital circuits as a hidden/covert information flow channel. The combination of synchronous logic with specific glitch logic design acting as an additional computing channel reduces the number of equivalent logic designs resulting from synthesis, thus implicitly reducing the possibility of modification and/or tampering with the design. The hidden information channel produced by the glitch logic can be used: 1) for covert computing/communication, 2) to prevent reverse engineering, tampering, and alteration of design, and 3) to act as a channel for information infiltration/exfiltration and propagation of viruses/spyware/Trojan horses.

  7. Contradictory Reasoning Network: An EEG and fMRI Study

    PubMed Central

    Thai, Ngoc Jade; Seri, Stefano; Rotshtein, Pia; Tecchio, Franca

    2014-01-01

    Contradiction is a cornerstone of human rationality, essential for everyday life and communication. We investigated electroencephalographic (EEG) and functional magnetic resonance imaging (fMRI) responses in separate recording sessions during contradictory judgments, using a logical structure based on categorical propositions of the Aristotelian Square of Opposition (ASoO). The use of ASoO propositions, while controlling for potential linguistic or semantic confounds, enabled us to observe the spatio-temporal unfolding of this contradictory reasoning. The processing started with the inversion of the logical operators, corresponding to right middle frontal gyrus (rMFG-BA11) activation, followed by identification of the contradictory statement, associated with activation in the right inferior frontal gyrus (rIFG-BA47). The right medial frontal gyrus (rMeFG, BA10) and anterior cingulate cortex (ACC, BA32) contributed to the later stages of the process. We observed a correlation between the delayed latency of the rBA11 response and the reaction time delay during inductive vs. deductive reasoning. This supports the notion that rBA11 is crucial for manipulating the logical operators. Slower processing time and stronger brain responses for inductive logic suggested that examples are easier to process than general principles and are more likely to simplify communication. PMID:24667491

  8. Contradictory reasoning network: an EEG and FMRI study.

    PubMed

    Porcaro, Camillo; Medaglia, Maria Teresa; Thai, Ngoc Jade; Seri, Stefano; Rotshtein, Pia; Tecchio, Franca

    2014-01-01

    Contradiction is a cornerstone of human rationality, essential for everyday life and communication. We investigated electroencephalographic (EEG) and functional magnetic resonance imaging (fMRI) responses in separate recording sessions during contradictory judgments, using a logical structure based on categorical propositions of the Aristotelian Square of Opposition (ASoO). The use of ASoO propositions, while controlling for potential linguistic or semantic confounds, enabled us to observe the spatio-temporal unfolding of this contradictory reasoning. The processing started with the inversion of the logical operators, corresponding to right middle frontal gyrus (rMFG-BA11) activation, followed by identification of the contradictory statement, associated with activation in the right inferior frontal gyrus (rIFG-BA47). The right medial frontal gyrus (rMeFG, BA10) and anterior cingulate cortex (ACC, BA32) contributed to the later stages of the process. We observed a correlation between the delayed latency of the rBA11 response and the reaction time delay during inductive vs. deductive reasoning. This supports the notion that rBA11 is crucial for manipulating the logical operators. Slower processing time and stronger brain responses for inductive logic suggested that examples are easier to process than general principles and are more likely to simplify communication.

  9. Mosaic, self-similarity logic, and biological attraction principles: three explanatory instruments in biology.

    PubMed

    Agnati, Luigi F; Baluska, Frantisek; Barlow, Peter W; Guidolin, Diego

    2009-11-01

    From a structural standpoint, living organisms are organized like a nest of Russian matryoshka dolls, in which structures are buried within one another. From a temporal point of view, this type of organization is the result of a history comprised of a set of time backcloths which have accompanied the passage of living matter from its origins up to the present day. The aim of the present paper is to indicate a possible course of this 'passage through time', and to suggest how today's complexity has been reached by living organisms. This investigation will employ three conceptual tools, namely the Mosaic, Self-Similarity Logic, and Biological Attraction principles. Self-Similarity Logic indicates the self-consistency by which elements of a living system interact, irrespective of the spatiotemporal level under consideration. The term Mosaic indicates how, from the same set of elements assembled according to different patterns, it is possible to arrive at completely different constructions: hence, each system becomes endowed with different emergent properties. The Biological Attraction principle states that there is an inherent drive for association and merging of compatible elements at all levels of biological complexity. By analogy with the gravitation law in physics, biological attraction is based on the evidence that each living organism creates an attractive field around itself. This field acts as a sphere of influence that actively attracts similar fields of other biological systems, thereby modifying salient features of the interacting organisms. Three specific organizational levels of living matter, namely the molecular, cellular, and supracellular levels, have been considered in order to analyse and illustrate the interpretative as well as the predictive roles of each of these three explanatory principles.

  10. Mosaic, Self-Similarity Logic, and Biological Attraction principles

    PubMed Central

    Baluška, František; Barlow, Peter W; Guidolin, Diego

    2009-01-01

    From a structural standpoint, living organisms are organized like a nest of Russian matryoshka dolls, in which structures are buried within one another. From a temporal point of view, this type of organization is the result of a history comprised of a set of time backcloths which have accompanied the passage of living matter from its origins up to the present day. The aim of the present paper is to indicate a possible course of this ‘passage through time’, and to suggest how today’s complexity has been reached by living organisms. This investigation will employ three conceptual tools, namely the Mosaic, Self-Similarity Logic, and Biological Attraction principles. Self-Similarity Logic indicates the self-consistency by which elements of a living system interact, irrespective of the spatiotemporal level under consideration. The term Mosaic indicates how, from the same set of elements assembled according to different patterns, it is possible to arrive at completely different constructions: hence, each system becomes endowed with different emergent properties. The Biological Attraction principle states that there is an inherent drive for association and merging of compatible elements at all levels of biological complexity. By analogy with the gravitation law in physics, biological attraction is based on the evidence that each living organism creates an attractive field around itself. This field acts as a sphere of influence that actively attracts similar fields of other biological systems, thereby modifying salient features of the interacting organisms. Three specific organizational levels of living matter, namely the molecular, cellular, and supracellular levels, have been considered in order to analyse and illustrate the interpretative as well as the predictive roles of each of these three explanatory principles. PMID:20195461

  11. Combining tabular, rule-based, and procedural knowledge in computer-based guidelines for childhood immunization.

    PubMed

    Miller, P L; Frawley, S J; Sayward, F G; Yasnoff, W A; Duncan, L; Fleming, D W

    1997-06-01

    IMM/Serve is a computer program which implements the clinical guidelines for childhood immunization. IMM/Serve accepts as input a child's immunization history. It then indicates which vaccinations are due and which vaccinations should be scheduled next. The clinical guidelines for immunization are quite complex and are modified quite frequently. As a result, it is important that IMM/Serve's knowledge be represented in a format that facilitates the maintenance of that knowledge as the field evolves over time. To achieve this goal, IMM/Serve uses four representations for different parts of its knowledge base: (1) Immunization forecasting parameters that specify the minimum ages and wait-intervals for each dose are stored in tabular form. (2) The clinical logic that determines which set of forecasting parameters applies for a particular patient in each vaccine series is represented using if-then rules. (3) The temporal logic that combines dates, ages, and intervals to calculate recommended dates, is expressed procedurally. (4) The screening logic that checks each previous dose for validity is performed using a decision table that combines minimum ages and wait intervals with a small amount of clinical logic. A knowledge maintenance tool, IMM/Def, has been developed to help maintain the rule-based logic. The paper describes the design of IMM/Serve and the rationale and role of the different forms of knowledge used.
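
    The mix of knowledge forms listed above can be combined in a compact sketch (the vaccine name, minimum age, and wait interval are invented placeholders, not the real immunization schedule): a parameter table, an if-then rule selecting which constraint applies, and procedural date arithmetic.

```python
from datetime import date, timedelta

# Tabular knowledge: forecasting parameters per vaccine (values invented).
PARAMS = {"DTP": {"min_age_days": 42, "wait_days": 28}}

def next_due(vaccine, birth, previous_doses):
    p = PARAMS[vaccine]
    age_based = birth + timedelta(days=p["min_age_days"])   # procedural temporal logic
    if not previous_doses:                                  # rule: dose 1 keyed to age
        return age_based
    interval_based = previous_doses[-1] + timedelta(days=p["wait_days"])
    return max(age_based, interval_based)                   # both constraints must hold

first = next_due("DTP", date(2024, 1, 1), [])
second = next_due("DTP", date(2024, 1, 1), [first])
```

    Separating the table from the rule logic mirrors the abstract's maintenance argument: when the schedule changes, only the tabular parameters need editing, not the procedural date arithmetic.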

  12. A Fuzzy Cognitive Model of aeolian instability across the South Texas Sandsheet

    NASA Astrophysics Data System (ADS)

    Houser, C.; Bishop, M. P.; Barrineau, C. P.

    2014-12-01

    Characterization of aeolian systems is complicated by rapidly changing surface-process regimes, spatio-temporal scale dependencies, and subjective interpretation of imagery and spatial data. This paper describes the development and application of analytical reasoning to quantify instability of an aeolian environment using scale-dependent information coupled with conceptual knowledge of process and feedback mechanisms. Specifically, a simple Fuzzy Cognitive Model (FCM) for aeolian landscape instability was developed that represents conceptual knowledge of key biophysical processes and feedbacks. Model inputs include satellite-derived surface biophysical and geomorphometric parameters. FCMs are a knowledge-based Artificial Intelligence (AI) technique that merges fuzzy logic and neural computing in which knowledge or concepts are structured as a web of relationships that is similar to both human reasoning and the human decision-making process. Given simple process-form relationships, the analytical reasoning model is able to map the influence of land management practices and the geomorphology of the inherited surface on aeolian instability within the South Texas Sandsheet. Results suggest that FCMs can be used to formalize process-form relationships and information integration analogous to human cognition with future iterations accounting for the spatial interactions and temporal lags across the sand sheets.
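
    A minimal FCM iteration illustrates the mechanism (the concepts and causal weights below are invented for this sketch, not the model's actual parameters): activation spreads along weighted causal edges and is squashed by a sigmoid until the map settles.

```python
import math

# Causal edges: (source concept, target concept) -> weight (invented values).
W = {
    ("grazing", "vegetation_loss"): 0.6,
    ("vegetation_loss", "instability"): 0.7,
    ("instability", "vegetation_loss"): 0.3,   # feedback: bare surfaces stay bare
}

def step(act):
    new = {}
    for concept in act:
        s = act[concept] + sum(act[src] * w
                               for (src, dst), w in W.items() if dst == concept)
        new[concept] = 1.0 / (1.0 + math.exp(-s))   # sigmoid squashing
    return new

state = {"grazing": 1.0, "vegetation_loss": 0.0, "instability": 0.0}
for _ in range(20):                                 # iterate toward a fixed point
    state = step(state)
```

    Driving the "grazing" concept high lets instability emerge through the vegetation-loss pathway, the kind of process-feedback reasoning the abstract describes encoding from expert knowledge.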

  13. Meeting the Deadline: Why, When and How

    NASA Technical Reports Server (NTRS)

    Dignum, Frank; Broersen, Jan; Dignum, Virginia; Meyer, John-Jules

    2004-01-01

    A normative system is defined as any set of interacting agents whose behavior can usefully be regarded as norm-directed. Most organizations, and more specifically institutions, fall under this definition. Interactions in these normative systems are regulated by normative templates that describe desired behavior in terms of deontic concepts (obligations, prohibitions, and permissions), deadlines, violations, and sanctions. Agreements between agents, and between an agent and the society, can then be specified by means of contracts. Contracts provide flexible but verifiable means to integrate society requirements and agent autonomy, and are an adequate means for the explicit specification of interactions. From the society perspective, it is important that these contracts adhere to the specifications described in the model of the organization. If we want to automate such verifications, we have to formalize the languages used for contracts and for the specification of organizations. The logic LCR is based on deontic temporal logic. LCR is an expressive language for describing interaction in multi-agent systems, including obligations with deadlines. Deadlines are important norms in most interactions between agents. Intuitively, a deadline states that an agent should perform an action before a certain point in time. The obligation to perform the action starts at the moment the deadline becomes active, e.g., when a contract is signed or approved. If the action is not performed in time, a violation of the deadline occurs. What measure has to be taken in this case can be specified independently. In this paper we investigate the deadline concept in more detail. The paper is organized as follows. Section 2 defines the variant of CTL we use. In section 3, we discuss the basic intuitions of deadlines. Section 4 presents a first intuitive formalization for deadlines. In section 5, we look at a more complex model for deadlines, trying to capture some more practical aspects. Finally, in section 6 we present issues for future work and our conclusions.
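
    The deadline intuition can be phrased operationally over a finite linear trace (the event names and the 10-tick window below are invented; LCR itself is a branching-time logic, which this linear check does not capture): an obligation is activated by a trigger event and violated if the required action does not occur within the window.

```python
# trace: list of (time, event) pairs in any order.
def deadline_violations(trace, trigger="sign", action="pay", window=10):
    """Return trigger times whose deadline passed without the action occurring."""
    return [t for t, e in trace
            if e == trigger
            and not any(e2 == action and t < t2 <= t + window
                        for t2, e2 in trace)]

trace = [(0, "sign"), (5, "pay"), (20, "sign")]
violated = deadline_violations(trace)   # the second contract is never paid
```

    Reporting the violation times rather than aborting matches the paper's view that the sanction for a missed deadline is specified independently of the deadline itself.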

  14. Synthesizing genetic sequential logic circuit with clock pulse generator.

    PubMed

    Chuang, Chia-Hua; Lin, Chun-Liang

    2014-05-28

    Rhythmic clocks occur widely in biological systems, where they control several aspects of cell physiology; different cell types run at different rhythmic frequencies. How to synthesize a specific clock signal is a preliminary but necessary step toward the future development of a biological computer. This paper presents a genetic sequential logic circuit with a clock pulse generator based on a synthesized genetic oscillator, which generates a consecutive clock signal whose frequency is an inverse integer multiple of that of the genetic oscillator. An analogous electronic waveform-shaping circuit is constructed by a series of genetic buffers to shape logic high/low levels of an oscillation input in a basic sinusoidal cycle and generate a pulse-width-modulated (PWM) output with various duty cycles. By controlling the threshold level of the genetic buffer, a genetic clock pulse signal with its frequency consistent with the genetic oscillator is synthesized. A synchronous genetic counter circuit based on the topology of the digital sequential logic circuit is triggered by the clock pulse to synthesize the clock signal with an inverse multiple frequency to the genetic oscillator. The function acts like a frequency divider in electronic circuits, which plays a key role in sequential logic circuits with specific operational frequencies. A cascaded genetic logic circuit generating clock pulse signals is proposed. Based on an analogous implementation of digital sequential logic circuits, genetic sequential logic circuits can be constructed by the proposed approach to generate various clock signals from an oscillation signal.
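
    The digital analogue of this frequency division is a mod-N counter (a behavioral sketch only, not a model of the genetic circuit): one output pulse is emitted per N input clock pulses, so f_out = f_in / N.

```python
def divide_by_n(n_input_pulses, n):
    """Mod-n counter: emit a 1 once every n input clock pulses."""
    count, out = 0, []
    for _ in range(n_input_pulses):
        count = (count + 1) % n
        out.append(1 if count == 0 else 0)   # pulse once per full cycle
    return out

pulses = divide_by_n(8, 4)   # divide-by-4 over 8 input pulses
```

    Cascading such counters, as the abstract does with genetic counter stages, multiplies the division ratios, yielding a family of derived clock frequencies from one oscillator.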

  15. Two-step digit-set-restricted modified signed-digit addition-subtraction algorithm and its optoelectronic implementation.

    PubMed

    Qian, F; Li, G; Ruan, H; Jing, H; Liu, L

    1999-09-10

    A novel, to our knowledge, two-step digit-set-restricted modified signed-digit (MSD) addition-subtraction algorithm is proposed. With the introduction of the reference digits, the operand words are mapped into an intermediate carry word with all digits restricted to the set {-1, 0} and an intermediate sum word with all digits restricted to the set {0, 1}, which can be summed to form the final result without carry generation. The operation can be performed in parallel by use of binary logic. An optical system that utilizes an electron-trapping device is suggested for accomplishing the required binary logic operations. By programming of the illumination of data arrays, any complex logic operations of multiple variables can be realized without additional temporal latency of the intermediate results. This technique has a high space-bandwidth product and signal-to-noise ratio. The main structure can be stacked to construct a compact optoelectronic MSD adder-subtracter.
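
    The carry-free property exploited in the final step can be checked on illustrative digit words (this sketch covers only the last summation, not the full two-step mapping with reference digits): once the intermediate carry word (digits in {-1, 0}) and intermediate sum word (digits in {0, 1}) are formed, their digit-wise sum stays within the MSD digit set {-1, 0, 1}, so no carry can propagate.

```python
def msd_value(digits):
    """Value of a modified signed-digit word, most significant digit first."""
    v = 0
    for d in digits:
        v = 2 * v + d
    return v

carry = [-1, 0, -1, 0]   # intermediate carry word: all digits in {-1, 0}
ssum  = [0, 1, 1, 1]     # intermediate sum word: all digits in {0, 1}
final = [c + s for c, s in zip(carry, ssum)]   # digit-wise, no carries needed
```

    Because each final digit is the sum of one digit from {-1, 0} and one from {0, 1}, it necessarily lies in {-1, 0, 1}, which is what makes the last step fully parallel.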

  16. A Logic Model for Evaluating the Academic Health Department.

    PubMed

    Erwin, Paul Campbell; McNeely, Clea S; Grubaugh, Julie H; Valentine, Jennifer; Miller, Mark D; Buchanan, Martha

    2016-01-01

    Academic Health Departments (AHDs) are collaborative partnerships between academic programs and practice settings. While case studies have informed our understanding of the development and activities of AHDs, there has been no formal published evaluation of AHDs, either singularly or collectively. Developing a framework for evaluating AHDs has potential to further aid our understanding of how these relationships may matter. In this article, we present a general theory of change, in the form of a logic model, for how AHDs impact public health at the community level. We then present a specific example of how the logic model has been customized for a specific AHD. Finally, we end with potential research questions on the AHD based on these concepts. We conclude that logic models are valuable tools, which can be used to assess the value and ultimate impact of the AHD.

  17. When the Future Feels Worse than the Past: A Temporal Inconsistency in Moral Judgment

    ERIC Educational Resources Information Center

    Caruso, Eugene M.

    2010-01-01

    Logically, an unethical behavior performed yesterday should also be unethical if performed tomorrow. However, the present studies suggest that the timing of a transgression has a systematic effect on people's beliefs about its moral acceptability. Because people's emotional reactions tend to be more extreme for future events than for past events,…

  18. tOWL: a temporal Web Ontology Language.

    PubMed

    Milea, Viorel; Frasincar, Flavius; Kaymak, Uzay

    2012-02-01

    Through its interoperability and reasoning capabilities, the Semantic Web opens a realm of possibilities for developing intelligent systems on the Web. The Web Ontology Language (OWL) is the most expressive standard language for modeling ontologies, the cornerstone of the Semantic Web. However, until now, no standard way of expressing time and time-dependent information in OWL has been provided. In this paper, we present a temporal extension of the very expressive fragment SHIN(D) of the OWL Description Logic language, resulting in the temporal OWL (tOWL) language. Through a layered approach, we introduce three extensions: 1) concrete domains, which allow the representation of restrictions using concrete domain binary predicates; 2) temporal representation, which introduces time points, relations between time points, intervals, and Allen's 13 interval relations into the language; and 3) timeslices/fluents, which implement a perdurantist view of individuals and allow for the representation of complex temporal aspects, such as process state transitions. We illustrate the expressiveness of the newly introduced language with an example from the financial domain.
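
    Allen's 13 interval relations mentioned above form a standard, mutually exclusive classification of how two intervals can relate; a procedural sketch makes the case analysis concrete (the function is ours; tOWL's actual description-logic encoding is far richer than this):

```python
def allen(a, b):
    """Classify intervals a, b = (start, end), start < end, into one of
    Allen's 13 relations."""
    (s1, e1), (s2, e2) = a, b
    if e1 < s2:  return "before"
    if e1 == s2: return "meets"
    if e2 < s1:  return "after"
    if e2 == s1: return "met-by"
    if s1 == s2 and e1 == e2: return "equal"
    if s1 == s2: return "starts" if e1 < e2 else "started-by"
    if e1 == e2: return "finishes" if s1 > s2 else "finished-by"
    if s2 < s1 and e1 < e2: return "during"
    if s1 < s2 and e2 < e1: return "contains"
    # only proper overlaps remain at this point
    return "overlaps" if s1 < s2 else "overlapped-by"
```

    Exactly one branch fires for any pair of proper intervals, which is what makes the relations usable as a jointly exhaustive, pairwise disjoint vocabulary for temporal reasoning.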

  19. Synthesizing Biomolecule-based Boolean Logic Gates

    PubMed Central

    Miyamoto, Takafumi; Razavi, Shiva; DeRose, Robert; Inoue, Takanari

    2012-01-01

    One fascinating recent avenue of study in the field of synthetic biology is the creation of biomolecule-based computers. The main components of a computing device consist of an arithmetic logic unit, the control unit, memory, and the input and output devices. Boolean logic gates are at the core of the operational machinery of these parts, hence to make biocomputers a reality, biomolecular logic gates become a necessity. Indeed, with the advent of more sophisticated biological tools, both nucleic acid- and protein-based logic systems have been generated. These devices function in the context of either test tubes or living cells and yield highly specific outputs given a set of inputs. In this review, we discuss various types of biomolecular logic gates that have been synthesized, with particular emphasis on recent developments that promise increased complexity of logic gate circuitry, improved computational speed, and potential clinical applications. PMID:23526588

  20. Logic models to predict continuous outputs based on binary inputs with an application to personalized cancer therapy

    PubMed Central

    Knijnenburg, Theo A.; Klau, Gunnar W.; Iorio, Francesco; Garnett, Mathew J.; McDermott, Ultan; Shmulevich, Ilya; Wessels, Lodewyk F. A.

    2016-01-01

    Mining large datasets using machine learning approaches often leads to models that are hard to interpret and not amenable to the generation of hypotheses that can be experimentally tested. We present ‘Logic Optimization for Binary Input to Continuous Output’ (LOBICO), a computational approach that infers small and easily interpretable logic models of binary input features that explain a continuous output variable. Applying LOBICO to a large cancer cell line panel, we find that logic combinations of multiple mutations are more predictive of drug response than single gene predictors. Importantly, we show that the use of the continuous information leads to robust and more accurate logic models. LOBICO implements the ability to uncover logic models around predefined operating points in terms of sensitivity and specificity. As such, it represents an important step towards practical application of interpretable logic models. PMID:27876821
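
    The core idea above, inferring a small Boolean formula over binary features that best explains a continuous response, can be illustrated with a brute-force search over one- and two-feature AND/OR formulas. LOBICO itself solves the optimization exactly via integer programming; the code below is only a simplified sketch with invented names, scoring each formula by the squared error of a two-mean predictor:

```python
from itertools import combinations

def fit_logic_model(X, y):
    """Search 1- and 2-feature AND/OR formulas over binary features,
    scoring each by squared error of a two-mean predictor (one mean
    where the formula is true, one where it is false)."""
    def score(mask):
        t = [yi for m, yi in zip(mask, y) if m]
        f = [yi for m, yi in zip(mask, y) if not m]
        if not t or not f:
            return float("inf")  # degenerate formula: no split
        mt, mf = sum(t) / len(t), sum(f) / len(f)
        return sum((yi - (mt if m else mf)) ** 2 for m, yi in zip(mask, y))
    n = len(X[0])
    cands = [(f"x{i}", [row[i] for row in X]) for i in range(n)]
    for i, j in combinations(range(n), 2):
        cands.append((f"x{i} AND x{j}", [r[i] and r[j] for r in X]))
        cands.append((f"x{i} OR x{j}", [r[i] or r[j] for r in X]))
    best = min(cands, key=lambda c: score(c[1]))
    return best[0], score(best[1])
```

    On a toy dataset where the response is high exactly when two mutations co-occur, the search recovers the AND of those two features.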

  1. Synthesizing biomolecule-based Boolean logic gates.

    PubMed

    Miyamoto, Takafumi; Razavi, Shiva; DeRose, Robert; Inoue, Takanari

    2013-02-15

    One fascinating recent avenue of study in the field of synthetic biology is the creation of biomolecule-based computers. The main components of a computing device consist of an arithmetic logic unit, the control unit, memory, and the input and output devices. Boolean logic gates are at the core of the operational machinery of these parts, and hence to make biocomputers a reality, biomolecular logic gates become a necessity. Indeed, with the advent of more sophisticated biological tools, both nucleic acid- and protein-based logic systems have been generated. These devices function in the context of either test tubes or living cells and yield highly specific outputs given a set of inputs. In this review, we discuss various types of biomolecular logic gates that have been synthesized, with particular emphasis on recent developments that promise increased complexity of logic gate circuitry, improved computational speed, and potential clinical applications.

  2. Logic models to predict continuous outputs based on binary inputs with an application to personalized cancer therapy.

    PubMed

    Knijnenburg, Theo A; Klau, Gunnar W; Iorio, Francesco; Garnett, Mathew J; McDermott, Ultan; Shmulevich, Ilya; Wessels, Lodewyk F A

    2016-11-23

    Mining large datasets using machine learning approaches often leads to models that are hard to interpret and not amenable to the generation of hypotheses that can be experimentally tested. We present 'Logic Optimization for Binary Input to Continuous Output' (LOBICO), a computational approach that infers small and easily interpretable logic models of binary input features that explain a continuous output variable. Applying LOBICO to a large cancer cell line panel, we find that logic combinations of multiple mutations are more predictive of drug response than single gene predictors. Importantly, we show that the use of the continuous information leads to robust and more accurate logic models. LOBICO implements the ability to uncover logic models around predefined operating points in terms of sensitivity and specificity. As such, it represents an important step towards practical application of interpretable logic models.

  3. A Logical Analysis of Quantum Voting Protocols

    NASA Astrophysics Data System (ADS)

    Rad, Soroush Rafiee; Shirinkalam, Elahe; Smets, Sonja

    2017-12-01

    In this paper we provide a logical analysis of the Quantum Voting Protocol for Anonymous Surveying as developed by Horoshko and Kilin in (Phys. Lett. A 375, 1172-1175, 2011). In particular we make use of the probabilistic logic of quantum programs as developed in (Int. J. Theor. Phys. 53, 3628-3647, 2014) to provide a formal specification of the protocol and to derive its correctness. Our analysis is part of a wider program on the application of quantum logics to the formal verification of protocols in quantum communication and quantum computation.

  4. Different effects of anterior temporal lobectomy and selective amygdalohippocampectomy on verbal memory performance of patients with epilepsy.

    PubMed

    Boucher, Olivier; Dagenais, Emmanuelle; Bouthillier, Alain; Nguyen, Dang Khoa; Rouleau, Isabelle

    2015-11-01

    The advantage of selective amygdalohippocampectomy (SAH) over anterior temporal lobectomy (ATL) for the treatment of temporal lobe epilepsy (TLE) remains controversial. Because ATL is more extensive and involves the lateral and medial parts of the temporal lobe, it may be predicted that its impact on memory is more important than SAH, which involves resection of medial temporal structures only. However, several studies do not support this assumption. Possible explanations include task-specific factors such as the extent of semantic and syntactic information to be memorized and failure to control for main confounders. We compared preoperative and postoperative memory performance of 13 patients who underwent SAH and 26 patients who underwent ATL, matched on side of surgery, IQ, age at seizure onset, and age at surgery. Memory function was assessed using the Logical Memory subtest from the Wechsler Memory Scales - 3rd edition (LM-WMS), the Rey Auditory Verbal Learning Test (RAVLT), the Digit Span subtest from the Wechsler Adult Intelligence Scale, and the Rey-Osterrieth Complex Figure Test. Repeated measures analyses of variance revealed opposite effects of SAH and ATL on the two verbal learning memory tests. On the immediate recall trial of the LM-WMS, performance deteriorated after ATL in comparison with that after SAH. By contrast, on the delayed recognition trial of the RAVLT, performance deteriorated after SAH compared with that after ATL. However, additional analyses revealed that the latter finding was only observed when surgery was conducted in the right hemisphere. No interaction effects were found on other memory outcomes. The results are congruent with the view that tasks involving rich semantic content and syntactical structure are more sensitive to the effects of lateral temporal cortex resection as compared with mesiotemporal resection. The findings highlight the importance of task selection in the assessment of memory in patients undergoing TLE surgery.

  5. Logic programming and metadata specifications

    NASA Technical Reports Server (NTRS)

    Lopez, Antonio M., Jr.; Saacks, Marguerite E.

    1992-01-01

    Artificial intelligence (AI) ideas and techniques are critical to the development of intelligent information systems that will be used to collect, manipulate, and retrieve the vast amounts of space data produced by 'Missions to Planet Earth.' Natural language processing, inference, and expert systems are at the core of this space application of AI. This paper presents logic programming as an AI tool that can support inference (the ability to draw conclusions from a set of complicated and interrelated facts). It reports on the use of logic programming in the study of metadata specifications for a small problem domain of airborne sensors, and the dataset characteristics and pointers that are needed for data access.

  6. Security Modeling and Correctness Proof Using Specware and Isabelle

    DTIC Science & Technology

    2008-12-01

    …the actual proving requires substantial knowledge and experience in logical calculus. … Formal Method, Theorem… … "…formal language and provides tools for proving those formulas in a logical calculus" [5]. We are demonstrating in this thesis that a specification in…

  7. Enzyme-based logic gates and circuits-analytical applications and interfacing with electronics.

    PubMed

    Katz, Evgeny; Poghossian, Arshak; Schöning, Michael J

    2017-01-01

    The paper is an overview of enzyme-based logic gates and their short circuits, with specific examples of Boolean AND and OR gates, and concatenated logic gates composed of multi-step enzyme-biocatalyzed reactions. Noise formation in the biocatalytic reactions and its decrease by adding a "filter" system, converting convex to sigmoid response function, are discussed. Despite the fact that the enzyme-based logic gates are primarily considered as components of future biomolecular computing systems, their biosensing applications are promising for immediate practical use. Analytical use of the enzyme logic systems in biomedical and forensic applications is discussed and exemplified with the logic analysis of biomarkers of various injuries, e.g., liver injury, and with analysis of biomarkers characteristic of different ethnicity found in blood samples on a crime scene. Interfacing of enzyme logic systems with modified electrodes and semiconductor devices is discussed, giving particular attention to the interfaces functionalized with signal-responsive materials. Future perspectives in the design of the biomolecular logic systems and their applications are discussed in the conclusion. Graphical Abstract Various applications and signal-transduction methods are reviewed for enzyme-based logic systems.
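
    The "filter" idea mentioned above, converting a convex gate response into a sigmoid one so that sub-threshold noise is suppressed, can be sketched numerically. The parameter values and function names below are illustrative, not taken from the paper:

```python
import math

def enzyme_and(a, b):
    # Idealized convex response of a two-input biocatalytic AND gate:
    # output grows with the product of the normalized substrate inputs.
    return a * b

def sigmoid_filter(x, threshold=0.5, steepness=12.0):
    # "Filter" step converting the convex response into a sigmoid one,
    # suppressing sub-threshold noise and sharpening the 0/1 decision.
    return 1.0 / (1.0 + math.exp(-steepness * (x - threshold)))
```

    Noisy partial inputs (e.g. 0.3 and 0.3) yield a filtered output near 0, while full inputs yield an output near 1, which is the digital behavior the filter is meant to restore.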

  8. Towards Timed Automata and Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Hutzler, G.; Klaudel, H.; Wang, D. Y.

    2004-01-01

    The design of reactive systems must comply with logical correctness (the system does what it is supposed to do) and timeliness (the system has to satisfy a set of temporal constraints) criteria. In this paper, we propose a global approach for the design of adaptive reactive systems, i.e., systems that dynamically adapt their architecture depending on the context. We use the timed automata formalism for the design of the agents' behavior. This allows the properties of the system (regarding logical correctness and timeliness) to be evaluated beforehand, thanks to model-checking and simulation techniques. This model is enhanced with tools that we developed for the automatic generation of code, making it possible to quickly produce a running multi-agent prototype that satisfies the properties of the model.
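
    A timed automaton pairs discrete transitions with clock guards and resets, which is how the timeliness criterion above gets checked. A minimal sketch of the formalism (class and method names invented for illustration):

```python
class TimedAutomaton:
    """Minimal timed-automaton sketch: each transition carries a clock
    guard (lo <= clock <= hi) and may reset the clock."""
    def __init__(self, initial):
        self.state, self.clock = initial, 0.0
        # (state, event) -> (guard_lo, guard_hi, reset, next_state)
        self.transitions = {}

    def add(self, state, event, lo, hi, reset, nxt):
        self.transitions[(state, event)] = (lo, hi, reset, nxt)

    def step(self, event, elapsed):
        """Let `elapsed` time pass, then try to take `event`."""
        self.clock += elapsed
        lo, hi, reset, nxt = self.transitions[(self.state, event)]
        if not (lo <= self.clock <= hi):
            return False  # timeliness violated: guard not satisfied
        if reset:
            self.clock = 0.0
        self.state = nxt
        return True
```

    A model checker would explore all reachable (state, clock-region) pairs rather than a single run, but the guard/reset mechanics are the same.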

  9. Intelligent approach for analysis of respiratory signals and oxygen saturation in the sleep apnea/hypopnea syndrome.

    PubMed

    Moret-Bonillo, Vicente; Alvarez-Estévez, Diego; Fernández-Leal, Angel; Hernández-Pereira, Elena

    2014-01-01

    This work deals with the development of an intelligent approach for clinical decision making in the diagnosis of the Sleep Apnea/Hypopnea Syndrome, SAHS, from the analysis of respiratory signals and oxygen saturation in arterial blood, SaO2. In order to accomplish this task, the proposed approach makes use of different artificial intelligence techniques and reasoning processes that are able to deal with imprecise data. These reasoning processes are based on fuzzy logic and on temporal analysis of the information. The developed approach also takes into account the possibility of artifacts in the monitored signals. Detection and characterization of signal artifacts allows detection of false positives. Identification of relevant diagnostic patterns and temporal correlation of events is performed through the implementation of temporal constraints.
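
    The combination of fuzzy memberships with temporal constraints can be sketched as follows, assuming a simple linear membership for desaturation severity and a fixed correlation window (thresholds and the 30 s window are illustrative, not the paper's):

```python
def desat_severity(drop):
    # Fuzzy membership for "significant SaO2 desaturation":
    # 0 below a 2% drop, 1 above a 4% drop, linear in between.
    return min(1.0, max(0.0, (drop - 2.0) / 2.0))

def apnea_evidence(event_time, desats, window=30.0):
    """Temporal correlation: evidence that an airflow-reduction event is
    a true apnea/hypopnea, taken as the strongest fuzzy desaturation
    following it within `window` seconds.

    `desats` is a list of (time, percent_drop) pairs."""
    return max((desat_severity(d) for t, d in desats
                if 0.0 <= t - event_time <= window), default=0.0)
```

    A desaturation far outside the window contributes no evidence, which is how the temporal constraint filters out uncorrelated SaO2 drops.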

  10. Intelligent Approach for Analysis of Respiratory Signals and Oxygen Saturation in the Sleep Apnea/Hypopnea Syndrome

    PubMed Central

    Moret-Bonillo, Vicente; Alvarez-Estévez, Diego; Fernández-Leal, Angel; Hernández-Pereira, Elena

    2014-01-01

    This work deals with the development of an intelligent approach for clinical decision making in the diagnosis of the Sleep Apnea/Hypopnea Syndrome, SAHS, from the analysis of respiratory signals and oxygen saturation in arterial blood, SaO2. In order to accomplish this task, the proposed approach makes use of different artificial intelligence techniques and reasoning processes that are able to deal with imprecise data. These reasoning processes are based on fuzzy logic and on temporal analysis of the information. The developed approach also takes into account the possibility of artifacts in the monitored signals. Detection and characterization of signal artifacts allows detection of false positives. Identification of relevant diagnostic patterns and temporal correlation of events is performed through the implementation of temporal constraints. PMID:25035712

  11. Programmable and Multiparameter DNA-Based Logic Platform For Cancer Recognition and Targeted Therapy

    PubMed Central

    2014-01-01

    The specific inventory of molecules on diseased cell surfaces (e.g., cancer cells) provides clinicians an opportunity for accurate diagnosis and intervention. With the discovery of panels of cancer markers, carrying out analyses of multiple cell-surface markers is conceivable. As a trial to accomplish this, we have recently designed a DNA-based device that is capable of performing autonomous logic-based analysis of two or three cancer cell-surface markers. Combining the specific target-recognition properties of DNA aptamers with toehold-mediated strand displacement reactions, multicellular marker-based cancer analysis can be realized based on modular AND, OR, and NOT Boolean logic gates. Specifically, we report here a general approach for assembling these modular logic gates to execute programmable and higher-order profiling of multiple coexisting cell-surface markers, including several found on cancer cells, with the capacity to report a diagnostic signal and/or deliver targeted photodynamic therapy. The success of this strategy demonstrates the potential of DNA nanotechnology in facilitating targeted disease diagnosis and effective therapy. PMID:25361164
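
    The modular gate composition described above can be mirrored in ordinary Boolean code. The profile below (marker names hypothetical) corresponds to "signal only if markers A and B are present and off-target marker C is absent":

```python
def AND(x, y):
    return x and y

def OR(x, y):
    return x or y

def NOT(x):
    return not x

def cancer_profile(A, B, C):
    # Illustrative three-marker profile in the spirit of the paper's
    # modular gates: (A AND B) AND NOT C.
    return AND(AND(A, B), NOT(C))
```

    In the DNA implementation, each gate is realized by aptamer binding and toehold-mediated strand displacement rather than by software, but the composed truth table is the same.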

  12. Synthesizing genetic sequential logic circuit with clock pulse generator

    PubMed Central

    2014-01-01

    Background Rhythmic clocks occur widely in biological systems, where they control several aspects of cell physiology, and different cell types exhibit various rhythmic frequencies. How to synthesize a specific clock signal is a preliminary but necessary step toward further development of a biological computer in the future. Results This paper presents a genetic sequential logic circuit with a clock pulse generator based on a synthesized genetic oscillator, which generates a consecutive clock signal whose frequency is an inverse integer multiple of that of the genetic oscillator. An analogous electronic waveform-shaping circuit is constructed by a series of genetic buffers to shape logic high/low levels of an oscillation input in a basic sinusoidal cycle and generate a pulse-width-modulated (PWM) output with various duty cycles. By controlling the threshold level of the genetic buffer, a genetic clock pulse signal with its frequency consistent with that of the genetic oscillator is synthesized. A synchronous genetic counter circuit based on the topology of the digital sequential logic circuit is triggered by the clock pulse to synthesize the clock signal with an inverse multiple frequency of the genetic oscillator. The function acts like a frequency divider in electronic circuits, which plays a key role in sequential logic circuits with specific operational frequencies. Conclusions A cascaded genetic logic circuit generating clock pulse signals is proposed. Based on an analogous implementation of digital sequential logic circuits, genetic sequential logic circuits can be constructed by the proposed approach to generate various clock signals from an oscillation signal. PMID:24884665
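
    The electronic analogy is easy to sketch in software: threshold an oscillation into logic levels (the buffer), then toggle an output every n rising edges (the counter acting as a frequency divider). This is a stand-in for the genetic circuit, not a model of it:

```python
def clock_from_oscillator(samples, threshold=0.0):
    # Buffer analogue: threshold the oscillation into logic levels.
    return [1 if s > threshold else 0 for s in samples]

def divide_frequency(clock, n):
    """Synchronous-counter analogue of a frequency divider:
    toggle the output level every n rising edges of the input clock."""
    out, level, edges, prev = [], 0, 0, 0
    for c in clock:
        if prev == 0 and c == 1:  # rising edge
            edges += 1
            if edges % n == 0:
                level ^= 1        # toggle output
        out.append(level)
        prev = c
    return out
```

    Dividing by 2 halves the number of output transitions per input edge, exactly the behavior attributed to the genetic counter above.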

  13. Temporal Reasoning and Default Logics.

    DTIC Science & Technology

    1985-10-01

    …Artificial Intelligence", Computer Science Research Report, Yale University, forthcoming (1985). … Axioms for Describing Persistences and Clipping… "Circumscription - A Form of Non-Monotonic Reasoning", Artificial Intelligence, vol. 13 (1980), pp. 27-39. [13] McCarthy, John, "Applications of… …and P. J. Hayes, "Some philosophical problems from the standpoint of artificial intelligence", in: B. Meltzer and D. Michie (eds.), Machine…

  14. Extension of specification language for soundness and completeness of service workflow

    NASA Astrophysics Data System (ADS)

    Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn

    2018-05-01

    A Service Workflow is an aggregation of distributed services that fulfill specific functionalities. With ever more services available, methodologies for selecting services against given requirements have become main research subjects in multiple disciplines. A few researchers have contributed to formal specification languages and methods for model checking; however, existing methods have difficulty tackling the complexity of workflow compositions. In this paper, we propose to formalize the specification language to reduce the complexity of the workflow composition. To this end, we extend a specification language with the consideration of formal logic, so that effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is mandatory for the development of model checking. The application of the proposed SWSpec has been demonstrated by examples addressing soundness, completeness, and consistency.
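
    A toy stand-in for this kind of logic-based compliance checking: encode ordering requirements over a sequential workflow and verify them mechanically. SWSpec's actual syntax and inference rules are far richer than this sketch, and the service names are invented:

```python
def check_workflow(workflow, requires_before):
    """Illustrative soundness check for a sequential service workflow:
    every (a, b) pair in `requires_before` demands that service a
    appears, and appears before service b."""
    pos = {s: i for i, s in enumerate(workflow)}
    return all(a in pos and b in pos and pos[a] < pos[b]
               for a, b in requires_before)
```

    The payoff of formalization is exactly this mechanical step: a composition either satisfies the stated requirements or is rejected, with no manual inspection.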

  15. Syntax, concepts, and logic in the temporal dynamics of language comprehension: evidence from event-related potentials.

    PubMed

    Steinhauer, Karsten; Drury, John E; Portner, Paul; Walenski, Matthew; Ullman, Michael T

    2010-05-01

    Logic has been intertwined with the study of language and meaning since antiquity, and such connections persist in present day research in linguistic theory (formal semantics) and cognitive psychology (e.g., studies of human reasoning). However, few studies in cognitive neuroscience have addressed logical dimensions of sentence-level language processing, and none have directly compared these aspects of processing with syntax and lexical/conceptual-semantics. We used ERPs to examine a violation paradigm involving "Negative Polarity Items" or NPIs (e.g., ever/any), which are sensitive to logical/truth-conditional properties of the environments in which they occur (e.g., presence/absence of negation in: John hasn't ever been to Paris, versus: John has *ever been to Paris). Previous studies examining similar types of contrasts found a mix of effects on familiar ERP components (e.g., LAN, N400, P600). We argue that their experimental designs and/or analyses were incapable of separating which effects are connected to NPI-licensing violations proper. Our design enabled statistical analyses teasing apart genuine violation effects from independent effects tied solely to lexical/contextual factors. Here unlicensed NPIs elicited a late P600 followed in onset by a late left anterior negativity (or "L-LAN"), an ERP profile which has also appeared elsewhere in studies targeting logical semantics. Crucially, qualitatively distinct ERP-profiles emerged for syntactic and conceptual semantic violations which we also tested here. We discuss how these findings may be linked to previous findings in the ERP literature. Apart from methodological recommendations, we suggest that the study of logical semantics may aid in advancing our understanding of the underlying neurocognitive etiology of ERP components.

  16. SYNTAX, CONCEPTS, AND LOGIC IN THE TEMPORAL DYNAMICS OF LANGUAGE COMPREHENSION: EVIDENCE FROM EVENT RELATED POTENTIALS

    PubMed Central

    Steinhauer, Karsten; Drury, John E.; Portner, Paul; Walenski, Matthew; Ullman, Michael T.

    2010-01-01

    Logic has been intertwined with the study of language and meaning since antiquity, and such connections persist in present day research in linguistic theory (formal semantics) and cognitive psychology (e.g., studies of human reasoning). However, few studies in cognitive neuroscience have addressed logical dimensions of sentence-level language processing, and none have directly compared these aspects of processing with syntax and lexical/conceptual-semantics. We used ERPs to examine a violation paradigm involving “Negative Polarity Items” or NPIs (e.g., ever/any), which are sensitive to logical/truth-conditional properties of the environments in which they occur (e.g., presence/absence of negation in: John hasn’t ever been to Paris, versus: John has *ever been to Paris). Previous studies examining similar types of contrasts found a mix of effects on familiar ERP components (e.g., LAN, N400, P600). We argue that their experimental designs and/or analyses were incapable of separating which effects are connected to NPI-licensing violations proper. Our design enabled statistical analyses teasing apart genuine violation effects from independent effects tied solely to lexical/contextual factors. Here unlicensed NPIs elicited a late P600 followed in onset by a late left anterior negativity (or “L-LAN”), an ERP profile which has also appeared elsewhere in studies targeting logical semantics. Crucially, qualitatively distinct ERP-profiles emerged for syntactic and conceptual semantic violations which we also tested here. We discuss how these findings may be linked to previous findings in the ERP literature. Apart from methodological recommendations, we suggest that the study of logical semantics may aid advancing our understanding of the underlying neurocognitive etiology of ERP components. PMID:20138065

  17. Frontotemporal Dementia Selectively Impairs Transitive Reasoning About Familiar Spatial Environments

    PubMed Central

    Vartanian, Oshin; Goel, Vinod; Tierney, Michael; Huey, Edward D.; Grafman, Jordan

    2010-01-01

    Although patients with frontotemporal dementia (FTD) are known to exhibit a wide range of cognitive and personality difficulties, some evidence suggests that there may be a degree of selectivity in their reasoning impairments. Based on a recent review of the neuroimaging literature on reasoning, the authors hypothesized that the presence or absence of familiar content may have a selective impact on the reasoning abilities of patients with FTD. Specifically, the authors predicted that patients with frontal-variant FTD would be more impaired when reasoning about transitive arguments involving familiar spatial environments than when reasoning about identical logical arguments involving unfamiliar spatial environments. As predicted, patients with FTD were less accurate than normal controls only when the content of arguments involved familiar spatial environments. These results indicate a degree of selectivity in the cognitive deficits of this patient population and suggest that the frontal-temporal lobe system may play a necessary role in reasoning about familiar material. PMID:19702415

  18. Modal and Temporal Argumentation Networks

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Gabbay, Dov M.

    The traditional Dung networks depict arguments as atomic and study the relationships of attack between them. This can be generalised in two ways. One is to consider, for example, various forms of attack, support and feedback. Another is to add content to nodes and put there not just atomic arguments but more structure, for example, proofs in some logic or simply formulas from a richer language. This paper proposes using temporal and modal language formulas to represent arguments in the nodes of a network. The suitable semantics for such networks is Kripke semantics. We also introduce a new key concept of usability of an argument.

  19. Built-in-test by signature inspection (bitsi)

    DOEpatents

    Bergeson, Gary C.; Morneau, Richard A.

    1991-01-01

    A system and method for fault detection for electronic circuits. A stimulus generator sends a signal to the input of the circuit under test. Signature inspection logic compares the resultant signal from test nodes on the circuit to an expected signal. If the signals do not match, the signature inspection logic sends a signal to the control logic for indication of fault detection in the circuit. A data input multiplexer between the test nodes of the circuit under test and the signature inspection logic can provide for identification of the specific node at fault by the signature inspection logic. Control logic responsive to the signature inspection logic conveys information about fault detection for use in determining the condition of the circuit. When used in conjunction with a system test controller, the built-in test by signature inspection system and method can be used to poll a plurality of circuits automatically and continuously for faults and record the results of such polling in the system test controller.
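
    The signature-inspection principle can be sketched in software: compress each node's response stream into a compact signature and compare it with the expected value, reporting mismatching nodes. CRC-32 here is an illustrative stand-in for the hardware signature-analysis logic, and the node names are hypothetical:

```python
import zlib

def node_signature(samples):
    # Compress the node's response stream into a compact signature.
    return zlib.crc32(bytes(samples))

def inspect(nodes, expected):
    """Compare each test node's signature with its expected value and
    return the names of faulty nodes, illustrating the per-node fault
    identification enabled by the input multiplexer."""
    return [name for name, samples in nodes.items()
            if node_signature(samples) != expected[name]]
```

    Comparing a fixed-width signature instead of the full response stream is what makes continuous polling of many circuits cheap.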

  20. Timed Up and Go test, atrophy of medial temporal areas and cognitive functions in community-dwelling older adults with normal cognition and mild cognitive impairment.

    PubMed

    Kose, Yujiro; Ikenaga, Masahiro; Yamada, Yosuke; Morimura, Kazuhiro; Takeda, Noriko; Ouma, Shinji; Tsuboi, Yoshio; Yamada, Tatsuo; Kimura, Misaka; Kiyonaga, Akira; Higaki, Yasuki; Tanaka, Hiroaki

    2016-12-01

    This study aimed to ascertain if performance on the Timed Up and Go (TUG) test is associated with indicators of brain volume and cognitive functions among community-dwelling older adults with normal cognition or mild cognitive impairment. Participants were 80 community-dwelling older adults aged 65-89 years (44 men, 36 women), including 20 with mild cognitive impairment. Participants completed the TUG and a battery of cognitive assessments, including the Mini-Mental State Examination (MMSE), the Logical Memory I and II (LM-I, LM-II) subtests of the Wechsler Memory Scale-Revised, and the Trail Making Test A and B (TMT-A, TMT-B). Bilateral, right- and left-side medial temporal area atrophy as well as whole gray and white matter indices were determined with the Voxel-based Specific Regional Analysis System for Alzheimer's Disease. We divided participants into three groups based on TUG performance: "better" (≤6.9s); "normal" (7-10s); and "poor" (≥10.1s). Worse TMT-A and TMT-B performance showed significant independent associations with worse TUG performance (P<0.05, P<0.01 for trend, respectively). After adjusting for covariates, severe atrophy of bilateral, right-, and left-side medial temporal areas was significantly independently associated with worse TUG performance (P<0.05 for trend). However, no significant associations were found between MMSE, LM-I, LM-II, whole gray and white matter indices, and TUG performance. Worse TUG performance is related to poor performance on TMT-A and TMT-B, and is independently associated with severe medial temporal area atrophy in community-dwelling older adults.

  1. An Institutional Perspective on Accountable Care Organizations.

    PubMed

    Goodrick, Elizabeth; Reay, Trish

    2016-12-01

    We employ aspects of institutional theory to explore how Accountable Care Organizations (ACOs) can effectively manage the multiplicity of ideas and pressures within which they are embedded and consequently better serve patients and their communities. More specifically, we draw on the concept of institutional logics to highlight the importance of understanding the conflicting principles upon which ACOs were founded. Based on previous research conducted both inside and outside health care settings, we argue that ACOs can combine attention to these principles (or institutional logics) in different ways; the options fall on a continuum from (a) segregating the effects of multiple logics from each other by compartmentalizing responses to multiple logics to (b) fully hybridizing the different logics. We suggest that the most productive path for ACOs is to situate their approach between the two extremes of "segregating" and "fully hybridizing." This strategic approach allows ACOs to develop effective responses that combine logics without fully integrating them. We identify three ways that ACOs can embrace institutional complexity short of fully hybridizing disparate logics: (1) reinterpreting practices to make them compatible with other logics; (2) engaging in strategies that take advantage of existing synergy between conflicting logics; (3) creating opportunities for people at the front line to develop innovative ways of working that combine multiple logics.

  2. Biosensors with Built-In Biomolecular Logic Gates for Practical Applications

    PubMed Central

    Lai, Yu-Hsuan; Sun, Sin-Cih; Chuang, Min-Chieh

    2014-01-01

    Molecular logic gates, designs constructed with biological and chemical molecules, have emerged as an alternative computing approach to silicon-based logic operations. These molecular computers are capable of receiving and integrating multiple stimuli of biochemical significance to generate a definitive output, opening a new research avenue to advanced diagnostics and therapeutics which demand handling of complex factors and precise control. In molecularly gated devices, Boolean logic computations can be activated by specific inputs and accurately processed via bio-recognition, bio-catalysis, and selective chemical reactions. In this review, we survey recent advances of the molecular logic approaches to practical applications of biosensors, including designs constructed with proteins, enzymes, nucleic acids, nanomaterials, and organic compounds, as well as the research avenues for future development of digitally operating “sense and act” schemes that logically process biochemical signals through networked circuits to implement intelligent control systems. PMID:25587423

  3. Content-based intermedia synchronization

    NASA Astrophysics Data System (ADS)

    Oh, Dong-Young; Sampath-Kumar, Srihari; Rangan, P. Venkat

    1995-03-01

    Inter-media synchronization methods developed until now have been based on syntactic timestamping of video frames and audio samples. These methods are not fully appropriate for the synchronization of multimedia objects which may have to be accessed individually by their contents, e.g. content-based data retrieval. We propose a content-based multimedia synchronization scheme in which a media stream is viewed as a hierarchical composition of smaller objects which are logically structured based on their contents, and synchronization is achieved by deriving temporal relations among the logical units of a media object. Content-based synchronization offers several advantages, such as elimination of the need for time stamping, freedom from limitations of jitter, synchronization of independently captured media objects in video editing, and compensation for inherent asynchronies in capture times of video and audio.

  4. Human Action Recognition in Surveillance Videos using Abductive Reasoning on Linear Temporal Logic

    DTIC Science & Technology

    2012-08-29

    help of optical flows (Lucas and Kanade, 1981). 3.2 Atomic Propositions: isAt(ti, Oj, Lk) means object Oj is at location Lk at time ti ... an object cannot be simultaneously at two locations in the same frame. This can be represented mathematically as: isAt(ti, Oj, Lk) ∧ isAt(ti, Oj, Lm) → Lk = Lm
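    The uniqueness constraint stated in the excerpt (an object cannot be at two locations in the same frame) can be sketched as a check over time-stamped isAt facts; the fact format below is an illustrative assumption, not the report's representation.

```python
# Illustrative check of the atomic-proposition constraint from the report:
# isAt(t, O, Lk) and isAt(t, O, Lm) should imply Lk == Lm.

def violates_uniqueness(facts):
    """facts: iterable of (time, obj, location) isAt assertions.
    Returns True if some object is asserted at two locations at one time."""
    seen = {}
    for t, obj, loc in facts:
        key = (t, obj)
        if key in seen and seen[key] != loc:
            return True
        seen[key] = loc
    return False

ok = [(1, "O1", "L1"), (2, "O1", "L2"), (1, "O2", "L3")]
bad = ok + [(1, "O1", "L9")]   # O1 asserted at two locations at time 1
```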

  5. Logical operations using phenyl ring

    NASA Astrophysics Data System (ADS)

    Patra, Moumita; Maiti, Santanu K.

    2018-02-01

    Exploiting the effects of quantum interference, we put forward an idea for designing three primary logic gates, OR, AND and NOT, using a benzene molecule. Under a specific molecule-lead interface geometry, anti-resonant states appear which play a crucial role in the AND and NOT operations, while the OR gate requires no such states. Our analysis suggests the possibility of designing logic gates using a simple molecular structure, which might be significant in the area of molecular electronics.

  6. Fuzzy logic applications to control engineering

    NASA Astrophysics Data System (ADS)

    Langari, Reza

    1993-12-01

    This paper presents the results of a project presently under way at Texas A&M that focuses on the use of fuzzy logic in integrated control of manufacturing systems. The specific problems investigated here include diagnosis of critical tool wear in machining of metals via a neuro-fuzzy algorithm, as well as compensation of friction in mechanical positioning systems via an adaptive fuzzy logic algorithm. The results indicate that fuzzy logic, in conjunction with conventional algorithmic approaches or neural nets, can prove useful in dealing with the intricacies of control and monitoring of manufacturing systems, and can potentially play an active role in the multi-modal integrated control systems of the future.

  7. Peptide Logic Circuits Based on Chemoenzymatic Ligation for Programmable Cell Apoptosis.

    PubMed

    Li, Yong; Sun, Sujuan; Fan, Lin; Hu, Shanfang; Huang, Yan; Zhang, Ke; Nie, Zhou; Yao, Shouzhou

    2017-11-20

    A novel and versatile peptide-based bio-logic system capable of regulating cell function is developed using sortase A (SrtA), a peptide ligation enzyme, as a generic processor. By modular peptide design, we demonstrate that mammalian cell apoptosis can be programmed by peptide-based logic operations, including binary and combination gates (AND, INHIBIT, OR, and AND-INHIBIT), and a complex sequential logic circuit (a multi-input keypad lock). Moreover, a proof-of-concept peptide regulatory circuit was developed to analyze the expression profile of cell-secreted protein biomarkers and trigger cancer-cell-specific apoptosis. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Broadband slow light in one-dimensional logically combined photonic crystals.

    PubMed

    Alagappan, G; Png, C E

    2015-01-28

    Here, we demonstrate broadband slow light effects in a new family of one-dimensional photonic crystals, which are obtained by logically combining two photonic crystals of slightly different periods. The logical combination slowly destroys the original translational symmetries of the individual photonic crystals. Consequently, the Bloch modes of the individual photonic crystals with different wavevectors couple with each other, creating a vast number of slow modes. Specifically, we describe a photonic crystal architecture that results from a logical "OR" mixture of two one-dimensional photonic crystals with a period ratio of r = R/(R - 1), where R > 2 is an integer. Such a logically combined architecture exhibits a broad region of frequencies in which a dense number of slow modes with vanishing group velocities appear naturally as Bloch modes.
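    The symmetry-breaking effect of the "OR" combination can be illustrated numerically (a sketch of the lattice logic only, not the authors' electromagnetic model): OR-combining binary lattices of periods R and R-1 yields a pattern whose period stretches to lcm(R, R-1) = R(R-1).

```python
# Sketch (assumed binary unit-cell occupancy) of logically OR-combining
# two 1-D lattices with periods R and R-1; the result loses the short
# translational periods and repeats only after lcm(R, R-1) = R*(R-1).
from math import lcm

def lattice(period, n):
    # 1 marks a "high-index" cell at the start of each period
    return [1 if i % period == 0 else 0 for i in range(n)]

def combined_period(seq):
    """Smallest p such that seq is p-periodic over its whole length."""
    n = len(seq)
    for p in range(1, n):
        if all(seq[i] == seq[i % p] for i in range(n)):
            return p
    return n

R = 5
n = 2 * R * (R - 1)                       # two full combined periods
mix = [a | b for a, b in zip(lattice(R, n), lattice(R - 1, n))]
```

    The far longer combined period is what destroys the original translational symmetry and lets many Bloch modes couple.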

  9. Selection Shapes Transcriptional Logic and Regulatory Specialization in Genetic Networks.

    PubMed

    Fogelmark, Karl; Peterson, Carsten; Troein, Carl

    2016-01-01

    Living organisms need to regulate their gene expression in response to environmental signals and internal cues. This is a computational task where genes act as logic gates that connect to form transcriptional networks, which are shaped at all scales by evolution. Large-scale mutations such as gene duplications and deletions add and remove network components, whereas smaller mutations alter the connections between them. Selection determines what mutations are accepted, but its importance for shaping the resulting networks has been debated. To investigate the effects of selection in the shaping of transcriptional networks, we derive transcriptional logic from a combinatorially powerful yet tractable model of the binding between DNA and transcription factors. By evolving the resulting networks based on their ability to function as either a simple decision system or a circadian clock, we obtain information on the regulation and logic rules encoded in functional transcriptional networks. Comparisons are made between networks evolved for different functions, as well as with structurally equivalent but non-functional (neutrally evolved) networks, and predictions are validated against the transcriptional network of E. coli. We find that the logic rules governing gene expression depend on the function performed by the network. Unlike the decision systems, the circadian clocks show strong cooperative binding and negative regulation, which achieves tight temporal control of gene expression. Furthermore, we find that transcription factors act preferentially as either activators or repressors, both when binding multiple sites for a single target gene and globally in the transcriptional networks. This separation into positive and negative regulators requires gene duplications, which highlights the interplay between mutation and selection in shaping the transcriptional networks.

  10. A computational study of liposome logic: towards cellular computing from the bottom up

    PubMed Central

    Smaldon, James; Romero-Campero, Francisco J.; Fernández Trillo, Francisco; Gheorghe, Marian; Alexander, Cameron

    2010-01-01

    In this paper we propose a new bottom-up approach to cellular computing, in which computational chemical processes are encapsulated within liposomes. This “liposome logic” approach (also called vesicle computing) makes use of supra-molecular chemistry constructs, e.g. protocells, chells, etc., as minimal cellular platforms to which logical functionality can be added. Modelling and simulation feature prominently in “top-down” synthetic biology, particularly in the specification, design and implementation of logic circuits through bacterial genome reengineering. The second contribution of this paper is the demonstration of a novel set of tools for the specification, modelling and analysis of “bottom-up” liposome logic. In particular, simulation and modelling techniques are used to analyse some example liposome logic designs, ranging from relatively simple NOT and NAND gates to SR latches and D flip-flops, all the way to 3-bit ripple counters. The approach we propose consists of specifying, by means of P systems, gene regulatory network-like systems operating inside proto-membranes. This P systems specification can be automatically translated and executed through a multiscale pipeline composed of a dissipative particle dynamics (DPD) simulator and Gillespie’s stochastic simulation algorithm (SSA). Finally, model selection and analysis can be performed through a model checking phase. This is the first paper we are aware of that brings to bear formal specifications, DPD, SSA and model checking on the problem of modelling target computational functionality in protocells. Potential chemical routes for the laboratory implementation of these simulations are also discussed, suggesting for the first time a potentially realistic physicochemical implementation of membrane computing from the bottom up. PMID:21886681
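    The stochastic layer of such a pipeline, Gillespie's SSA, can be sketched for a minimal birth-death gene-expression model; the species, rates, and parameters below are illustrative, not the paper's liposome models.

```python
# Minimal Gillespie stochastic simulation algorithm (SSA) sketch for a
# birth-death process: a protein is produced at rate k_on and degraded
# at rate k_off * count. Illustrative parameters only.
import random

def gillespie(k_on=5.0, k_off=0.1, t_end=100.0, seed=42):
    rng = random.Random(seed)
    t, count, trace = 0.0, 0, []
    while t < t_end:
        a1, a2 = k_on, k_off * count     # reaction propensities
        a0 = a1 + a2
        t += rng.expovariate(a0)          # time to the next reaction
        if rng.random() * a0 < a1:        # pick which reaction fires
            count += 1                    # production
        else:
            count -= 1                    # degradation
        trace.append((t, count))
    return trace

trace = gillespie()   # copy number fluctuates around k_on/k_off = 50
```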

  11. Students’ logical-mathematical intelligence profile

    NASA Astrophysics Data System (ADS)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-04-01

    One of the students’ characteristics that plays an important role in learning mathematics is logical-mathematical intelligence. This study aims to identify the profile of students’ logical-mathematical intelligence in general and in each specific indicator, analyzed and described by students’ sex. The research used a qualitative method with a case-study strategy. The subjects were 29 students of 9th grade, selected by purposive sampling. The data comprise students’ logical-mathematical intelligence results and interviews. The results show that students’ logical-mathematical intelligence was at the moderate level, with an average score of 11.17 and 51.7% of students within that level. In addition, both male and female students are mostly at the moderate level. On the other hand, both male and female students’ logical-mathematical intelligence is strongly influenced by the indicator of the ability to classify and to understand patterns and relationships, while the ability to compare is the weakest indicator. Students’ logical-mathematical intelligence appears not yet optimal, because more than 50% of students were identified at the moderate or low level. Therefore, teachers need to design lessons that can improve students’ logical-mathematical intelligence, both in general and on each indicator.

  12. A CMake-Based Cross Platform Build System for Tcl/Tk

    DTIC Science & Technology

    2011-11-01

    expressing the logic for generating user-installable packages of the finished package. While specific compilation instructions are typically unique to each...Windows compilation. This presented a difficulty for the BRL-CAD project in that neither of these systems integrated well with BRL-CAD's own build...build files. 2. Implement enough of the Tcl/Tk-specific compilation macro logic in CMake to support build- (Twylite's Coffee project uses CMake to

  13. Graphene-based aptamer logic gates and their application to multiplex detection.

    PubMed

    Wang, Li; Zhu, Jinbo; Han, Lei; Jin, Lihua; Zhu, Chengzhou; Wang, Erkang; Dong, Shaojun

    2012-08-28

    In this work, a GO/aptamer system was constructed to create multiplex logic operations and enable sensing of multiple targets. 6-Carboxyfluorescein (FAM)-labeled adenosine triphosphate binding aptamer (ABA) and FAM-labeled thrombin binding aptamer (TBA) were first adsorbed onto graphene oxide (GO) to form a GO/aptamer complex, leading to quenching of the fluorescence of FAM. We demonstrated that the unique GO/aptamer interaction and the specific aptamer-target recognition in the target/GO/aptamer system were programmable and could be utilized to regulate the fluorescence of FAM via OR and INHIBIT logic gates. The fluorescence changed according to different input combinations, and the integration of OR and INHIBIT logic gates provided an interesting approach for logic sensing applications where multiple target molecules are present. High-throughput fluorescence imaging that enables the simultaneous processing of many samples using the combinatorial logic gates was realized. The developed logic gates may find application in the further development of DNA circuits and advanced sensors for the identification of multiple targets in complex chemical environments.
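    The combinatorial behavior described, OR and INHIBIT gates over two targets, can be written as truth tables; the binary inputs and outputs below are illustrative stand-ins for target presence and the fluorescence readout.

```python
# Illustrative truth tables for the two gates in the aptamer system.
# Inputs: presence (1) / absence (0) of the two targets; outputs stand
# in for the fluorescence state of each logic channel.

def OR(a, b):
    return int(a or b)

def INHIBIT(a, b):
    """Output 1 only when input a is present and inhibitor b is absent."""
    return int(a and not b)

# Full input space, as would be probed in a multiplex assay:
table = {(a, b): (OR(a, b), INHIBIT(a, b))
         for a in (0, 1) for b in (0, 1)}
```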

  14. Detecting Payload Attacks on Programmable Logic Controllers (PLCs)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Huan

    Programmable logic controllers (PLCs) play critical roles in industrial control systems (ICS). Providing hardware peripherals and firmware support for control programs (i.e., a PLC’s “payload”) written in languages such as ladder logic, PLCs directly receive sensor readings and control ICS physical processes. An attacker with access to PLC development software (e.g., by compromising an engineering workstation) can modify the payload program and cause severe physical damage to the ICS. To protect critical ICS infrastructure, we propose to model the runtime behaviors of legitimate PLC payload programs and use runtime behavior monitoring in PLC firmware to detect payload attacks. By monitoring the I/O access patterns, network access patterns, and payload program timing characteristics, our proposed firmware-level detection mechanism can detect abnormal runtime behaviors of malicious PLC payloads. Using our proof-of-concept implementation, we evaluate the memory and execution time overhead of implementing our proposed method and find that it is feasible to incorporate our method into existing PLC firmware. In addition, our evaluation results show that a wide variety of payload attacks can be effectively detected by our proposed approach. The proposed firmware-level payload attack detection scheme complements existing bump-in-the-wire solutions (e.g., external temporal-logic-based model checkers) in that it can detect payload attacks that violate real-time requirements of ICS operations and does not require any additional apparatus.
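    The firmware-level idea, checking runtime I/O accesses against a model of the legitimate payload, can be sketched abstractly; the profile format, addresses, and event tuples below are assumptions for illustration, not the authors' implementation.

```python
# Sketch of behavior-based payload monitoring (illustrative, not the
# paper's firmware): runtime I/O accesses are checked against a profile
# derived from the legitimate payload program.

LEGIT_PROFILE = {
    "read":  {"DI0", "DI1", "AI0"},   # inputs the payload may read
    "write": {"DO0", "DO1"},          # outputs it may write
}

def check_access(op, address, profile=LEGIT_PROFILE):
    """Return True if the access matches the legitimate profile."""
    return address in profile.get(op, set())

def scan(events, profile=LEGIT_PROFILE):
    """Flag runtime events that deviate from the profile."""
    return [e for e in events if not check_access(*e, profile=profile)]

events = [("read", "DI0"), ("write", "DO1"), ("write", "DO7")]
alerts = scan(events)     # only the unexpected write is flagged
```

    A real monitor would track timing and network patterns as well, but the allowlist comparison conveys the core mechanism.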

  15. Boolean Logic Tree of Label-Free Dual-Signal Electrochemical Aptasensor System for Biosensing, Three-State Logic Computation, and Keypad Lock Security Operation.

    PubMed

    Lu, Jiao Yang; Zhang, Xin Xing; Huang, Wei Tao; Zhu, Qiu Yan; Ding, Xue Zhi; Xia, Li Qiu; Luo, Hong Qun; Li, Nian Bing

    2017-09-19

    The most serious and as yet unsolved problems of molecular logic computing consist in how to connect molecular events in complex systems into a usable device with specific functions and how to selectively control branchy logic processes within cascading logic systems. This report demonstrates that a Boolean logic tree can be utilized to organize and connect "plug and play" chemical events (DNA, nanomaterials, organic dye, biomolecule, and denaturant) for developing a dual-signal electrochemical evolution aptasensor system with good resettability for amplified detection of thrombin, controllable and selectable three-state logic computation, and keypad lock security operation. The aptasensor system combines the merits of a DNA-functionalized nanoamplification architecture and the simple dual-signal electroactive dye brilliant cresyl blue for sensitive and selective detection of thrombin, with a wide linear response range of 0.02-100 nM and a detection limit of 1.92 pM. By using these aforementioned chemical events as inputs and the differential pulse voltammetry current changes at different voltages as dual outputs, a resettable three-input biomolecular keypad lock based on sequential logic is established. Moreover, the first example of controllable and selectable three-state molecular logic computation with active-high and active-low logic functions can be implemented, allowing the output ports to assume a high-impedance (Z) state in addition to the 0 and 1 logic levels and thereby effectively controlling subsequent branchy logic computation processes. Our approach is helpful in developing advanced controllable and selectable logic computing and sensing systems in large-scale integrated circuits for application in biomedical engineering, intelligent sensing, and control.
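    Two of the digital ideas in this abstract, a resettable sequential keypad lock and a three-state (0/1/Z) output, can be sketched in software; the secret sequence and values below are invented for illustration and have nothing to do with the chemistry.

```python
# Illustrative sketch of the digital concepts in the abstract: a
# resettable sequential keypad lock (inputs must arrive in order) and a
# three-state output {0, 1, "Z"} where Z (high impedance) disconnects
# the output from downstream logic.

SECRET = ["A", "B", "C"]          # required input order (invented)

class KeypadLock:
    def __init__(self):
        self.pos = 0              # progress through the secret sequence
    def press(self, key):
        if self.pos == len(SECRET):
            return True           # already open
        if key == SECRET[self.pos]:
            self.pos += 1
        else:
            self.pos = 0          # any wrong key resets the lock
        return self.pos == len(SECRET)

def tri_state(value, enabled):
    """Three-state output: 0/1 when enabled, high-impedance 'Z' otherwise."""
    return value if enabled else "Z"
```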

  16. An Argumentation Framework based on Paraconsistent Logic

    NASA Astrophysics Data System (ADS)

    Umeda, Yuichi; Takahashi, Takehisa; Sawamura, Hajime

    Argumentation is one of the most representative intelligent activities of humans. Therefore, it is natural to think that it could have many implications for artificial intelligence and computer science as well. Specifically, argumentation may be considered a most primitive capability for interaction among computational agents. In this paper we present an argumentation framework based on four-valued paraconsistent logic. The tolerance and acceptance of inconsistency that this logic has as its logical feature allow for arguments on the inconsistent knowledge bases with which we are often confronted. We introduce various concepts for argumentation, such as arguments, attack relations, argument justification, and preferential criteria for arguments based on social norms, in a way proper to four-valued paraconsistent logic. Then, we provide the fixpoint semantics and dialectical proof theory for our argumentation framework. We also give proofs of soundness and completeness.
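    The classic four-valued paraconsistent logic is Belnap's, with values true, false, both, and neither; the sketch below assumes a Belnap-style logic (the abstract does not name its four values) and shows how a contradiction is contained rather than explosive.

```python
# Sketch of Belnap-style four-valued logic (an assumption: the paper
# uses a four-valued paraconsistent logic, of which Belnap's is the
# classic example). A value is a pair (evidence-for, evidence-against).
T, F = (1, 0), (0, 1)
BOTH, NEITHER = (1, 1), (0, 0)     # contradictory / unknown

def neg(v):
    return (v[1], v[0])            # negation swaps the two evidences

def conj(a, b):
    # supported true iff both conjuncts are; supported false if either is
    return (a[0] & b[0], a[1] | b[1])

# Inconsistency is contained, not explosive: BOTH conjoined with T stays
# BOTH, instead of licensing arbitrary conclusions as in classical logic.
```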

  17. Using a logic model to relate the strategic to the tactical in program planning and evaluation: an illustration based on social norms interventions.

    PubMed

    Keller, Adrienne; Bauerle, Jennifer A

    2009-01-01

    Logic models are a ubiquitous tool for specifying the tactics--including implementation and evaluation--of interventions in the public health, health and social behaviors arenas. Similarly, social norms interventions are a common strategy, particularly in college settings, to address hazardous drinking and other dangerous or asocial behaviors. This paper illustrates an extension of logic models to include strategic as well as tactical components, using a specific example developed for social norms interventions. Placing the evaluation of projects within the context of this kind of logic model addresses issues related to the lack of a research design to evaluate effectiveness.

  18. An adaptive maneuvering logic computer program for the simulation of one-on-one air-to-air combat. Volume 1: General description

    NASA Technical Reports Server (NTRS)

    Burgin, G. H.; Fogel, L. J.; Phelps, J. P.

    1975-01-01

    A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch processing version, the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.

  19. An ABS control logic based on wheel force measurement

    NASA Astrophysics Data System (ADS)

    Capra, D.; Galvagno, E.; Ondrak, V.; van Leeuwen, B.; Vigliani, A.

    2012-12-01

    The paper presents an anti-lock braking system (ABS) control logic based on the measurement of the longitudinal forces at the hub bearings. The availability of force information allows the design of a logic that does not rely on estimation of the tyre-road friction coefficient, since it continuously tries to exploit the maximum longitudinal tyre force. The logic is designed by means of computer simulation and then tested on a specific hardware-in-the-loop test bench: the experimental results confirm that measured wheel force can lead to a significant improvement in ABS performance in terms of stopping distance, also in the presence of roads with a variable friction coefficient.

  20. Non-volatile logic gates based on planar Hall effect in magnetic films with two in-plane easy axes.

    PubMed

    Lee, Sangyeop; Bac, Seul-Ki; Choi, Seonghoon; Lee, Hakjoon; Yoo, Taehee; Lee, Sanghoon; Liu, Xinyu; Dobrowolska, M; Furdyna, Jacek K

    2017-04-25

    We discuss the use of planar Hall effect (PHE) in a ferromagnetic GaMnAs film with two in-plane easy axes as a means for achieving novel logic functionalities. We show that the switching of magnetization between the easy axes in a GaMnAs film depends strongly on the magnitude of the current flowing through the film due to thermal effects that modify its magnetic anisotropy. Planar Hall resistance in a GaMnAs film with two in-plane easy axes shows well-defined maxima and minima that can serve as two binary logic states. By choosing appropriate magnitudes of the input current for the GaMnAs Hall device, magnetic logic functions can then be achieved. Specifically, non-volatile logic functionalities such as AND, OR, NAND, and NOR gates can be obtained in such a device by selecting appropriate initial conditions. These results, involving a simple PHE device, hold promise for realizing programmable logic elements in magnetic electronics.

  1. Rewriting Logic Semantics of a Plan Execution Language

    NASA Technical Reports Server (NTRS)

    Dowek, Gilles; Munoz, Cesar A.; Rocha, Camilo

    2009-01-01

    The Plan Execution Interchange Language (PLEXIL) is a synchronous language developed by NASA to support autonomous spacecraft operations. In this paper, we propose a rewriting logic semantics of PLEXIL in Maude, a high-performance logical engine. The rewriting logic semantics is by itself a formal interpreter of the language and can be used as a semantic benchmark for the implementation of PLEXIL executives. The implementation in Maude has the additional benefit of making available to PLEXIL designers and developers all the formal analysis and verification tools provided by Maude. The formalization of the PLEXIL semantics in rewriting logic poses an interesting challenge due to the synchronous nature of the language and the prioritized rules defining its semantics. To overcome this difficulty, we propose a general procedure for simulating synchronous set relations in rewriting logic that is sound and, for deterministic relations, complete. We also report on the finding of two issues at the design level of the original PLEXIL semantics that were identified with the help of the executable specification in Maude.
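    The paper's central construction, simulating a synchronous set relation (all enabled rules fire in one step, subject to priorities) on top of an asynchronous engine, can be sketched operationally; the state, rules, and priorities below are invented for illustration and are not PLEXIL's semantics.

```python
# Sketch (invented rules) of the synchronous-step idea: collect every
# enabled rule, keep only the highest-priority ones, and apply their
# updates simultaneously, all reading the OLD state, to build the next.

def sync_step(state, rules):
    """rules: list of (priority, guard, update); lower number = higher priority."""
    enabled = [(p, u) for p, guard, u in rules if guard(state)]
    if not enabled:
        return state
    best = min(p for p, _ in enabled)
    new = dict(state)
    for p, update in enabled:
        if p == best:
            new.update(update(state))   # every update reads the old state
    return new

rules = [
    (1, lambda s: s["tick"], lambda s: {"a": s["a"] + 1}),
    (1, lambda s: s["tick"], lambda s: {"b": s["b"] + s["a"]}),
    (2, lambda s: True,      lambda s: {"a": 0}),   # preempted by priority 1
]
state = sync_step({"tick": True, "a": 1, "b": 0}, rules)
```

    Because both priority-1 updates read the pre-step value of `a`, the step is synchronous; applying them one after another against a mutating state would give a different (asynchronous) result.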

  2. Temporal abstraction for the analysis of intensive care information

    NASA Astrophysics Data System (ADS)

    Hadad, Alejandro J.; Evin, Diego A.; Drozdowicz, Bartolomé; Chiotti, Omar

    2007-11-01

    This paper proposes a scheme for the analysis of time-stamped series data from multiple monitoring devices in intensive care units, using Temporal Abstraction concepts. The scheme aims to obtain a description of the patient's state evolution in an unsupervised way. The case study is based on a dataset clinically classified with Pulmonary Edema. For this dataset, a trends-based Temporal Abstraction mechanism is proposed, by means of a Behaviours Base of time-stamped series, which is then used in a classification step. Combining this approach with the introduction of expert knowledge, using Fuzzy Logic, and multivariate analysis by means of Self-Organizing Maps, a state characterization model is obtained. This model can feasibly be extended to different patient groups and states. The proposed scheme makes it possible to obtain descriptions of the intermediate states through which the patient passes, which could be used to anticipate alert situations.

  3. A biomimetic colorimetric logic gate system based on multi-functional peptide-mediated gold nanoparticle assembly

    NASA Astrophysics Data System (ADS)

    Li, Yong; Li, Wang; He, Kai-Yu; Li, Pei; Huang, Yan; Nie, Zhou; Yao, Shou-Zhuo

    2016-04-01

    In natural biological systems, proteins exploit various functional peptide motifs to exert target response and activity switch, providing a functional and logic basis for complex cellular activities. Building biomimetic peptide-based bio-logic systems is highly intriguing but remains relatively unexplored due to limited logic recognition elements and complex signal outputs. In this proof-of-principle work, we attempted to address these problems by utilizing multi-functional peptide probes and the peptide-mediated nanoparticle assembly system. Here, the rationally designed peptide probes function as the dual-target responsive element specifically responsive to metal ions and enzymes as well as the mediator regulating the assembly of gold nanoparticles (AuNPs). Taking advantage of Zn2+ ions and chymotrypsin as the model inputs of metal ions and enzymes, respectively, we constructed the peptide logic system computed by the multi-functional peptide probes and outputted by the readable colour change of AuNPs. In this way, the representative binary basic logic gates (AND, OR, INHIBIT, NAND, IMPLICATION) have been achieved by delicately coding the peptide sequence, demonstrating the versatility of our logic system. Additionally, we demonstrated that the three-input combinational logic gate (INHIBIT-OR) could also be successfully integrated and applied as a multi-tasking biosensor for colorimetric detection of dual targets. 
This nanoparticle-based peptide logic system presents a valid strategy to illustrate peptide information processing and provides a practical platform for executing peptide computing or peptide-related multiplexing sensing, implying that controllable nanomaterial assembly is a promising and potent methodology for the advancement of biomimetic bio-logic computation. Electronic supplementary information (ESI) available: Additional figures (Tables S1-S3 and Fig. S1-S6). See DOI: 10.1039/c6nr01072e

  4. [Decision of mathematical logical tasks in sensory enriched environment (classical music)].

    PubMed

    Pavlygina, R A; Karamysheva, N N; Tutushkina, M V; Sakharov, D S; Davydov, V I

    2012-01-01

    The time needed to solve mathematical logical tasks (MLT) decreased with classical music accompaniment at 35 and 65 dB. Music at 85 dB did not influence the solving process. Solving without musical accompaniment led to increased coherence-function values in the beta1, beta2, and gamma frequency ranges in the EEG of occipital areas, with prevalence in the left hemisphere, while the coherence of potentials decreased in the EEG of the frontal cortex. Music that shortened decision time enhanced the left-sided EEG asymmetry. The intrahemispheric and interhemispheric coherences of the frontal cortex increased during the solving of MLT accompanied by music. Musical accompaniment at 85 dB produced a right-sided asymmetry in the EEG and formed a focus of coherent connections in the EEG of the temporal area of the right hemisphere.

  5. Faster Evolution of More Multifunctional Logic Circuits

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian; Zebulum, Ricardo

    2005-01-01

    A modification in a method of automated evolutionary synthesis of voltage-controlled multifunctional logic circuits makes it possible to synthesize more circuits in less time. Prior to the modification, the computations for synthesizing a four-function logic circuit by this method took about 10 hours. Using the method as modified, it is possible to synthesize a six-function circuit in less than half an hour. The concepts of automated evolutionary synthesis and voltage-controlled multifunctional logic circuits were described in a number of prior NASA Tech Briefs articles. To recapitulate: A circuit is designed to perform one of several different logic functions, depending on the value of an applied control voltage. The circuit design is synthesized following an automated evolutionary approach that is so named because it is modeled partly after the repetitive trial-and-error process of biological evolution. In this process, random populations of integer strings that encode electronic circuits play a role analogous to that of chromosomes. An evolved circuit is tested by computational simulation (prior to testing in real hardware to verify a final design). Then, in a fitness-evaluation step, responses of the circuit are compared with specifications of target responses and circuits are ranked according to how close they come to satisfying specifications. The results of the evaluation provide guidance for refining designs through further iteration.
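    The evolutionary loop described here (populations of integer strings encoding circuits, simulated, then ranked by closeness to target responses) can be sketched with a toy target; the genome encoding, fitness, and mutation scheme below are illustrative, not the NASA method.

```python
# Toy sketch of the evolutionary loop in the abstract: candidates are
# integer strings (here a 4-entry truth table), fitness is closeness to
# the target response (XOR), and the fittest candidates seed the next
# generation via point mutation. Illustrative only.
import random

TARGET = [0, 1, 1, 0]                       # XOR truth table

def fitness(genome):
    return sum(g == t for g, t in zip(genome, TARGET))

def evolve(pop_size=20, generations=50, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(4)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)       # rank by closeness
        if fitness(pop[0]) == len(TARGET):
            break                                 # target behavior reached
        parents = pop[: pop_size // 2]            # selection: keep fittest
        children = []
        for p in parents:
            child = p[:]
            child[rng.randrange(len(child))] ^= 1  # one-bit mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```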

  6. Business logic for geoprocessing of distributed geodata

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian

    2006-12-01

    This paper describes the development of a business-logic component for the geoprocessing of distributed geodata. The business logic acts as a mediator between the data and the user, therefore playing a central role in any spatial information system. The component is used in service-oriented architectures to foster the reuse of existing geodata inventories. Based on a geoscientific case study of groundwater vulnerability assessment and mapping, the demands for such architectures are identified with special regard to software engineering tasks. Methods are derived from the field of applied Geosciences (Hydrogeology), Geoinformatics, and Software Engineering. In addition to the development of a business logic component, a forthcoming Open Geospatial Consortium (OGC) specification is introduced: the OGC Web Processing Service (WPS) specification. A sample application is introduced to demonstrate the potential of WPS for future information systems. The sample application Geoservice Groundwater Vulnerability is described in detail to provide insight into the business logic component, and demonstrate how information can be generated out of distributed geodata. This has the potential to significantly accelerate the assessment and mapping of groundwater vulnerability. The presented concept is easily transferable to other geoscientific use cases dealing with distributed data inventories. Potential application fields include web-based geoinformation systems operating on distributed data (e.g. environmental planning systems, cadastral information systems, and others).

  7. Property Specification Patterns for intelligence building software

    NASA Astrophysics Data System (ADS)

    Chun, Seungsu

    2018-03-01

    In this paper, through research on property specification patterns for modal mu (μ) logic, we present a single framework based on patterns for intelligence building software. The study subdivides Dwyer's property specification pattern classification into state (S) and action (A) patterns, and further subdivides each into strong (A) and weak (E) variants. Based on this hierarchical pattern classification, an analysis of the modal mu (μ) logic aspects was applied to classify the example properties used in actual model checkers. As a result, the classification is not only more accurate than existing classification systems, but the specified properties were also easier to create and understand.
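    Dwyer-style specification patterns map onto temporal-logic templates; the sketch below uses standard CTL renderings (not the paper's mu-calculus formulas) to show how a pattern splits into a strong variant (A, over all paths) and a weak variant (E, over some path).

```python
# Sketch: textbook CTL templates for two Dwyer-style patterns, split
# into strong (A: all paths) and weak (E: some path) variants, echoing
# the classification in the abstract. These are standard CTL renderings,
# not the paper's mu-calculus formulas.

TEMPLATES = {
    ("absence", "strong"):   "AG(!{p})",   # p never holds, on all paths
    ("absence", "weak"):     "EG(!{p})",   # some path avoids p forever
    ("existence", "strong"): "AF({p})",    # p eventually holds on all paths
    ("existence", "weak"):   "EF({p})",    # p is reachable on some path
}

def render(pattern, strength, p):
    return TEMPLATES[(pattern, strength)].format(p=p)
```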

  8. Defining, illustrating and reflecting on logic analysis with an example from a professional development program.

    PubMed

    Tremblay, Marie-Claude; Brousselle, Astrid; Richard, Lucie; Beaudet, Nicole

    2013-10-01

    Program designers and evaluators should make a point of testing the validity of a program's intervention theory before investing either in implementation or in any type of evaluation. In this context, logic analysis can be a particularly useful option, since it can be used to test the plausibility of a program's intervention theory using scientific knowledge. Professional development in public health is one field among several that would truly benefit from logic analysis, as it appears to be generally lacking in theorization and evaluation. This article presents the application of this analysis method to an innovative public health professional development program, the Health Promotion Laboratory. More specifically, this paper aims to (1) define the logic analysis approach and differentiate it from similar evaluative methods; (2) illustrate the application of this method by a concrete example (logic analysis of a professional development program); and (3) reflect on the requirements of each phase of logic analysis, as well as on the advantages and disadvantages of such an evaluation method. Using logic analysis to evaluate the Health Promotion Laboratory showed that, generally speaking, the program's intervention theory appeared to have been well designed. By testing and critically discussing logic analysis, this article also contributes to further improving and clarifying the method. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. How Young Children Learn to Program with Sensor, Action, and Logic Blocks

    ERIC Educational Resources Information Center

    Wyeth, Peta

    2008-01-01

    Electronic Blocks are a new programming environment designed specifically for children aged between 3 and 8 years. These physical, stackable blocks include sensor blocks, action blocks, and logic blocks. By connecting these blocks, children can program a wide variety of structures that interact with one another and the environment. Electronic…

  10. A DNAzyme-mediated logic gate for programming molecular capture and release on DNA origami.

    PubMed

    Li, Feiran; Chen, Haorong; Pan, Jing; Cha, Tae-Gon; Medintz, Igor L; Choi, Jong Hyun

    2016-06-28

    Here we design a DNA origami-based site-specific molecular capture and release platform operated by a DNAzyme-mediated logic gate process. We show the programmability and versatility of this platform with small molecules, proteins, and nanoparticles, which may also be controlled by external light signals.

  11. Semi-Structured Interview Protocol for Constructing Logic Models

    ERIC Educational Resources Information Center

    Gugiu, P. Cristian; Rodriguez-Campos, Liliana

    2007-01-01

    This paper details a semi-structured interview protocol that evaluators can use to develop a logic model of a program's services and outcomes. The protocol presents a series of questions, which evaluators can ask of specific program informants, that are designed to: (1) identify key informants' basic background and contextual information, (2)…

  12. Development of a corn and soybean labeling procedure for use with profile parameter classification

    NASA Technical Reports Server (NTRS)

    Magness, E. R. (Principal Investigator)

    1982-01-01

    Some essential processes for the development of a green-number-based logic for identifying (labeling) crops in LANDSAT imagery are documented. The supporting data and subsequent conclusions that resulted from development of a specific labeling logic for corn and soybean crops in the United States are recorded.

  13. Logical Aspects of Question-Answering by Computer.

    ERIC Educational Resources Information Center

    Kuhns, J. L.

    The problem of computerized question-answering is discussed in this paper from the point of view of certain technical, although elementary, notions of logic. Although the work reported herein has general application to the design of information systems, it is specifically motivated by the RAND Relational Data File. This system, for which a…

  14. Multi-variants synthesis of Petri nets for FPGA devices

    NASA Astrophysics Data System (ADS)

    Bukowiec, Arkadiusz; Doligalski, Michał

    2015-09-01

    A new method is presented for the synthesis of application-specific logic controllers for FPGA devices. The control algorithm is specified with a control-interpreted Petri net (PT type), which makes it easy to specify parallel processes. The Petri net is decomposed into state-machine-type subnets, each representing one parallel process; algorithms for the coloring of Petri nets are applied for this purpose. Two approaches to this decomposition are presented: with doublers of macroplaces, or with one global wait place. Next, the subnets are implemented as a two-level logic circuit of the controller, whose levels are obtained by architectural decomposition: a first-level combinational circuit generates the next places, and a second-level decoder generates the output symbols. Two variants of these circuits are worked out: with one shared operational memory, or with many flexible distributed memories acting as the decoder. The variants of Petri net decomposition and of logic-circuit structure can be combined without restriction, which leads to four variants of multi-variant synthesis.
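
    The PT-net semantics underlying such controllers can be sketched in a few lines: a transition is enabled when all of its input places hold tokens, and firing moves tokens from inputs to outputs. The place and transition names below are illustrative assumptions, not taken from the paper:

```python
# Minimal place/transition (PT) net firing semantics. A join transition
# synchronizes two parallel processes, the situation the state-machine
# decomposition described above has to handle.

def enabled(marking, inputs):
    """A transition is enabled when every input place holds a token."""
    return all(marking.get(p, 0) > 0 for p in inputs)

def fire(marking, inputs, outputs):
    """Fire: consume one token per input place, produce one per output."""
    assert enabled(marking, inputs)
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

# Two parallel processes start at p1 and p2 and synchronize on t2.
marking = {"p1": 1, "p2": 1}
marking = fire(marking, inputs=["p1"], outputs=["p3"])        # t1
marking = fire(marking, inputs=["p2", "p3"], outputs=["p4"])  # t2: join
print(marking)  # {'p1': 0, 'p2': 0, 'p3': 0, 'p4': 1}
```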

  15. Logic programming to infer complex RNA expression patterns from RNA-seq data.

    PubMed

    Weirick, Tyler; Militello, Giuseppe; Ponomareva, Yuliya; John, David; Döring, Claudia; Dimmeler, Stefanie; Uchida, Shizuka

    2018-03-01

    To meet increasing demand in the field, numerous long noncoding RNA (lncRNA) databases are available. Given that many lncRNAs are expressed only in certain cell types and/or in a time-dependent manner, most lncRNA databases fall short of providing such profiles. We developed a strategy using logic programming to handle the complex organization of organs, their tissues and cell types, as well as gender and developmental time points. To showcase this strategy, we introduce 'RenalDB' (http://renaldb.uni-frankfurt.de), a database providing expression profiles of RNAs in major organs with a focus on kidney tissues and cells. RenalDB uses logic programming to describe complex anatomy, sample metadata and the logical relationships defining expression, enrichment or specificity. We validated the content of RenalDB with biological experiments and functionally characterized two long intergenic noncoding RNAs: LOC440173 is important for cell growth or cell survival, whereas PAXIP1-AS1 is a regulator of cell death. We anticipate that RenalDB will be used as a first step toward functional studies of lncRNAs in the kidney.
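
    The logic-programming idea — inference over an anatomical hierarchy — can be sketched as follows. The relations, names and the single observation here are hypothetical illustrations, not RenalDB's actual rule base:

```python
# Logic-programming-style inference over an anatomical hierarchy:
# an RNA observed in a cell type counts as expressed in every
# enclosing tissue and organ, via the transitive part_of relation.

part_of = {
    "podocyte": "glomerulus",
    "glomerulus": "kidney",
}

expressed_in = {("lncRNA-X", "podocyte")}  # hypothetical observation

def ancestors(part):
    """Walk the part_of chain upward."""
    while part in part_of:
        part = part_of[part]
        yield part

def is_expressed(rna, site):
    """True if observed directly at site, or in any sub-part of site."""
    for r, s in expressed_in:
        if r == rna and (s == site or site in ancestors(s)):
            return True
    return False

print(is_expressed("lncRNA-X", "kidney"))  # True: podocyte is part of kidney
print(is_expressed("lncRNA-X", "liver"))   # False
```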

  16. Challenges to understanding spatial patterns of disease: philosophical alternatives to logical positivism.

    PubMed

    Mayer, J D

    1992-08-01

    Most studies of disease distribution, in medical geography and other related disciplines, have been empirical in nature and rooted in the assumptions of logical positivism. However, some of the more newly articulated philosophies of the social sciences, and of social theory, have much to add in the understanding of the processes and mechanisms underlying disease distribution. This paper represents a plea for creative synthesis between logical positivism and realism or structuration, and uses specific examples to suggest how disease distribution, as a surface phenomenon, can be explained using deeper analysis.

  17. Multiferroic nanomagnetic logic: Hybrid spintronics-straintronic paradigm for ultra-low energy computing

    NASA Astrophysics Data System (ADS)

    Salehi Fashami, Mohammad

    Excessive energy dissipation in CMOS devices during switching is the primary threat to continued downscaling of computing devices in accordance with Moore's law. In the quest for alternatives to traditional transistor-based electronics, nanomagnet-based computing [1, 2] is emerging as an attractive alternative since: (i) nanomagnets are intrinsically more energy-efficient than transistors due to the correlated switching of spins [3], and (ii) unlike transistors, magnets have no leakage and hence no standby power dissipation. However, large energy dissipation in the clocking circuit appears to be a barrier to the realization of ultra-low-power logic devices with such nanomagnets. To alleviate this issue, we propose the use of a hybrid spintronics-straintronics, or straintronic nanomagnetic logic (SML), paradigm. This uses a piezoelectric layer elastically coupled to an elliptically shaped magnetostrictive nanomagnetic layer for logic [4-6], memory [7-8] and other information processing [9-10] applications that could potentially be 2-3 orders of magnitude more energy-efficient than current CMOS-based devices. This dissertation focuses on the feasibility, performance and reliability of such nanomagnetic logic circuits, studied by simulating the nanoscale magnetization dynamics of dipole-coupled nanomagnets clocked by stress. Specifically, the topics addressed are:

    1. Theoretical study of multiferroic nanomagnetic arrays laid out in specific geometric patterns to implement a "logic wire" for unidirectional information propagation and a universal logic gate [4-6].

    2. Monte Carlo simulations of the magnetization trajectories in a simple system of dipole-coupled nanomagnets and a NAND gate, described by the Landau-Lifshitz-Gilbert (LLG) equations simulated in the presence of random thermal noise, to understand the dynamic switching error [11, 12] in such devices.

    3. Arriving at a lower bound for energy dissipation as a function of switching error [13] for a practical nanomagnetic logic scheme.

    4. Clocking of nanomagnetic logic with surface acoustic waves (SAW) to drastically decrease the lithographic burden of contacting each multiferroic nanomagnet while maintaining pipelined information processing.

    5. Nanomagnets with four (or more) states implemented through shape engineering; two magnet types that encode four states, (i) diamond and (ii) concave nanomagnets, are studied for coherence of the switching process.

  18. Integration of Genetic Algorithms and Fuzzy Logic for Urban Growth Modeling

    NASA Astrophysics Data System (ADS)

    Foroutan, E.; Delavar, M. R.; Araabi, B. N.

    2012-07-01

    Urban growth is a spatio-temporally continuous process subject to inherent spatial uncertainty that cannot be fully addressed by conventional methods based on Boolean algebra. Fuzzy logic can be employed to overcome this limitation: it preserves the spatial continuity of dynamic urban growth through the choice of fuzzy membership functions, fuzzy rules and the fuzzification-defuzzification process. Fuzzy membership functions and fuzzy rule sets, the heart of fuzzy logic, are rather subjective and expert-dependent, and in the absence of a definite method for determining the membership function parameters, optimization is needed to tune them and improve the performance of the model. This paper integrates genetic algorithms and fuzzy logic as a genetic fuzzy system (GFS) for modeling dynamic urban growth. The proposed approach is applied to the Tehran Metropolitan Area in Iran. Historical land use/cover data of the Tehran Metropolitan Area extracted from the 1988 and 1999 Landsat ETM+ images are employed to simulate urban growth. The land use classes extracted for 1988 include urban areas, streets and vegetated areas; together with slope and elevation, these serve as the physical driving forces of urban growth. The Relative Operating Characteristic (ROC) curve is used as the fitness function to evaluate the performance of the GFS algorithm. The optimal membership function parameters are applied to generate a suitability map for urban growth. Comparing the suitability map with the real land use map of 1999 gives the threshold value for the best suitability map that can simulate the 1999 land use map. The simulation outcomes, a kappa of 89.13% and an overall map accuracy of 95.58%, demonstrate the efficiency and reliability of the proposed model.
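
    The kind of parameters the genetic algorithm tunes can be illustrated with a triangular membership function and a single fuzzy rule. The shapes, thresholds and the rule itself are assumptions for illustration, not the paper's calibrated model:

```python
# A triangular fuzzy membership function tri(x; a, b, c): its corner
# parameters (a, b, c) are exactly the kind of values a GA would tune.
# One illustrative rule combines two memberships by min (fuzzy AND).

def tri(x, a, b, c):
    """Triangular membership: rises on [a, b], falls on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical rule: a cell is suitable for urban growth where slope
# is "low" AND distance to existing urban area is "near".
def suitability(slope, dist):
    low_slope = tri(slope, -1, 0, 15)    # slope in degrees
    near_urban = tri(dist, -1, 0, 500)   # distance in metres
    return min(low_slope, near_urban)    # fuzzy AND

print(round(suitability(5, 100), 3))  # 0.667
```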

  19. Selection Shapes Transcriptional Logic and Regulatory Specialization in Genetic Networks

    PubMed Central

    Fogelmark, Karl; Peterson, Carsten; Troein, Carl

    2016-01-01

    Background Living organisms need to regulate their gene expression in response to environmental signals and internal cues. This is a computational task where genes act as logic gates that connect to form transcriptional networks, which are shaped at all scales by evolution. Large-scale mutations such as gene duplications and deletions add and remove network components, whereas smaller mutations alter the connections between them. Selection determines what mutations are accepted, but its importance for shaping the resulting networks has been debated. Methodology To investigate the effects of selection in the shaping of transcriptional networks, we derive transcriptional logic from a combinatorially powerful yet tractable model of the binding between DNA and transcription factors. By evolving the resulting networks based on their ability to function as either a simple decision system or a circadian clock, we obtain information on the regulation and logic rules encoded in functional transcriptional networks. Comparisons are made between networks evolved for different functions, as well as with structurally equivalent but non-functional (neutrally evolved) networks, and predictions are validated against the transcriptional network of E. coli. Principal Findings We find that the logic rules governing gene expression depend on the function performed by the network. Unlike the decision systems, the circadian clocks show strong cooperative binding and negative regulation, which achieves tight temporal control of gene expression. Furthermore, we find that transcription factors act preferentially as either activators or repressors, both when binding multiple sites for a single target gene and globally in the transcriptional networks. This separation into positive and negative regulators requires gene duplications, which highlights the interplay between mutation and selection in shaping the transcriptional networks. PMID:26927540

  20. Reflections on writing hydrologic reports

    USGS Publications Warehouse

    Olcott, Perry G.

    1987-01-01

    Reporting of scientific work should be characterized by a logical argument that is developed through presentation of the problem, tabulation and display of data pertinent to the problem, and testing and interpretation of the data to prove hypotheses that address the problem. Organization of the report is vital to developing this logical argument: it provides structure, continuity, logic, and emphasis to the presentation. Each part of the report serves a specific function and each is linked by a connecting logic, the logical argument of the report. Each scientific report normally has a title, table of contents, abstract, introduction, body (of the report), and summary and/or conclusions. Organization of sections within the body of the report is exactly parallel to overall organization; subjects presented in the section title are developed by logical subdivisions and pertinent discussion. The summary and/or conclusions section culminates the logical argument of the report by drawing together and quantitatively reiterating the principal conclusions developed in the discussion. Supplemental information on report content, background of the study, additional data or details on procedures, and other information of interest to the reader is presented in the foreword or preface, list of illustrations or tables, glossaries, and appendixes. (Lantz-PTT)

  1. Construction of high-dimensional universal quantum logic gates using a Λ system coupled with a whispering-gallery-mode microresonator.

    PubMed

    He, Ling Yan; Wang, Tie-Jun; Wang, Chuan

    2016-07-11

    A high-dimensional quantum system provides a higher quantum channel capacity, which exhibits potential applications in quantum information processing. However, high-dimensional universal quantum logic gates are difficult to achieve directly with only high-dimensional interactions between two quantum systems, and a large number of two-dimensional gates is required to build even small high-dimensional quantum circuits. In this paper, we propose a scheme to implement a general controlled-flip (CF) gate in which a high-dimensional single photon serves as the target qudit and stationary qubits work as the control logic, by employing a three-level Λ-type system coupled with a whispering-gallery-mode microresonator. In our scheme, the required number of interactions between the photon and the solid-state system is greatly reduced compared with the traditional method of decomposing the high-dimensional Hilbert space into two-dimensional quantum spaces, and the gate operates on a shorter temporal scale, favoring experimental realization. Moreover, we discuss the performance and feasibility of our hybrid CF gate, concluding that it can easily be extended to the 2n-dimensional case and is feasible with current technology.
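
    On computational basis states, a controlled-flip gate generalizes CNOT to a d-dimensional target: when the control is |1⟩ the target index is cyclically shifted, otherwise it is left alone. A rough behavioural sketch follows (the labeling convention is an illustrative assumption, not the paper's physical scheme):

```python
# Controlled-flip (CF) gate acting on basis states |control>|target>:
# for control == 1 the target index k maps to (k + 1) mod d, a
# d-dimensional generalization of the Pauli-X flip of a CNOT.

def cf_gate(control, target, d):
    """Apply the CF gate to a basis state of a qubit (x) qudit system."""
    if control == 1:
        target = (target + 1) % d
    return control, target

d = 4
print(cf_gate(0, 2, d))  # (0, 2): control |0>, target untouched
print(cf_gate(1, 3, d))  # (1, 0): control |1>, target flipped mod d
```

    For d = 2 this reduces to the ordinary CNOT, which is one way to see why simulating the d-dimensional gate with two-dimensional gates requires many of them.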

  2. MLM Builder: An Integrated Suite for Development and Maintenance of Arden Syntax Medical Logic Modules

    PubMed Central

    Sailors, R. Matthew

    1997-01-01

    The Arden Syntax specification for sharable computerized medical knowledge bases has not been widely utilized in the medical informatics community because of a lack of tools for developing Arden Syntax knowledge bases (Medical Logic Modules). The MLM Builder is a Microsoft Windows-hosted CASE (Computer Aided Software Engineering) tool designed to aid in the development and maintenance of Arden Syntax Medical Logic Modules (MLMs). The MLM Builder consists of the MLM Writer (an MLM generation tool), OSCAR (an anagram of Object-oriented ARden Syntax Compiler), a test database, and the MLManager (an MLM management information system). Working together, these components form a self-contained, unified development environment for the creation, testing, and maintenance of Arden Syntax Medical Logic Modules.

  3. Gender differences in post-temporal lobectomy verbal memory and relationships between MRI hippocampal volumes and preoperative verbal memory.

    PubMed

    Trenerry, M R; Jack, C R; Cascino, G D; Sharbrough, F W; Ivnik, R J

    1995-01-01

    Thirty-three men and 42 women who underwent left, and 26 men and 24 women who underwent right temporal lobectomy (TL) were studied retrospectively to determine if there were sex differences in (1) verbal memory outcome, and (2) relationships between verbal memory and magnetic resonance imaging (MRI) hippocampal volumes. All patients were left hemisphere language dominant. The surgical specimen and MRI were consistent only with mesial temporal sclerosis (MTS). Verbal memory was evaluated by Logical Memory percent retention (LMPER) from the Wechsler Memory Scale-Revised (WMS-R). Women experienced a significant improvement while men experienced a significant decline in postoperative LMPER. The difference between right and left hippocampal volumes predicted verbal memory outcome in both men and women. Preoperative LMPER was positively correlated with both the left and right hippocampal volumes in left TL women only. No verbal memory sex differences or correlations between LMPER and MRI data were found in the right TL group. The data support the presence of human neurocognitive sexual dimorphism. Verbal memory abilities supported by the hippocampus are less lateralized in women with left temporal lobe epilepsy and mesial temporal sclerosis. Women appear to have greater verbal memory plasticity following early left mesial temporal lobe insult.

  4. The right hemisphere's contribution to discourse processing: A study in temporal lobe epilepsy.

    PubMed

    Lomlomdjian, Carolina; Múnera, Claudia P; Low, Daniel M; Terpiluk, Verónica; Solís, Patricia; Abusamra, Valeria; Kochen, Silvia

    2017-08-01

    Discourse skills - in which the right hemisphere plays an important role - enable verbal communication by selecting contextually relevant information and integrating it coherently to infer the correct meaning. However, language research in epilepsy has focused on single-word analysis, related mainly to left-hemisphere processing. The purpose of this study was to investigate discourse abilities in patients with right lateralized medial temporal lobe epilepsy (RTLE) by comparing their performance with that of patients with left temporal lobe epilepsy (LTLE). Seventy-four pharmacoresistant temporal lobe epilepsy (TLE) patients were evaluated: 34 with RTLE and 40 with LTLE. Subjects underwent a battery of tests measuring comprehension and production of conversational and narrative discourse. Disease-related variables and general neuropsychological data were evaluated. The RTLE group presented deficits in interictal conversational and narrative discourse, with disintegrated speech, lack of categorization and misinterpretation of social meaning. The LTLE group, on the other hand, showed a tendency toward lower performance in logical-temporal sequencing. RTLE patients showed discourse deficits that have been described in right-hemisphere-damaged patients with other etiologies. Medial and anterior temporal lobe structures appear to link areas associated with semantics, world knowledge, and social cognition to construct a contextually coherent meaning. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Interspecific variation of warning calls in piranhas: a comparative analysis.

    PubMed

    Mélotte, Geoffrey; Vigouroux, Régis; Michel, Christian; Parmentier, Eric

    2016-10-26

    Fish sounds are known to be species-specific, possessing unique temporal and spectral features. We recorded and compared sounds in eight piranha species to evaluate the potential role of acoustic communication as a driving force in clade diversification. All piranha species showed the same kind of sound-producing mechanism: sonic muscles originate on vertebrae and attach to a tendon surrounding the swimbladder ventrally. Contractions of the sound-producing muscles force swimbladder vibration and dictate the fundamental frequency. As a result, the calling features of the eight piranha species logically share many common characteristics. In all species, the calls are harmonic sounds composed of multiple continuous cycles. However, the sounds of Serrasalmus elongatus (higher number of cycles and high fundamental frequency) and S. manueli (long cycle periods and low fundamental frequency) are clearly distinguishable from those of the other species. Since the sonic mechanism is largely conserved throughout piranha evolution, acoustic communication can hardly be considered the main driving force in the diversification process. However, the sounds of some species are clearly distinguishable despite the narrow scope for variation, supporting the need for species-specific communication. Behavioural studies are needed to clarify the possible role of the calls during spawning events.

  6. The GuideLine Interchange Format

    PubMed Central

    Ohno-Machado, Lucila; Gennari, John H.; Murphy, Shawn N.; Jain, Nilesh L.; Tu, Samson W.; Oliver, Diane E.; Pattison-Gordon, Edward; Greenes, Robert A.; Shortliffe, Edward H.; Barnett, G. Octo

    1998-01-01

    Objective: To allow exchange of clinical practice guidelines among institutions and computer-based applications. Design: The GuideLine Interchange Format (GLIF) specification consists of the GLIF model and the GLIF syntax. The GLIF model is an object-oriented representation that consists of a set of classes for guideline entities, attributes for those classes, and data types for the attribute values. The GLIF syntax specifies the format of the text file that contains the encoding. Methods: Researchers from the InterMed Collaboratory at Columbia University, Harvard University (Brigham and Women's Hospital and Massachusetts General Hospital), and Stanford University analyzed four existing guideline systems to derive a set of requirements for guideline representation. The GLIF specification is a consensus representation developed through a brainstorming process. Four clinical guidelines were encoded in GLIF to assess its expressivity and to study the variability that occurs when two people from different sites encode the same guideline. Results: The encoders reported that GLIF was adequately expressive. A comparison of the encodings revealed substantial variability. Conclusion: GLIF was sufficient to model the guidelines for the four conditions that were examined. GLIF needs improvement in standard representation of medical concepts, criterion logic, temporal information, and uncertainty. PMID:9670133

  7. Interspecific variation of warning calls in piranhas: a comparative analysis

    PubMed Central

    Mélotte, Geoffrey; Vigouroux, Régis; Michel, Christian; Parmentier, Eric

    2016-01-01

    Fish sounds are known to be species-specific, possessing unique temporal and spectral features. We recorded and compared sounds in eight piranha species to evaluate the potential role of acoustic communication as a driving force in clade diversification. All piranha species showed the same kind of sound-producing mechanism: sonic muscles originate on vertebrae and attach to a tendon surrounding the swimbladder ventrally. Contractions of the sound-producing muscles force swimbladder vibration and dictate the fundamental frequency. As a result, the calling features of the eight piranha species logically share many common characteristics. In all species, the calls are harmonic sounds composed of multiple continuous cycles. However, the sounds of Serrasalmus elongatus (higher number of cycles and high fundamental frequency) and S. manueli (long cycle periods and low fundamental frequency) are clearly distinguishable from those of the other species. Since the sonic mechanism is largely conserved throughout piranha evolution, acoustic communication can hardly be considered the main driving force in the diversification process. However, the sounds of some species are clearly distinguishable despite the narrow scope for variation, supporting the need for species-specific communication. Behavioural studies are needed to clarify the possible role of the calls during spawning events. PMID:27782184

  8. The music of morality and logic.

    PubMed

    Mesz, Bruno; Rodriguez Zivic, Pablo H; Cecchi, Guillermo A; Sigman, Mariano; Trevisan, Marcos A

    2015-01-01

    Musical theory has built on the premise that musical structures can refer to something different from themselves (Nattiez and Abbate, 1990). The aim of this work is to statistically corroborate the intuitions of musical thinkers and practitioners starting at least with Plato, that music can express complex human concepts beyond merely "happy" and "sad" (Mattheson and Lenneberg, 1958). To do so, we ask whether musical improvisations can be used to classify the semantic category of the word that triggers them. We investigated two specific domains of semantics: morality and logic. While morality has been historically associated with music, logic concepts, which involve more abstract forms of thought, are more rarely associated with music. We examined musical improvisations inspired by positive and negative morality (e.g., good and evil) and logic concepts (true and false), analyzing the associations between these words and their musical representations in terms of acoustic and perceptual features. We found that music conveys information about valence (good and true vs. evil and false) with remarkable consistency across individuals. This information is carried by several musical dimensions which act in synergy to achieve very high classification accuracy. Positive concepts are represented by music with more ordered pitch structure and lower harmonic and sensorial dissonance than negative concepts. Music also conveys information indicating whether the word which triggered it belongs to the domains of logic or morality (true vs. good), principally through musical articulation. In summary, improvisations consistently map logic and morality information to specific musical dimensions, testifying to the capacity of music to accurately convey semantic information in domains related to abstract forms of thought.

  9. The music of morality and logic

    PubMed Central

    Mesz, Bruno; Rodriguez Zivic, Pablo H.; Cecchi, Guillermo A.; Sigman, Mariano; Trevisan, Marcos A.

    2015-01-01

    Musical theory has built on the premise that musical structures can refer to something different from themselves (Nattiez and Abbate, 1990). The aim of this work is to statistically corroborate the intuitions of musical thinkers and practitioners starting at least with Plato, that music can express complex human concepts beyond merely “happy” and “sad” (Mattheson and Lenneberg, 1958). To do so, we ask whether musical improvisations can be used to classify the semantic category of the word that triggers them. We investigated two specific domains of semantics: morality and logic. While morality has been historically associated with music, logic concepts, which involve more abstract forms of thought, are more rarely associated with music. We examined musical improvisations inspired by positive and negative morality (e.g., good and evil) and logic concepts (true and false), analyzing the associations between these words and their musical representations in terms of acoustic and perceptual features. We found that music conveys information about valence (good and true vs. evil and false) with remarkable consistency across individuals. This information is carried by several musical dimensions which act in synergy to achieve very high classification accuracy. Positive concepts are represented by music with more ordered pitch structure and lower harmonic and sensorial dissonance than negative concepts. Music also conveys information indicating whether the word which triggered it belongs to the domains of logic or morality (true vs. good), principally through musical articulation. In summary, improvisations consistently map logic and morality information to specific musical dimensions, testifying to the capacity of music to accurately convey semantic information in domains related to abstract forms of thought. PMID:26191020

  10. mREST Interface Specification

    NASA Technical Reports Server (NTRS)

    McCartney, Patrick; MacLean, John

    2012-01-01

    mREST is an implementation of the REST architecture specific to the management and sharing of data in a system of logical elements. The purpose of this document is to clearly define the mREST interface protocol. The interface protocol covers all of the interaction between mREST clients and mREST servers. System-level requirements are not specifically addressed. In an mREST system, there are typically some backend interfaces between a Logical System Element (LSE) and the associated hardware/software system. For example, a network camera LSE would have a backend interface to the camera itself. These interfaces are specific to each type of LSE and are not covered in this document. There are also frontend interfaces that may exist in certain mREST manager applications. For example, an electronic procedure execution application may have a specialized interface for configuring the procedures. This interface would be application specific and outside of this document scope. mREST is intended to be a generic protocol which can be used in a wide variety of applications. A few scenarios are discussed to provide additional clarity but, in general, application-specific implementations of mREST are not specifically addressed. In short, this document is intended to provide all of the information necessary for an application developer to create mREST interface agents. This includes both mREST clients (mREST manager applications) and mREST servers (logical system elements, or LSEs).

  11. A biomimetic colorimetric logic gate system based on multi-functional peptide-mediated gold nanoparticle assembly.

    PubMed

    Li, Yong; Li, Wang; He, Kai-Yu; Li, Pei; Huang, Yan; Nie, Zhou; Yao, Shou-Zhuo

    2016-04-28

    In natural biological systems, proteins exploit various functional peptide motifs to exert target response and activity switch, providing a functional and logic basis for complex cellular activities. Building biomimetic peptide-based bio-logic systems is highly intriguing but remains relatively unexplored due to limited logic recognition elements and complex signal outputs. In this proof-of-principle work, we attempted to address these problems by utilizing multi-functional peptide probes and the peptide-mediated nanoparticle assembly system. Here, the rationally designed peptide probes function as the dual-target responsive element specifically responsive to metal ions and enzymes as well as the mediator regulating the assembly of gold nanoparticles (AuNPs). Taking advantage of Zn2+ ions and chymotrypsin as the model inputs of metal ions and enzymes, respectively, we constructed the peptide logic system computed by the multi-functional peptide probes and outputted by the readable colour change of AuNPs. In this way, the representative binary basic logic gates (AND, OR, INHIBIT, NAND, IMPLICATION) have been achieved by delicately coding the peptide sequence, demonstrating the versatility of our logic system. Additionally, we demonstrated that the three-input combinational logic gate (INHIBIT-OR) could also be successfully integrated and applied as a multi-tasking biosensor for colorimetric detection of dual targets. This nanoparticle-based peptide logic system presents a valid strategy to illustrate peptide information processing and provides a practical platform for executing peptide computing or peptide-related multiplexing sensing, implying that the controllable nanomaterial assembly is a promising and potent methodology for the advancement of biomimetic bio-logic computation.
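
    The binary gates listed above can be written out as ordinary Boolean functions; in the paper the inputs are chemical (e.g. the presence of Zn2+ ions or chymotrypsin) and the output is the readable colour change of the AuNP assembly. The wiring of the three-input combination as (a INHIBIT b) OR c is an assumption for illustration.

```python
# Truth-table sketch of the gates realized in the paper, with inputs modeled
# as booleans (presence/absence of a chemical input) and the return value
# standing in for the colour-change output. Which peptide sequence encodes
# which gate is specific to the paper and not reproduced here.

def AND(a, b): return a and b
def OR(a, b): return a or b
def INHIBIT(a, b): return a and not b        # fires only when a present, b absent
def NAND(a, b): return not (a and b)
def IMPLICATION(a, b): return (not a) or b

def inhibit_or(a, b, c):
    """Hypothetical three-input combinational gate: (a INHIBIT b) OR c."""
    return INHIBIT(a, b) or c

# INHIBIT fires only for the input pattern (1, 0):
print([INHIBIT(a, b) for a in (False, True) for b in (False, True)])
# -> [False, False, True, False]
```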

  12. Application of actor level social characteristic indicator selection for the precursory detection of bullies in online social networks

    NASA Astrophysics Data System (ADS)

    White, Holly M.; Fields, Jeremy; Hall, Robert T.; White, Joshua S.

    2016-05-01

    Bullying is a national problem for families, courts, schools, and the economy. Social, educational, and professional lives of victims are affected. Early detection of bullies mitigates the destructive effects of bullying. Our previous research found that, given specific characteristics of an actor, actor logics can be developed using input from natural language processing and graph analysis. Given similar characteristics of cyberbullies, in this paper we create specific actor logics and apply them to a selected social media dataset for the purpose of rapid identification of cyberbullying.

  13. Agent independent task planning

    NASA Technical Reports Server (NTRS)

    Davis, William S.

    1990-01-01

    Agent-Independent Planning is a technique that allows the construction of activity plans without regard to the agent that will perform them. Once generated, a plan is then validated and translated into instructions for a particular agent, whether a robot, crewmember, or software-based control system. Because Space Station Freedom (SSF) is planned for orbital operations for approximately thirty years, it will almost certainly experience numerous enhancements and upgrades, including upgrades in robotic manipulators. Agent-Independent Planning provides the capability to construct plans for SSF operations, independent of specific robotic systems, by combining techniques of object oriented modeling, nonlinear planning and temporal logic. Since a plan is validated using the physical and functional models of a particular agent, new robotic systems can be developed and integrated with existing operations in a robust manner. This technique also provides the capability to generate plans for crewmembers with varying skill levels, and later apply these same plans to more sophisticated robotic manipulators made available by evolutions in technology.

  14. Goals for Long-Range Research in Information Science.

    ERIC Educational Resources Information Center

    Pearson, Charls

    In order to discuss the research goals of information science (IS), both its logical and its specific nature must be determined. Peircean logical analysis shows that IS may be classified in three parts: pure science, applied science, and technology. The deficiency in the present state of the art is in the pure science, or theoretical portion, of…

  15. Person-Fit and the Rasch Model, with an Application to Knowledge of Logical Quantors.

    ERIC Educational Resources Information Center

    Molenaar, Ivo W.; Hoijtink, Herbert

    1996-01-01

    Some specific person-fit results for the Rasch model are presented, followed by an application to a test measuring knowledge of reasoning with logical quantors. Some issues are relevant to all attempts to use person-fit statistics in research, but the special role of the Rasch model is highlighted. (SLD)

  16. Redefining the "Public": Neoliberalism and the Corporate Appropriation of Public Education in Los Angeles

    ERIC Educational Resources Information Center

    Tate, Eliza

    2017-01-01

    This study examines how the discourse of the crisis and failing of public education creates space for the legitimization and deployment of neoliberal logic of education reform based on free market principles specifically in Los Angeles. It utilizes Critical Discourse Analysis to examine how these discourses and logic of neoliberal reform are…

  17. The Logic of Evaluative Argument. CSE Monograph Series in Evaluation, 7.

    ERIC Educational Resources Information Center

    House, Ernest R.

    Evaluation is an act of persuasion directed to a specific audience concerning the solution of a problem. The process of evaluation is prescribed by the nature of knowledge--which is generally complex, always uncertain (in varying degrees), and not always propositional--and by the nature of logic, which is always selective. In the process of…

  18. Dewey's Logic as a Methodological Grounding Point for Practitioner-Based Inquiry

    ERIC Educational Resources Information Center

    Demetrion, George

    2012-01-01

    The purpose of this essay is to draw out key insights from Dewey's important text "Logic: The Theory of Inquiry" to provide theoretical and practical support for the emergent field of teacher research. The specific focal point is the argument in Cochran-Smith and Lytle's "Inside/Outside: Teacher Research and Knowledge" on the significance of…

  19. Text Cohesion and Comprehension: A Comparison of Prose Analysis Systems.

    ERIC Educational Resources Information Center

    Varnhagen, Connie K.; Goldman, Susan R.

    To test three specific hypotheses about recall as a function of four categories of logical relations, a study was done to determine whether logical relations systems of prose analysis can be used to predict recall. Two descriptive passages of naturally occurring expository prose were used. Each text was parsed into 45 statements, consisting of…

  20. Are children with Specific Language Impairment competent with the pragmatics and logic of quantification?

    PubMed

    Katsos, Napoleon; Roqueta, Clara Andrés; Estevan, Rosa Ana Clemente; Cummins, Chris

    2011-04-01

    Specific Language Impairment (SLI) is understood to be a disorder that predominantly affects phonology, morphosyntax and/or lexical semantics. There is little conclusive evidence on whether children with SLI are challenged with regard to Gricean pragmatic maxims and on whether children with SLI are competent with the logical meaning of quantifying expressions. We use the comprehension of statements quantified with 'all', 'none', 'some', 'some…not', 'most' and 'not all' as a paradigm to study whether Spanish-speaking children with SLI are competent with the pragmatic maxim of informativeness, as well as with the logical meaning of these expressions. Children with SLI performed more poorly than a group of age-matched typically-developing peers, and both groups performed more poorly with pragmatics than with logical meaning. Moreover, children with SLI were disproportionately challenged by pragmatic meaning compared to their age-matched peers. However, the performance of children with SLI was comparable to that of a group of younger language-matched typically-developing children. The findings document that children with SLI do face difficulties with employing the maxim of informativeness, as well as with understanding the logical meaning of quantifiers, but also that these difficulties are in keeping with their overall language difficulties rather than exceeding them. The implications of these findings for SLI, linguistic theory, and clinical practice are discussed. Copyright © 2010 Elsevier B.V. All rights reserved.

  1. Linguistic Summarization of Video for Fall Detection Using Voxel Person and Fuzzy Logic

    PubMed Central

    Anderson, Derek; Luke, Robert H.; Keller, James M.; Skubic, Marjorie; Rantz, Marilyn; Aud, Myra

    2009-01-01

    In this paper, we present a method for recognizing human activity from linguistic summarizations of temporal fuzzy inference curves representing the states of a three-dimensional object called voxel person. A hierarchy of fuzzy logic is used, where the output from each level is summarized and fed into the next level. We present a two level model for fall detection. The first level infers the states of the person at each image. The second level operates on linguistic summarizations of voxel person’s states and inference regarding activity is performed. The rules used for fall detection were designed under the supervision of nurses to ensure that they reflect the manner in which elders perform these activities. The proposed framework is extremely flexible. Rules can be modified, added, or removed, allowing for per-resident customization based on knowledge about their cognitive and physical ability. PMID:20046216
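
    The two-level hierarchy described above can be sketched in a few lines; the membership functions and thresholds below are invented (the real system infers states from three-dimensional voxel-person features, not from a single height value).

```python
# Minimal sketch of a two-level fuzzy hierarchy for fall detection.
# Level 1 maps a per-frame feature (an assumed centroid height of the voxel
# person, in metres) to fuzzy state memberships; level 2 summarizes the state
# memberships over time and applies a fall rule.

def level1_states(height_m):
    """Fuzzy memberships for 'upright' and 'on the ground' from centroid height."""
    upright = max(0.0, min(1.0, (height_m - 0.5) / 0.7))    # ~1 above 1.2 m
    on_ground = max(0.0, min(1.0, (0.7 - height_m) / 0.5))  # ~1 below 0.2 m
    return {"upright": upright, "on_ground": on_ground}

def level2_fall(frames):
    """Linguistic summary: was the person upright, and then on the ground?"""
    states = [level1_states(h) for h in frames]
    was_upright = max(s["upright"] for s in states[: len(states) // 2])
    then_down = max(s["on_ground"] for s in states[len(states) // 2 :])
    return min(was_upright, then_down)   # fuzzy AND of the two summaries

heights = [1.5, 1.4, 0.9, 0.3, 0.1, 0.1]  # person collapses over six frames
print(level2_fall(heights) > 0.8)          # -> True (high confidence of a fall)
```

    Per-resident customization, as described above, amounts to editing the rule in `level2_fall` or the thresholds in `level1_states` without touching the rest of the pipeline.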

  2. WNN 92; Proceedings of the 3rd Workshop on Neural Networks: Academic/Industrial/NASA/Defense, Auburn Univ., AL, Feb. 10-12, 1992 and South Shore Harbour, TX, Nov. 4-6, 1992

    NASA Technical Reports Server (NTRS)

    Padgett, Mary L. (Editor)

    1993-01-01

    The present conference discusses such neural networks (NN) related topics as their current development status, NN architectures, NN learning rules, NN optimization methods, NN temporal models, NN control methods, NN pattern recognition systems and applications, biological and biomedical applications of NNs, VLSI design techniques for NNs, NN systems simulation, fuzzy logic, and genetic algorithms. Attention is given to missileborne integrated NNs, adaptive-mixture NNs, implementable learning rules, an NN simulator for travelling salesman problem solutions, similarity-based forecasting, NN control of hypersonic aircraft takeoff, NN control of the Space Shuttle Arm, an adaptive NN robot manipulator controller, a synthetic approach to digital filtering, NNs for speech analysis, adaptive spline networks, an anticipatory fuzzy logic controller, and encoding operations for fuzzy associative memories.

  3. The past, present and future of cyber-physical systems: a focus on models.

    PubMed

    Lee, Edward A

    2015-02-26

    This paper is about better engineering of cyber-physical systems (CPSs) through better models. Deterministic models have historically proven extremely useful and arguably form the kingpin of the industrial revolution and the digital and information technology revolutions. Key deterministic models that have proven successful include differential equations, synchronous digital logic and single-threaded imperative programs. Cyber-physical systems, however, combine these models in such a way that determinism is not preserved. Two projects show that deterministic CPS models with faithful physical realizations are possible and practical. The first project is PRET, which shows that the timing precision of synchronous digital logic can be practically made available at the software level of abstraction. The second project is Ptides (programming temporally-integrated distributed embedded systems), which shows that deterministic models for distributed cyber-physical systems have practical faithful realizations. These projects are existence proofs that deterministic CPS models are possible and practical.

  4. Heavy-Ion Microbeam Fault Injection into SRAM-Based FPGA Implementations of Cryptographic Circuits

    NASA Astrophysics Data System (ADS)

    Li, Huiyun; Du, Guanghua; Shao, Cuiping; Dai, Liang; Xu, Guoqing; Guo, Jinlong

    2015-06-01

    Transistors hit by heavy ions may conduct transiently, thereby introducing transient logic errors. Attackers can exploit these abnormal behaviors and extract sensitive information from electronic devices. This paper demonstrates an ion-irradiation fault injection attack on a cryptographic field-programmable gate array (FPGA) circuit. The experiment proved that the commercial FPGA chip is vulnerable to low linear energy transfer (LET) carbon irradiation, and that the attack can cause the leakage of secret key bits. A statistical model is established to estimate the probability of an effective fault injection attack on cryptographic integrated circuits. The model incorporates the temporal, spatial, and logical probabilities of an effective attack on the cryptographic circuits. The rate of successful attack calculated from the model conforms well to the experimental results. This quantitative success rate model can help evaluate security risk for designers as well as for third-party assessment organizations.
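
    The abstract does not reproduce the model itself; as a sketch, if a strike is effective only when it hits during a sensitive clock window (temporal), lands on a security-critical area of the die (spatial), and flips a logically observable bit (logical), and these conditions are assumed independent, the probabilities simply multiply. The numbers below are illustrative, not the paper's measurements.

```python
# Sketch of a multiplicative success-rate model for fault injection, under an
# independence assumption (not necessarily the paper's exact formulation).

def effective_attack_probability(p_temporal, p_spatial, p_logical):
    """Probability that a single ion strike produces an exploitable fault."""
    return p_temporal * p_spatial * p_logical

def expected_effective_faults(n_strikes, p_temporal, p_spatial, p_logical):
    """Expected number of exploitable faults over a campaign of strikes."""
    return n_strikes * effective_attack_probability(p_temporal, p_spatial, p_logical)

# E.g. 10% sensitive time window, 5% critical area, 50% observable flips:
p = effective_attack_probability(0.10, 0.05, 0.50)   # about 0.0025 per strike
print(p)
print(expected_effective_faults(10000, 0.10, 0.05, 0.50))
```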

  5. REVIEWS OF TOPICAL PROBLEMS: 21st century: what is life from the perspective of physics?

    NASA Astrophysics Data System (ADS)

    Ivanitskii, Genrikh R.

    2010-07-01

    The evolution of the biophysical paradigm over 65 years since the publication in 1944 of Erwin Schrödinger's What is Life? The Physical Aspects of the Living Cell is reviewed. Based on the advances in molecular genetics, it is argued that all the features characteristic of living systems can also be found in nonliving ones. Ten paradoxes in logic and physics are analyzed that allow defining life in terms of a spatial-temporal hierarchy of structures and combinatory probabilistic logic. From the perspective of physics, life can be defined as resulting from a game involving interactions of matter one part of which acquires the ability to remember the success (or failure) probabilities from the previous rounds of the game, thereby increasing its chances for further survival in the next round. This part of matter is currently called living matter.

  6. The Past, Present and Future of Cyber-Physical Systems: A Focus on Models

    PubMed Central

    Lee, Edward A.

    2015-01-01

    This paper is about better engineering of cyber-physical systems (CPSs) through better models. Deterministic models have historically proven extremely useful and arguably form the kingpin of the industrial revolution and the digital and information technology revolutions. Key deterministic models that have proven successful include differential equations, synchronous digital logic and single-threaded imperative programs. Cyber-physical systems, however, combine these models in such a way that determinism is not preserved. Two projects show that deterministic CPS models with faithful physical realizations are possible and practical. The first project is PRET, which shows that the timing precision of synchronous digital logic can be practically made available at the software level of abstraction. The second project is Ptides (programming temporally-integrated distributed embedded systems), which shows that deterministic models for distributed cyber-physical systems have practical faithful realizations. These projects are existence proofs that deterministic CPS models are possible and practical. PMID:25730486

  7. Fuzzy Logic Decoupled Lateral Control for General Aviation Airplanes

    NASA Technical Reports Server (NTRS)

    Duerksen, Noel

    1997-01-01

    It has been hypothesized that a human pilot uses the same set of generic skills to control a wide variety of aircraft. If this is true, then it should be possible to construct an electronic controller which embodies this generic skill set such that it can successfully control different airplanes without being matched to a specific airplane. In an attempt to create such a system, a fuzzy logic controller was devised to control aileron or roll spoiler position. This controller was used to control bank angle for both a piston-powered, single-engine, aileron-equipped airplane simulation and a business jet simulation which used spoilers for primary roll control. Overspeed, stall, and overbank protection were incorporated in the form of expert system supervisors and weighted fuzzy rules. It was found that by using the artificial intelligence techniques of fuzzy logic and expert systems, a generic lateral controller could be successfully used on two general aviation aircraft types that have very different characteristics. These controllers worked for both airplanes over their entire flight envelopes. The controllers for both airplanes were identical except for airplane-specific limits (maximum allowable airspeed, throttle lever travel, etc.). This research validated the fact that the same fuzzy-logic-based controller can control two very different general aviation airplanes. It also developed the basic controller architecture and the specific control parameters required for such a general controller.
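
    The controller's actual rule base is in the report; the sketch below only illustrates the architecture the abstract describes, namely generic fuzzy rules parameterized by airplane-specific limits. The membership functions and weights here are invented.

```python
# Sketch of a generic fuzzy lateral controller: the rule logic is shared, and
# only an airplane-specific limit (maximum surface deflection) differs.
# Memberships, weights, and the limit values are illustrative assumptions.

def fuzzy_roll_command(bank_error_deg, max_deflection_deg):
    """Map bank-angle error to an aileron/spoiler command via three fuzzy rules."""
    e = abs(bank_error_deg)
    # Triangular-style memberships for 'small', 'medium', 'large' error
    small = max(0.0, 1.0 - e / 10.0)
    large = max(0.0, min(1.0, (e - 10.0) / 20.0))
    medium = max(0.0, 1.0 - small - large)
    # Weighted (centroid-style) defuzzification onto the deflection range
    magnitude = (small * 0.0 + medium * 0.5 + large * 1.0) / (small + medium + large)
    sign = 1.0 if bank_error_deg >= 0 else -1.0
    return sign * magnitude * max_deflection_deg

# Same controller, two airplanes differing only in their limits:
print(fuzzy_roll_command(25.0, max_deflection_deg=20.0))  # aileron-equipped single
print(fuzzy_roll_command(25.0, max_deflection_deg=45.0))  # jet using roll spoilers
```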

  8. CellNOptR: a flexible toolkit to train protein signaling networks to data using multiple logic formalisms.

    PubMed

    Terfve, Camille; Cokelaer, Thomas; Henriques, David; MacNamara, Aidan; Goncalves, Emanuel; Morris, Melody K; van Iersel, Martijn; Lauffenburger, Douglas A; Saez-Rodriguez, Julio

    2012-10-18

    Cells process signals using complex and dynamic networks. Studying how this is performed in a context and cell type specific way is essential to understand signaling both in physiological and diseased situations. Context-specific medium/high throughput proteomic data measured upon perturbation is now relatively easy to obtain but formalisms that can take advantage of these features to build models of signaling are still comparatively scarce. Here we present CellNOptR, an open-source R software package for building predictive logic models of signaling networks by training networks derived from prior knowledge to signaling (typically phosphoproteomic) data. CellNOptR features different logic formalisms, from Boolean models to differential equations, in a common framework. These different logic model representations accommodate state and time values with increasing levels of detail. We provide in addition an interface via Cytoscape (CytoCopteR) to facilitate use and integration with Cytoscape network-based capabilities. Models generated with this pipeline have two key features. First, they are constrained by prior knowledge about the network but trained to data. They are therefore context and cell line specific, which results in enhanced predictive and mechanistic insights. Second, they can be built using different logic formalisms depending on the richness of the available data. Models built with CellNOptR are useful tools to understand how signals are processed by cells and how this is altered in disease. They can be used to predict the effect of perturbations (individual or in combinations), and potentially to engineer therapies that have differential effects/side effects depending on the cell type or context.

  9. CellNOptR: a flexible toolkit to train protein signaling networks to data using multiple logic formalisms

    PubMed Central

    2012-01-01

    Background Cells process signals using complex and dynamic networks. Studying how this is performed in a context and cell type specific way is essential to understand signaling both in physiological and diseased situations. Context-specific medium/high throughput proteomic data measured upon perturbation is now relatively easy to obtain but formalisms that can take advantage of these features to build models of signaling are still comparatively scarce. Results Here we present CellNOptR, an open-source R software package for building predictive logic models of signaling networks by training networks derived from prior knowledge to signaling (typically phosphoproteomic) data. CellNOptR features different logic formalisms, from Boolean models to differential equations, in a common framework. These different logic model representations accommodate state and time values with increasing levels of detail. We provide in addition an interface via Cytoscape (CytoCopteR) to facilitate use and integration with Cytoscape network-based capabilities. Conclusions Models generated with this pipeline have two key features. First, they are constrained by prior knowledge about the network but trained to data. They are therefore context and cell line specific, which results in enhanced predictive and mechanistic insights. Second, they can be built using different logic formalisms depending on the richness of the available data. Models built with CellNOptR are useful tools to understand how signals are processed by cells and how this is altered in disease. They can be used to predict the effect of perturbations (individual or in combinations), and potentially to engineer therapies that have differential effects/side effects depending on the cell type or context. PMID:23079107
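
    CellNOptR itself is an R/Bioconductor package; as a language-neutral illustration of the simplest formalism it supports, the following Python sketch evaluates a Boolean prior-knowledge network. The network, rules, and species names are invented, not taken from the package.

```python
# Sketch of a Boolean logic model of a signaling network: each node is updated
# synchronously from its regulators until the network settles. This is the
# kind of model a CellNOptR-style pipeline constrains with prior knowledge
# and trains against phosphoproteomic data.

def simulate(rules, inputs, steps=10):
    """Synchronously update a Boolean network for a fixed number of steps."""
    state = dict(inputs)
    for name in rules:
        state.setdefault(name, False)
    for _ in range(steps):
        # Build the next state from the current one (synchronous update).
        state = {**state, **{n: f(state) for n, f in rules.items()}}
    return state

# Invented prior-knowledge network: a ligand activates a kinase, which (unless
# an inhibitor is present) phosphorylates a transcription factor.
rules = {
    "kinase": lambda s: s["ligand"],
    "tf":     lambda s: s["kinase"] and not s["inhibitor"],
}

print(simulate(rules, {"ligand": True, "inhibitor": False})["tf"])  # -> True
print(simulate(rules, {"ligand": True, "inhibitor": True})["tf"])   # -> False
```

    Training, in this picture, means searching over which edges to keep so that simulated node values best match the measured data under each perturbation.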

  10. A reconfigurable NAND/NOR genetic logic gate

    PubMed Central

    2012-01-01

    Background Engineering genetic Boolean logic circuits is a major research theme of synthetic biology. By altering or introducing connections between genetic components, novel regulatory networks are built in order to mimic the behaviour of electronic devices such as logic gates. While electronics is a highly standardized science, genetic logic is still in its infancy, with few agreed standards. In this paper we focus on the interpretation of logical values in terms of molecular concentrations. Results We describe the results of computational investigations of a novel circuit that is able to trigger specific differential responses depending on the input standard used. The circuit can therefore be dynamically reconfigured (without modification) to serve as both a NAND/NOR logic gate. This multi-functional behaviour is achieved by a) varying the meanings of inputs, and b) using branch predictions (as in computer science) to display a constrained output. A thorough computational study is performed, which provides valuable insights for the future laboratory validation. The simulations focus on both single-cell and population behaviours. The latter give particular insights into the spatial behaviour of our engineered cells on a surface with a non-homogeneous distribution of inputs. Conclusions We present a dynamically-reconfigurable NAND/NOR genetic logic circuit that can be switched between modes of operation via a simple shift in input signal concentration. The circuit addresses important issues in genetic logic that will have significance for more complex synthetic biology applications. PMID:22989145

  11. A reconfigurable NAND/NOR genetic logic gate.

    PubMed

    Goñi-Moreno, Angel; Amos, Martyn

    2012-09-18

    Engineering genetic Boolean logic circuits is a major research theme of synthetic biology. By altering or introducing connections between genetic components, novel regulatory networks are built in order to mimic the behaviour of electronic devices such as logic gates. While electronics is a highly standardized science, genetic logic is still in its infancy, with few agreed standards. In this paper we focus on the interpretation of logical values in terms of molecular concentrations. We describe the results of computational investigations of a novel circuit that is able to trigger specific differential responses depending on the input standard used. The circuit can therefore be dynamically reconfigured (without modification) to serve as both a NAND/NOR logic gate. This multi-functional behaviour is achieved by a) varying the meanings of inputs, and b) using branch predictions (as in computer science) to display a constrained output. A thorough computational study is performed, which provides valuable insights for the future laboratory validation. The simulations focus on both single-cell and population behaviours. The latter give particular insights into the spatial behaviour of our engineered cells on a surface with a non-homogeneous distribution of inputs. We present a dynamically-reconfigurable NAND/NOR genetic logic circuit that can be switched between modes of operation via a simple shift in input signal concentration. The circuit addresses important issues in genetic logic that will have significance for more complex synthetic biology applications.
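
    One way to picture the reconfiguration described above (the threshold and transfer behaviour below are invented, not the paper's circuit): a fixed circuit that represses its output once the total input exceeds a threshold behaves as NAND or NOR depending on the concentration at which logical 1 is encoded, i.e. on the input standard in use.

```python
# Sketch: one fixed circuit, two input standards, two logic functions.
# Concentrations are in arbitrary units; the threshold 1.5 and the encoding
# levels 1.0 / 2.0 are illustrative assumptions.

def circuit(conc_a, conc_b):
    """Fixed circuit: output is ON until total input exceeds the threshold."""
    return (conc_a + conc_b) < 1.5

def run(bit_a, bit_b, one_level):
    """Encode logical 1 at the given concentration and read the output."""
    return circuit(bit_a * one_level, bit_b * one_level)

# Same circuit, reconfigured purely by shifting the input signal concentration:
nand = [run(a, b, one_level=1.0) for a in (0, 1) for b in (0, 1)]
nor = [run(a, b, one_level=2.0) for a in (0, 1) for b in (0, 1)]
print(nand)  # [True, True, True, False]  -> NAND truth table
print(nor)   # [True, False, False, False] -> NOR truth table
```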

  12. Continuous variables logic via coupled automata using a DNAzyme cascade with feedback.

    PubMed

    Lilienthal, S; Klein, M; Orbach, R; Willner, I; Remacle, F; Levine, R D

    2017-03-01

    The concentration of molecules can be changed by chemical reactions and thereby offer a continuous readout. Yet computer architecture is cast in textbooks in terms of binary valued, Boolean variables. To enable reactive chemical systems to compute we show how, using the Cox interpretation of probability theory, one can transcribe the equations of chemical kinetics as a sequence of coupled logic gates operating on continuous variables. It is discussed how the distinct chemical identity of a molecule allows us to create a common language for chemical kinetics and Boolean logic. Specifically, the logic AND operation is shown to be equivalent to a bimolecular process. The logic XOR operation represents chemical processes that take place concurrently. The values of the rate constants enter the logic scheme as inputs. By designing a reaction scheme with a feedback we endow the logic gates with a built in memory because their output then depends on the input and also on the present state of the system. Technically such a logic machine is an automaton. We report an experimental realization of three such coupled automata using a DNAzyme multilayer signaling cascade. A simple model verifies analytically that our experimental scheme provides an integrator generating a power series that is third order in time. The model identifies two parameters that govern the kinetics and shows how the initial concentrations of the substrates are the coefficients in the power series.
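
    In the probabilistic reading sketched above, with values normalized to [0, 1], AND corresponds to a bimolecular step whose rate is proportional to the product of the two concentrations, while XOR corresponds to two mutually exclusive concurrent processes. The mapping below is a generic illustration of continuous-variable logic, not the paper's DNAzyme cascade.

```python
# Logic on continuous variables in [0, 1]: AND as a product (a bimolecular
# process needs both species), XOR as the sum of two mutually exclusive
# concurrent processes. Both reduce to Boolean logic at the endpoints.

def p_and(pa, pb):
    return pa * pb

def p_xor(pa, pb):
    return pa * (1 - pb) + (1 - pa) * pb

# Boolean behaviour at the endpoints:
print(p_and(1.0, 1.0), p_and(1.0, 0.0))  # 1.0 0.0
print(p_xor(1.0, 0.0), p_xor(1.0, 1.0))  # 1.0 0.0
```

    With a feedback term added, the output at each step would also depend on the current state, which is what turns such a gate network into an automaton as described above.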

  13. Distinct frontal regions for processing sentence syntax and story grammar.

    PubMed

    Sirigu, A; Cohen, L; Zalla, T; Pradat-Diehl, P; Van Eeckhout, P; Grafman, J; Agid, Y

    1998-12-01

    Time is a fundamental dimension of cognition. It is expressed in the sequential ordering of individual elements in a wide variety of activities such as language, motor control or in the broader domain of long range goal-directed actions. Several studies have shown the importance of the frontal lobes in sequencing information. The question addressed in this study is whether this brain region hosts a single supramodal sequence processor, or whether separate mechanisms are required for different kinds of temporally organised knowledge structures such as syntax and action knowledge. Here we show that so-called agrammatic patients, with lesions in Broca's area, ordered word groups correctly to form a logical sequence of actions but they were severely impaired when similar word groups had to be ordered as a syntactically well-formed sentence. The opposite performance was observed in patients with dorsolateral prefrontal lesions, that is, while their syntactic processing was intact at the sentence level, they demonstrated a pronounced deficit in producing temporally coherent sequences of actions. Anatomical reconstruction of lesions from brain scans revealed that the sentence and action grammar deficits involved distinct, non-overlapping sites within the frontal lobes. Finally, in a third group of patients whose lesions encompassed both Broca's area and the prefrontal cortex, the two types of deficits were found. We conclude that sequence processing is specific to knowledge domains and involves different networks within the frontal lobes.

  14. Can data from disparate long-term fish monitoring programs be used to increase our understanding of regional and continental trends in large river assemblages?

    USGS Publications Warehouse

    Counihan, Timothy D.; Waite, Ian R.; Casper, Andrew F.; Ward, David L.; Sauer, Jennifer S.; Irwin, Elise R.; Chapman, Colin G.; Ickes, Brian; Paukert, Craig P.; Kosovich, John J.; Bayer, Jennifer M.

    2018-01-01

    Understanding trends in the diverse resources provided by large rivers will help balance tradeoffs among stakeholders and inform strategies to mitigate the effects of landscape scale stressors such as climate change and invasive species. Absent a cohesive coordinated effort to assess trends in important large river resources, a logical starting point is to assess our ability to draw inferences from existing efforts. In this paper, we use a common analytical framework to analyze data from five disparate fish monitoring programs to better understand the nature of spatial and temporal trends in large river fish assemblages. We evaluated data from programs that monitor fishes in the Colorado, Columbia, Illinois, Mississippi, and Tallapoosa rivers using non-metric dimensional scaling ordinations and associated tests to evaluate trends in fish assemblage structure and native fish biodiversity. Our results indicate that fish assemblages exhibited significant spatial and temporal trends in all five of the rivers. We also document native species diversity trends that were variable within and between rivers and generally more evident in rivers with higher species richness and programs of longer duration. We discuss shared and basin-specific landscape level stressors. Having a basic understanding of the nature and extent of trends in fish assemblages is a necessary first step towards understanding factors affecting biodiversity and fisheries in large rivers.

  15. Can data from disparate long-term fish monitoring programs be used to increase our understanding of regional and continental trends in large river assemblages?

    PubMed Central

    Waite, Ian R.; Casper, Andrew F.; Ward, David L.; Sauer, Jennifer S.; Irwin, Elise R.; Chapman, Colin G.; Ickes, Brian S.; Paukert, Craig P.; Kosovich, John J.; Bayer, Jennifer M.

    2018-01-01

    Understanding trends in the diverse resources provided by large rivers will help balance tradeoffs among stakeholders and inform strategies to mitigate the effects of landscape scale stressors such as climate change and invasive species. Absent a cohesive coordinated effort to assess trends in important large river resources, a logical starting point is to assess our ability to draw inferences from existing efforts. In this paper, we use a common analytical framework to analyze data from five disparate fish monitoring programs to better understand the nature of spatial and temporal trends in large river fish assemblages. We evaluated data from programs that monitor fishes in the Colorado, Columbia, Illinois, Mississippi, and Tallapoosa rivers using non-metric dimensional scaling ordinations and associated tests to evaluate trends in fish assemblage structure and native fish biodiversity. Our results indicate that fish assemblages exhibited significant spatial and temporal trends in all five of the rivers. We also document native species diversity trends that were variable within and between rivers and generally more evident in rivers with higher species richness and programs of longer duration. We discuss shared and basin-specific landscape level stressors. Having a basic understanding of the nature and extent of trends in fish assemblages is a necessary first step towards understanding factors affecting biodiversity and fisheries in large rivers. PMID:29364953

  16. An overview of computer viruses in a research environment

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1991-01-01

    The threat of attack by computer viruses is in reality a very small part of a much more general threat, specifically threats aimed at subverting computer security. Here, computer viruses are examined as a form of malicious logic in a research and development environment. A relation is drawn between the viruses and various models of security and integrity. Current research techniques aimed at controlling the threats posed to computer systems by computer viruses in particular, and malicious logic in general, are examined. Finally, the vulnerabilities of research and development systems that malicious logic and computer viruses may exploit are briefly examined.

  17. A Novel Reconfigurable Logic Unit Based on the DNA-Templated Potassium-Concentration-Dependent Supramolecular Assembly.

    PubMed

    Yang, Chunrong; Zou, Dan; Chen, Jianchi; Zhang, Linyan; Miao, Jiarong; Huang, Dan; Du, Yuanyuan; Yang, Shu; Yang, Qianfan; Tang, Yalin

    2018-03-15

    Plenty of molecular circuits with specific functions have been developed; however, logic units with reconfigurability, which could simplify the circuits and speed up the information process, are rarely reported. In this work, we designed a novel reconfigurable logic unit based on a DNA-templated, potassium-concentration-dependent, supramolecular assembly, which could respond to the input stimuli of H+ and K+. By inputting different concentrations of K+, the logic unit could implement three significant functions, including a half adder, a half subtractor, and a 2-to-4 decoder. Considering its reconfigurable ability and good performance, the novel prototypes developed here may serve as a promising proof of principle in molecular computers. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
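
    The record above realizes its three functions in chemistry; abstracting the supramolecular assembly away, the input-output behavior can be sketched as ordinary Boolean logic, with a mode signal playing the role of the K+ concentration that selects the function. The mode names and encoding below are illustrative assumptions, not the paper's.

    ```python
    # Sketch of a reconfigurable two-input logic unit: a "mode" input selects
    # half-adder, half-subtractor, or 2-to-4-decoder behavior, by analogy with
    # the K+ concentration selecting the function in the DNA-templated assembly.

    def half_adder(a, b):
        return {"sum": a ^ b, "carry": a & b}

    def half_subtractor(a, b):
        return {"diff": a ^ b, "borrow": (1 - a) & b}

    def decoder_2to4(a, b):
        outs = [0, 0, 0, 0]
        outs[(a << 1) | b] = 1          # one-hot output line
        return {"y": outs}

    # hypothetical mode labels standing in for K+ concentration regimes
    MODES = {"low_K": half_adder, "mid_K": half_subtractor, "high_K": decoder_2to4}

    def logic_unit(mode, a, b):
        return MODES[mode](int(a), int(b))
    ```

    For example, `logic_unit("low_K", 1, 1)` adds the inputs, while the same inputs under `"high_K"` drive a single decoder output line high.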

  18. A suggested approach to the selection of chemical and biological protective clothing--meeting industry and emergency response needs for protection against a variety of hazards.

    PubMed

    Stull, Jeffrey O

    2004-01-01

    The paper describes the development of a comprehensive decision logic for selection and use of biological and chemical protective clothing (BCPC). The decision logic recognizes the separate areas of BCPC use among emergency, biological, and chemical hazards. The proposed decision logic provides a system for type classifying BCPC in terms of its compliance with existing standards (for emergency applications), the overall clothing integrity, and the material barrier performance. Type classification is offered for garments, gloves, footwear, and eye/face protection devices. On the basis of multiple, but simply designed flowcharts, the type of BCPC appropriate for specific biological and chemical hazards can be selected. The decision logic also provides supplemental considerations for choosing appropriate BCPC features.

  19. Structural Embeddings: Mechanization with Method

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Rushby, John

    1999-01-01

    The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.

  20. Logical Differential Prediction Bayes Net, improving breast cancer diagnosis for older women.

    PubMed

    Nassif, Houssam; Wu, Yirong; Page, David; Burnside, Elizabeth

    2012-01-01

    Overdiagnosis is a phenomenon in which screening identifies cancer which may not go on to cause symptoms or death. Women over 65 who develop breast cancer bear the heaviest burden of overdiagnosis. This work introduces novel machine learning algorithms to improve diagnostic accuracy of breast cancer in aging populations. At the same time, we aim at minimizing unnecessary invasive procedures (thus decreasing false positives) and concomitantly addressing overdiagnosis. We develop a novel algorithm, Logical Differential Prediction Bayes Net (LDP-BN), that calculates the risk of breast disease based on mammography findings. LDP-BN uses Inductive Logic Programming (ILP) to learn relational rules, selects older-specific differentially predictive rules, and incorporates them into a Bayes Net, significantly improving its performance. In addition, LDP-BN offers valuable insight into the classification process, revealing novel older-specific rules that link mass presence to invasive disease, and calcification presence and lack of detectable mass to DCIS.

  1. A Default Temporal Logic for Regulatory Conformance Checking

    DTIC Science & Technology

    2008-04-01

    proofs. In Section 4.3, we provide an axiomatization using Fitting’s sequent calculus [25]. Completeness is proved in Section 4.4. We conclude, in...axiomatize RefL. 4.3 Sequent Calculus We use Fitting’s sequent calculus [25]. A sequent is a statement of the form Γ → ∆, where Γ and ∆ are finite sets of...T.D., Vail, M.W., Anton, A.I.: Towards regulatory compliance: Extracting rights and obligations to align requirements with regulations. In

  2. A Spatial and Temporal Characterization of the Background Neutron Environment at the Navy and Marine Corps Memorial Stadium

    DTIC Science & Technology

    2017-04-01

    developing informed survey protocols. Experimental Method The neutron detector used in this research was the Large Neutron Sensor (LNS), containing...are useful in planning, conducting, and assessing the utility and limitations of radiation surveys using current state-of-the-art portable or...34,000. In a security environment, where large public venues may be a target for terrorist activity, the ability to survey venues for radiological

  3. A Rewriting Framework and Logic for Activities Subject to Regulations

    DTIC Science & Technology

    2015-02-28

    Regulations may be imposed by multiple governmental agencies as well as by institutional policies and protocols. Due to the complexity of both regulations and...positive number and decrements it; (3) A 0-test ri instruction is a branching instruction leading to one state if ri contains zero and to another state... insurance scenario discussed in (LMS09). De Young et al. describe in (DGJ+10) the challenges of formally specifying the temporal properties of regulations

  4. Predictive computation of genomic logic processing functions in embryonic development

    PubMed Central

    Peter, Isabelle S.; Faure, Emmanuel; Davidson, Eric H.

    2012-01-01

    Gene regulatory networks (GRNs) control the dynamic spatial patterns of regulatory gene expression in development. Thus, in principle, GRN models may provide system-level, causal explanations of developmental process. To test this assertion, we have transformed a relatively well-established GRN model into a predictive, dynamic Boolean computational model. This Boolean model computes spatial and temporal gene expression according to the regulatory logic and gene interactions specified in a GRN model for embryonic development in the sea urchin. Additional information input into the model included the progressive embryonic geometry and gene expression kinetics. The resulting model predicted gene expression patterns for a large number of individual regulatory genes each hour up to gastrulation (30 h) in four different spatial domains of the embryo. Direct comparison with experimental observations showed that the model predictively computed these patterns with remarkable spatial and temporal accuracy. In addition, we used this model to carry out in silico perturbations of regulatory functions and of embryonic spatial organization. The model computationally reproduced the altered developmental functions observed experimentally. Two major conclusions are that the starting GRN model contains sufficiently complete regulatory information to permit explanation of a complex developmental process of gene expression solely in terms of genomic regulatory code, and that the Boolean model provides a tool with which to test in silico regulatory circuitry and developmental perturbations. PMID:22927416
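
    The Boolean modeling strategy the abstract describes — next-hour gene states computed from current states via regulatory logic, iterated over developmental time — can be sketched in miniature. The genes and rules below are invented for illustration; they are not the sea urchin GRN.

    ```python
    # Minimal synchronous Boolean-network sketch: each gene's next state is a
    # Boolean function of the current states, iterated step by step as in the
    # paper's hour-by-hour dynamic model. Genes and rules are hypothetical.

    RULES = {
        "geneA":  lambda s: s["signal"],                    # activated by an external signal
        "geneB":  lambda s: s["geneA"] and not s["geneC"],  # activated by A, repressed by C
        "geneC":  lambda s: s["geneB"],                     # downstream of geneB
        "signal": lambda s: s["signal"],                    # held constant
    }

    def step(state):
        """Apply every regulatory rule simultaneously to get the next state."""
        return {gene: bool(rule(state)) for gene, rule in RULES.items()}

    def simulate(state, hours):
        """Return the trajectory of states, one per time step."""
        trajectory = [state]
        for _ in range(hours):
            state = step(state)
            trajectory.append(state)
        return trajectory
    ```

    Starting from all genes off with the signal present, activation propagates down the cascade one step per iteration, which is the kind of spatiotemporal expression pattern the full model computes per embryonic domain.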

  5. Enzyme-Based Logic Gates and Networks with Output Signals Analyzed by Various Methods.

    PubMed

    Katz, Evgeny

    2017-07-05

    The paper overviews various methods that are used for the analysis of output signals generated by enzyme-based logic systems. The considered methods include optical techniques (optical absorbance, fluorescence spectroscopy, surface plasmon resonance), electrochemical techniques (cyclic voltammetry, potentiometry, impedance spectroscopy, conductivity measurements, use of field effect transistor devices, pH measurements), and various mechanoelectronic methods (using atomic force microscope, quartz crystal microbalance). Although each of the methods is well known for various bioanalytical applications, their use in combination with the biomolecular logic systems is rather new and sometimes not trivial. Many of the discussed methods have been combined with the use of signal-responsive materials to transduce and amplify biomolecular signals generated by the logic operations. Interfacing of biocomputing logic systems with electronics and "smart" signal-responsive materials allows logic operations to be extended to actuation functions; for example, stimulating molecular release and switchable features of bioelectronic devices, such as biofuel cells. The purpose of this review article is to emphasize the broad variability of the bioanalytical systems applied for signal transduction in biocomputing processes. All bioanalytical systems discussed in the article are exemplified with specific logic gates and multi-gate networks realized with enzyme-based biocatalytic cascades. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Formally specifying the logic of an automatic guidance controller

    NASA Technical Reports Server (NTRS)

    Guaspari, David

    1990-01-01

    The following topics are covered in viewgraph form: (1) the Penelope Project; (2) the logic of an experimental automatic guidance control system for a 737; (3) Larch/Ada specification; (4) some failures of informal description; (5) description of mode changes caused by switches; (6) intuitive description of window status (chosen vs. current); (7) design of the code; (8) and specifying the code.

  7. Retro-causation, Minimum Contradictions and Non-locality

    NASA Astrophysics Data System (ADS)

    Kafatos, Menas; Nassikas, Athanassios A.

    2011-11-01

    Retro-causation has been experimentally verified by Bem and proposed by Kafatos in the form of space-time non-locality in the quantum framework. Every theory includes, beyond its specific axioms, the principles of logical communication (logical language), through which it is defined. This communication obeys the Aristotelian logic (Classical Logic), the Leibniz Sufficient Reason Principle, and a hidden axiom, which basically states that there is an anterior-posterior relationship everywhere in communication. By means of a theorem discussed here, it can be proved that the communication mentioned implies contradictory statements, which can only be transcended through silence, i.e. the absence of any statements. Moreover, the breaking of silence is meaningful through the claim for minimum contradictions, which implies the existence of both a logical and an illogical dimension; contradictions refer to causality, implying its opposite, namely retro-causation, and to the anterior-posterior axiom, implying space-time non-locality. The purpose of this paper is to outline a framework accounting for retro-causation, through both purely theoretical and reality-based points of view.

  8. Separating Business Logic from Medical Knowledge in Digital Clinical Workflows Using Business Process Model and Notation and Arden Syntax.

    PubMed

    de Bruin, Jeroen S; Adlassnig, Klaus-Peter; Leitich, Harald; Rappelsberger, Andrea

    2018-01-01

    Evidence-based clinical guidelines have a major positive effect on the physician's decision-making process. Computer-executable clinical guidelines allow for automated guideline marshalling during a clinical diagnostic process, thus improving the decision-making process. Our objective was to implement a digital clinical guideline for the prevention of mother-to-child transmission of hepatitis B as a computerized workflow, thereby separating business logic from medical knowledge and decision-making. We used the Business Process Model and Notation language system Activiti for business logic and workflow modeling. Medical decision-making was performed by an Arden-Syntax-based medical rule engine, which is part of the ARDENSUITE software. We succeeded in creating an electronic clinical workflow for the prevention of mother-to-child transmission of hepatitis B, where institution-specific medical decision-making processes could be adapted without modifying the workflow business logic. Separation of business logic and medical decision-making results in more easily reusable electronic clinical workflows.

  9. Computing health quality measures using Informatics for Integrating Biology and the Bedside.

    PubMed

    Klann, Jeffrey G; Murphy, Shawn N

    2013-04-19

    The Health Quality Measures Format (HQMF) is a Health Level 7 (HL7) standard for expressing computable Clinical Quality Measures (CQMs). Creating tools to process HQMF queries in clinical databases will become increasingly important as the United States moves forward with its Health Information Technology Strategic Plan to Stages 2 and 3 of the Meaningful Use incentive program (MU2 and MU3). Informatics for Integrating Biology and the Bedside (i2b2) is one of the analytical databases used as part of the Office of the National Coordinator (ONC)'s Query Health platform to move toward this goal. Our goal is to integrate i2b2 with the Query Health HQMF architecture, to prepare for other HQMF use-cases (such as MU2 and MU3), and to articulate the functional overlap between i2b2 and HQMF. Therefore, we analyze the structure of HQMF, and then we apply this understanding to HQMF computation on the i2b2 clinical analytical database platform. Specifically, we develop a translator between two query languages, HQMF and i2b2, so that the i2b2 platform can compute HQMF queries. We use the HQMF structure of queries for aggregate reporting, which define clinical data elements and the temporal and logical relationships between them. We use the i2b2 XML format, which allows flexible querying of a complex clinical data repository in an easy-to-understand domain-specific language. The translator can represent nearly any i2b2-XML query as HQMF and execute in i2b2 nearly any HQMF query expressible in i2b2-XML. This translator is part of the freely available reference implementation of the QueryHealth initiative. We analyze limitations of the conversion and find it covers many, but not all, of the complex temporal and logical operators required by quality measures. HQMF is an expressive language for defining quality measures, and it will be important to understand and implement for CQM computation, in both meaningful use and population health. 
However, its current form might allow complexity that is intractable for current database systems (both in terms of implementation and computation). Our translator, which supports the subset of HQMF currently expressible in i2b2-XML, may represent the beginnings of a practical compromise. It is being pilot-tested in two Query Health demonstration projects, and it can be further expanded to balance computational tractability with the advanced features needed by measure developers.
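
    One kind of construct such a translator must handle is HQMF's temporal relationships between clinical data elements. A toy sketch of one common pattern (the function name, event names, and dates are hypothetical and unrelated to the actual HQMF or i2b2-XML schemas):

    ```python
    from datetime import date, timedelta

    # Toy evaluation of one quality-measure pattern, "event B occurs within
    # N days after event A" - the kind of temporal relationship a translator
    # between HQMF and i2b2-XML must express in both languages.

    def starts_within_after(events_a, events_b, days):
        """True if some B event falls strictly after some A event, within N days."""
        return any(
            a < b <= a + timedelta(days=days)
            for a in events_a
            for b in events_b
        )

    # hypothetical patient data: an encounter and two follow-up visits
    encounters = [date(2013, 1, 10)]
    followups = [date(2013, 1, 31), date(2013, 4, 2)]
    ```

    With these dates, a 30-day follow-up window is satisfied but a 14-day window is not; nesting and combining many such operators is what makes full HQMF hard to execute directly.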

  10. Computing Health Quality Measures Using Informatics for Integrating Biology and the Bedside

    PubMed Central

    Murphy, Shawn N

    2013-01-01

    Background The Health Quality Measures Format (HQMF) is a Health Level 7 (HL7) standard for expressing computable Clinical Quality Measures (CQMs). Creating tools to process HQMF queries in clinical databases will become increasingly important as the United States moves forward with its Health Information Technology Strategic Plan to Stages 2 and 3 of the Meaningful Use incentive program (MU2 and MU3). Informatics for Integrating Biology and the Bedside (i2b2) is one of the analytical databases used as part of the Office of the National Coordinator (ONC)’s Query Health platform to move toward this goal. Objective Our goal is to integrate i2b2 with the Query Health HQMF architecture, to prepare for other HQMF use-cases (such as MU2 and MU3), and to articulate the functional overlap between i2b2 and HQMF. Therefore, we analyze the structure of HQMF, and then we apply this understanding to HQMF computation on the i2b2 clinical analytical database platform. Specifically, we develop a translator between two query languages, HQMF and i2b2, so that the i2b2 platform can compute HQMF queries. Methods We use the HQMF structure of queries for aggregate reporting, which define clinical data elements and the temporal and logical relationships between them. We use the i2b2 XML format, which allows flexible querying of a complex clinical data repository in an easy-to-understand domain-specific language. Results The translator can represent nearly any i2b2-XML query as HQMF and execute in i2b2 nearly any HQMF query expressible in i2b2-XML. This translator is part of the freely available reference implementation of the QueryHealth initiative. We analyze limitations of the conversion and find it covers many, but not all, of the complex temporal and logical operators required by quality measures. 
Conclusions HQMF is an expressive language for defining quality measures, and it will be important to understand and implement for CQM computation, in both meaningful use and population health. However, its current form might allow complexity that is intractable for current database systems (both in terms of implementation and computation). Our translator, which supports the subset of HQMF currently expressible in i2b2-XML, may represent the beginnings of a practical compromise. It is being pilot-tested in two Query Health demonstration projects, and it can be further expanded to balance computational tractability with the advanced features needed by measure developers. PMID:23603227

  11. Computerized engineering logic for nuclear procurement and dedication processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tulay, M.P.

    1996-12-31

    In an attempt to better meet the needs of operations and maintenance organizations, many nuclear utility procurement engineering groups have simplified their procedures, developed on-line tools for performing the specification of replacement items, and developed relational databases containing part-level information necessary to automate the procurement process. Although these improvements have helped to reduce the engineering necessary to properly specify and accept/dedicate items for nuclear safety-related applications, a number of utilities have recognized that additional long-term savings can be realized by integrating a computerized logic to assist technical procurement engineering personnel. The most commonly used logic follows the generic processes contained in Electric Power Research Institute (EPRI) published guidelines. The processes are typically customized to some extent to accommodate each utility's organizational structure, operating procedures, and strategic goals. This paper will discuss a typical logic that integrates the technical evaluation, acceptance, and receipt inspection and testing processes. The logic this paper will describe has been successfully integrated at a growing number of nuclear utilities and has produced numerous positive results. The application of the logic ensures that utility-wide standards or procedures, common among multi-site utilities, are followed.

  12. A Logic for Reasoning About Time-Dependent Access Control Policies

    DTIC Science & Technology

    2008-05-20

    4.3.2 Filling Painkiller Prescriptions ... 4.3.3 A Homework Assignment Administration System...4.3.2 Filling Painkiller Prescriptions We now consider the specification of pharmacy policies for dispensing painkilling medications in ηL logic. To prevent addiction, painkillers are tightly regulated. A patient must submit a valid doctor’s prescription to the pharmacist and may only receive a few

  13. Trinary Associative Memory Would Recognize Machine Parts

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang; Awwal, Abdul Ahad S.; Karim, Mohammad A.

    1991-01-01

    Trinary associative memory combines merits and overcomes major deficiencies of unipolar and bipolar logics by combining them in three-valued logic that reverts to unipolar or bipolar binary selectively, as needed to perform specific tasks. Advantage of associative memory: one obtains access to all parts of it simultaneously on the basis of content, rather than address, of data. Consequently, used to exploit fully the parallelism and speed of optical computing.
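
    A minimal sketch of the three-valued idea: trits taking values -1, 0, +1 subsume the unipolar {0, 1} and bipolar {-1, +1} encodings, and lookup is by content rather than address. The encodings and the dot-product similarity measure below are illustrative assumptions, not the optical implementation.

    ```python
    # Trits take values -1, 0, +1 and can be "reverted" to unipolar or bipolar
    # binary views as a task requires; matching is content-addressable.

    def to_unipolar(trit):
        """0 stays off; +1 and -1 both map to 'on' in the unipolar view."""
        return 1 if trit != 0 else 0

    def to_bipolar(trit):
        """0 is treated as -1 in the bipolar view."""
        return 1 if trit > 0 else -1

    def best_match(stored, probe):
        """Content-addressable lookup: return the key of the closest pattern,
        scored by dot product against the probe."""
        score = lambda pattern: sum(a * b for a, b in zip(pattern, probe))
        return max(stored, key=lambda k: score(stored[k]))

    # hypothetical machine-part templates encoded as trit vectors
    parts = {"bolt": [1, -1, 0, 1], "nut": [-1, 1, 1, 0]}
    ```

    A probe vector retrieves whichever stored part pattern it most resembles, with no address involved; in the optical system this comparison happens across the whole memory in parallel.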

  14. An universal read-out controller

    NASA Astrophysics Data System (ADS)

    Manz, S.; Abel, N.; Gebelein, J.; Kebschull, U.

    2010-11-01

    Since 2007 we have been designing and developing a ROC (read-out controller) for FAIR's data acquisition. While our first implementation focused solely on the nXYTER, today we are also designing and implementing readout logic for the GET4, which is to be part of the ToF detector. Furthermore, we fully support both Ethernet and optical transport as two transparent solutions. Strict modularization of the read-out controller enables us to provide a universal ROC in which front-end-specific logic and transport logic can be combined in a very flexible way. Fault-tolerance techniques are required for only some of those modules and hence are implemented only there.

  15. A Mechanism of Modeling and Verification for SaaS Customization Based on TLA

    NASA Astrophysics Data System (ADS)

    Luan, Shuai; Shi, Yuliang; Wang, Haiyang

    With the gradual maturation of SOA and the rapid development of the Internet, SaaS has become a popular software service mode. Customization actions in SaaS are usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions, and then proposes a verification algorithm to ensure that each step in customization will not cause unpredictable effects on the system and will follow the related rules defined by the SaaS provider.
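
    In the TLA style this record builds on, customization steps are actions (guarded state transitions), and verification checks that every reachable state preserves a stated invariant. A minimal sketch with a hypothetical SaaS dependency rule (the action names and invariant are invented, not the paper's model):

    ```python
    # Each action is a (guard, effect) pair over a customization state; the
    # checker replays a sequence of customization steps and rejects any step
    # whose guard fails or whose result violates the invariant.

    ACTIONS = {
        # enabling a report requires the data module it depends on
        "enable_report": (lambda s: s["data_module"], lambda s: {**s, "report": True}),
        "enable_data":   (lambda s: True,             lambda s: {**s, "data_module": True}),
    }

    def invariant(s):
        # the report feature implies its dependency is present
        return not s["report"] or s["data_module"]

    def check(state, steps):
        """Return (True, None) if all steps are enabled and invariant-preserving,
        else (False, offending_step)."""
        for name in steps:
            guard, effect = ACTIONS[name]
            if not guard(state):
                return False, name      # action not enabled: customization rejected
            state = effect(state)
            if not invariant(state):
                return False, name
        return True, None
    ```

    Enabling the data module first and then the report passes, while enabling the report alone is rejected at the offending step, which mirrors catching a customization whose dependencies are unsatisfied.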

  16. Fuzzy Logic Decoupled Longitudinal Control for General Aviation Airplanes

    NASA Technical Reports Server (NTRS)

    Duerksen, Noel

    1996-01-01

    It has been hypothesized that a human pilot uses the same set of generic skills to control a wide variety of aircraft. If this is true, then it should be possible to construct an electronic controller which embodies this generic skill set such that it can successfully control different airplanes without being matched to a specific airplane. In an attempt to create such a system, a fuzzy logic controller was devised to control throttle position and another to control elevator position. These two controllers were used to control flight path angle and airspeed for both a piston powered single engine airplane simulation and a business jet simulation. Overspeed protection and stall protection were incorporated in the form of expert systems supervisors. It was found that by using the artificial intelligence techniques of fuzzy logic and expert systems, a generic longitudinal controller could be successfully used on two general aviation aircraft types that have very different characteristics. These controllers worked for both airplanes over their entire flight envelopes including configuration changes. The controllers for both airplanes were identical except for airplane specific limits (maximum allowable airspeed, throttle lever travel, etc.). The controllers also handled configuration changes without mode switching or knowledge of the current configuration. This research validated the fact that the same fuzzy logic based controller can control two very different general aviation airplanes. It also developed the basic controller architecture and specific control parameters required for such a general controller.
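
    The throttle side of such a controller can be sketched as a standard fuzzy inference loop: fuzzify the airspeed error with membership functions, fire the rules, and defuzzify by weighted average. The membership shapes and rule outputs below are illustrative, not the paper's tuned values.

    ```python
    # Minimal fuzzy throttle sketch: airspeed error (knots) is fuzzified by
    # triangular membership functions, each rule proposes a throttle change,
    # and the crisp output is the membership-weighted average.

    def tri(x, lo, peak, hi):
        """Triangular membership degree of x over (lo, peak, hi)."""
        if x <= lo or x >= hi:
            return 0.0
        return (x - lo) / (peak - lo) if x <= peak else (hi - x) / (hi - peak)

    # rules: (membership shape for the error, throttle change consequent)
    # "slow" -> add throttle, "on speed" -> hold, "fast" -> reduce throttle
    RULES = [((-20, -10, 0), +0.3), ((-5, 0, 5), 0.0), ((0, 10, 20), -0.3)]

    def throttle_change(error_kts):
        weights = [tri(error_kts, *shape) for shape, _ in RULES]
        total = sum(weights)
        if total == 0:
            return 0.0                  # error outside all membership supports
        return sum(w * out for w, (_, out) in zip(weights, RULES)) / total
    ```

    A 10-knot-slow error commands more throttle, zero error commands none, and intermediate errors blend the rules smoothly, which is what lets one rule base serve airplanes with different numeric limits.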

  17. Design and simulation of programmable relational optoelectronic time-pulse coded processors as base elements for sorting neural networks

    NASA Astrophysics Data System (ADS)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Lazareva, Maria V.

    2010-05-01

    In the paper we show that the biologically motivated concept of time-pulse encoding offers a set of advantages (a single methodological basis, universality, simplicity of tuning, learning, and programming, among others) in the creation and design of sensor systems with parallel input-output and processing for 2D-structure hybrid and next-generation neuro-fuzzy neurocomputers. We present design principles for programmable relational optoelectronic time-pulse-coded processors based on continuous logic, order logic, and temporal wave processes. We consider a structure that performs analog signal extraction and the sorting of analog and time-pulse-coded variables. We propose an optoelectronic realization of such a basic relational order-logic element, consisting of time-pulse-coded photoconverters (pulse-width and pulse-phase modulators) with direct and complementary outputs, a sorting network built from logic elements, and programmable commutation blocks. Through simulation and experimental research we estimate the technical parameters of devices and processors built on such base elements: optical input signal power of 0.2 - 20 uW, processing time of 1 - 10 us, supply voltage of 1 - 3 V, power consumption of 10 - 100 uW, extended functional possibilities, and learning capability. We discuss possible rules and principles for learning and for programmable tuning to a required function or relational operation, and the realization of hardware blocks for modifications of such processors. We show that sorting machines, neural networks, and hybrid data-processing systems with untraditional numerical systems and picture operands can be created on the basis of such quasi-universal, simple hardware blocks with flexible programmable tuning.

  18. Symbolic LTL Compilation for Model Checking: Extended Abstract

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Vardi, Moshe Y.

    2007-01-01

    In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering and examine their effects on performance metrics including processing time and scalability. Safety critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety critical systems which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
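
    For readers new to the area, the "check LTL formulas against behaviors" step can be illustrated with a toy evaluator over a single finite trace (finite-trace semantics; the abstract itself concerns symbolic automata construction, which this sketch does not attempt):

    ```python
    # Toy LTL evaluator over a finite trace. Formulas are nested tuples, e.g.
    # ("F", ("ap", "grant")); each trace state is a set of atomic propositions.

    def holds(f, trace, i=0):
        op = f[0]
        if op == "ap":  return f[1] in trace[i]
        if op == "not": return not holds(f[1], trace, i)
        if op == "and": return holds(f[1], trace, i) and holds(f[2], trace, i)
        if op == "X":   # neXt: true if a next state exists and satisfies f
            return i + 1 < len(trace) and holds(f[1], trace, i + 1)
        if op == "F":   # Finally: f holds at some remaining position
            return any(holds(f[1], trace, j) for j in range(i, len(trace)))
        if op == "G":   # Globally: f holds at every remaining position
            return all(holds(f[1], trace, j) for j in range(i, len(trace)))
        if op == "U":   # Until: f2 eventually holds, f1 holds at every step before
            return any(
                holds(f[2], trace, j) and all(holds(f[1], trace, k) for k in range(i, j))
                for j in range(i, len(trace)))
        raise ValueError(op)

    # a request is pending for two steps, then granted
    trace = [{"req"}, {"req"}, {"grant"}]
    ```

    On this trace, `F grant` and `req U grant` hold while `G req` fails; a model checker does the analogous check symbolically over all behaviors of a system model rather than one trace.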

  19. Authoring and verification of clinical guidelines: a model driven approach.

    PubMed

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and temporal-logic statements to be checked and verified regarding these specifications, making the verification process faster and cost-effective. Particularly, we use UML statecharts to represent the dynamics of guidelines and, based on this manually defined guideline specifications, we use a MDD-based tool chain to automatically process them to generate the input model of a model checker. The model checker takes the resulted model together with the specific guideline requirements, and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, particularly, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined based on a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. 
Copyright 2010 Elsevier Inc. All rights reserved.

  20. Collective network routing

    DOEpatents

    Hoenicke, Dirk

    2014-12-02

    Disclosed are a unified method and apparatus to classify, route, and process data packets injected into a network so that they belong to a plurality of logical networks, each implementing a specific flow of data on top of a common physical network. The method allows collectives of packets to be identified locally for local processing, such as the computation of the sum, difference, maximum, minimum, or other logical operations over the identified packet collective. Packets are injected together with a class attribute and an opcode attribute. Network routers employing the described method use the packet attributes to look up class-specific route information from a local route table, which contains the local incoming and outgoing directions as part of the specifically implemented global data flow of the particular virtual network.

  1. A Logical Process Calculus

    NASA Technical Reports Server (NTRS)

    Cleaveland, Rance; Luettgen, Gerald; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    This paper presents the Logical Process Calculus (LPC), a formalism that supports heterogeneous system specifications containing both operational and declarative subspecifications. Syntactically, LPC extends Milner's Calculus of Communicating Systems with operators from the alternation-free linear-time mu-calculus (LT(mu)). Semantically, LPC is equipped with a behavioral preorder that generalizes Hennessy's and DeNicola's must-testing preorder as well as LT(mu)'s satisfaction relation, while being compositional for all LPC operators. From a technical point of view, the new calculus is distinguished by the inclusion of: (1) both minimal and maximal fixed-point operators and (2) an unimplementability predicate on process terms, which tags inconsistent specifications. The utility of LPC is demonstrated by means of an example highlighting the benefits of heterogeneous system specification.

  2. Certifying Domain-Specific Policies

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Pressburger, Thomas; Rosu, Grigore; Koga, Dennis (Technical Monitor)

    2001-01-01

    Proof-checking code for compliance to safety policies potentially enables a product-oriented approach to certain aspects of software certification. To date, research has focused on generic, low-level programming-language properties such as memory type safety. In this paper we consider proof-checking higher-level domain-specific properties for compliance to safety policies. The paper first describes a framework related to abstract interpretation in which compliance to a class of certification policies can be efficiently calculated. Membership equational logic is shown to provide a rich logic for carrying out such calculations, including partiality, for certification. The architecture for a domain-specific certifier is described, followed by an implemented case study. The case study considers consistency of abstract variable attributes in code that performs geometric calculations in aerospace systems.

  3. Improving the use of health data for health system strengthening.

    PubMed

    Nutley, Tara; Reynolds, Heidi W

    2013-02-13

    Good quality and timely data from health information systems are the foundation of all health systems. However, too often data sit in reports, on shelves or in databases and are not sufficiently utilised in policy and program development, improvement, strategic planning and advocacy. Without specific interventions aimed at improving the use of data produced by information systems, health systems will never be fully able to meet the needs of the populations they serve. This paper employs a logic model to describe a pathway by which specific activities and interventions can strengthen the use of health data in decision making and ultimately strengthen the health system. The logic model was developed to provide a practical strategy for developing, monitoring and evaluating interventions to strengthen the use of data in decision making. The model draws on the collective strengths and similarities of previous work and adds to it by making specific recommendations about the interventions and activities most proximate to affecting the use of data in decision making. The model provides an organizing framework for how interventions and activities work to strengthen the systematic demand, synthesis, review, and use of data. The logic model and guidance are presented to facilitate widespread use and to enable improved data-informed decision making in program review and planning, advocacy, and policy development. Real-world examples from the literature support the feasible application of the activities outlined in the model. The logic model provides specific and comprehensive guidance to improve data demand and use. It can be used to design, monitor and evaluate interventions, and to improve demand for, and use of, data in decision making. As more interventions are implemented to improve the use of health data, those efforts need to be evaluated.

  4. Electron lithography STAR design guidelines. Part 3: The mosaic transistor array applied to custom microprocessors. Part 4: Stores logic arrays, SLAs implemented with clocked CMOS

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.

    1982-01-01

    The Mosaic Transistor Array is an extension of the STAR system developed by NASA which has dedicated field cells designed specifically for semicustom microprocessor applications. The Sandia radiation-hard bulk CMOS process is utilized in order to satisfy the requirements of space flights. A design philosophy is developed which utilizes the strengths and recognizes the weaknesses of the Sandia process. A style of circuitry is developed which incorporates the low power and high drive capability of CMOS. In addition, the density achieved is better than that for classic CMOS, although not as good as that for NMOS. The basic logic functions for a data path are designed with a compatible interface to the STAR grid system. In this manner either random logic or PLA-type structures can be utilized for the control logic.

  5. Rapid Prototyping of Application Specific Signal Processors (RASSP)

    DTIC Science & Technology

    1992-10-01

    as well as government, research and and COMPASS , and how the improved plan academic institutions. CFI believes that effective might fit in with the... Compass ). libraries for COTS parts Tools and standards would be strongly based on - Ease of Use VHDL in its latest form(s). Block 2 would take * Open...EDIF Comrcial Rel:wased * Logic Inc. capture for Proprietary boards graphical language Logic Compass Schematic Proprietary EDIF; Commercial Released

  6. Improving the Accuracy and Scalability of Discriminative Learning Methods for Markov Logic Networks

    DTIC Science & Technology

    2011-05-01

    9 2.2 Inductive Logic Programming and Aleph . . . . . . . . . . . . 10 2.3 MLNs and Alchemy ...positive examples. Aleph allows users to customize each of 10 these steps, and thereby supports a variety of specific algorithms. 2.3 MLNs and Alchemy An...tural motifs. By limiting the search to each unique motif, LSM is able to find good clauses in an efficient manner. Alchemy (Kok, Singla, Richardson

  7. Optimum ArFi laser bandwidth for 10nm node logic imaging performance

    NASA Astrophysics Data System (ADS)

    Alagna, Paolo; Zurita, Omar; Timoshkov, Vadim; Wong, Patrick; Rechtsteiner, Gregory; Baselmans, Jan; Mailfert, Julien

    2015-03-01

    Lithography process window (PW) and CD uniformity (CDU) requirements are being challenged with scaling across all device types. Aggressive PW and yield specifications put tight requirements on scanner performance, especially on focus budgets, resulting in complicated systems for focus control. In this study, an imec N10 Logic-type test vehicle was used to investigate the E95 bandwidth impact on six different Metal 1 Logic features. The imaging metrics that track the impact of light source E95 bandwidth on performance of hot spots are: process window (PW), line width roughness (LWR), and local critical dimension uniformity (LCDU). In the first section of this study, the impact of increasing E95 bandwidth was investigated to observe the lithographic process control response of the specified logic features. In the second section, a preliminary assessment of the impact of lower E95 bandwidth was performed. The impact of lower E95 bandwidth on local intensity variability was monitored through the CDU of line end features and the LWR power spectral density (PSD) of line/space patterns. The investigation found that the imec N10 test vehicle (with OPC optimized for a standard E95 bandwidth of 300 fm) features exposed at 200 fm showed pattern-specific responses, suggesting areas of potential interest for further investigation.

  8. Ordering Traces Logically to Identify Lateness in Message Passing Programs

    DOE PAGES

    Isaacs, Katherine E.; Gamblin, Todd; Bhatele, Abhinav; ...

    2015-03-30

    Event traces are valuable for understanding the behavior of parallel programs. However, automatically analyzing a large parallel trace is difficult, especially without a specific objective. We aid this endeavor by extracting a trace's logical structure, an ordering of trace events derived from happened-before relationships, while taking into account developer intent. Using this structure, we can calculate an operation's delay relative to its peers on other processes. The logical structure also serves as a platform for comparing and clustering processes as well as highlighting communication patterns in a trace visualization. We present an algorithm for determining this idealized logical structure from traces of message passing programs, and we develop metrics to quantify delays and differences among processes. We implement our techniques in Ravel, a parallel trace visualization tool that displays both logical and physical timelines. Rather than showing the duration of each operation, we display where delays begin and end, and how they propagate. Finally, we apply our approach to the traces of several message passing applications, demonstrating the accuracy of our extracted structure and its utility in analyzing these codes.
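
    The happened-before relationships underlying such a logical structure can be illustrated with Lamport logical clocks; this is a textbook simplification, not the paper's structure-extraction algorithm, and the event encoding is invented for illustration:

```python
from collections import defaultdict

def lamport_order(events):
    """Assign Lamport logical timestamps to message-passing events.
    events: (process, kind, peer_index) tuples in per-process program
    order; kind is 'send', 'recv' or 'local'. For 'recv', peer_index
    is the list index of the matching send event."""
    clock = defaultdict(int)
    stamps = []
    for proc, kind, peer in events:
        if kind == "recv":
            # a receive happens after its matching send
            clock[proc] = max(clock[proc], stamps[peer])
        clock[proc] += 1
        stamps.append(clock[proc])
    return stamps

# P0 sends (event 0); P1 does local work (event 1), then receives (event 2)
events = [(0, "send", None), (1, "local", None), (1, "recv", 0)]
print(lamport_order(events))  # [1, 1, 2]
```

    The receive is forced after the send in logical time even though both processes have done the same amount of local work, which is exactly the ordering information a physical timeline can obscure.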

  9. Short Term Load Forecasting with Fuzzy Logic Systems for power system planning and reliability-A Review

    NASA Astrophysics Data System (ADS)

    Holmukhe, R. M.; Dhumale, Mrs. Sunita; Chaudhari, Mr. P. S.; Kulkarni, Mr. P. P.

    2010-10-01

    Load forecasting is essential to the operation of electricity companies, enhancing the energy-efficient and reliable operation of the power system. Forecasting of load demand data forms an important component in planning generation schedules in a power system. The purpose of this paper is to identify issues in, and better methods for, load forecasting. We focus on fuzzy-logic-based short term load forecasting, giving an overview of the state of the art in the intelligent techniques employed for load forecasting in power system planning and reliability. A literature review has been conducted and the fuzzy logic method summarized to highlight the advantages and disadvantages of this technique. The proposed technique for fuzzy-logic-based forecasting identifies the specific day, uses the maximum and minimum temperature for that day, and finally lists the maximum temperature and peak load for that day. The results show that load forecasting under considerable changes in the temperature parameter is better handled by the fuzzy logic method than by other short term forecasting techniques.
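
    The scheme described (day type plus daily temperature mapped to peak load) amounts to a small fuzzy rule base. A toy sketch with a weighted-average defuzzification; the membership functions, rule consequents, and load values are invented for illustration, not taken from the paper:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def forecast_peak_load(t_max):
    """Sugeno-style weighted average over three temperature rules;
    consequents are illustrative peak loads in MW. Assumes t_max lies
    within the range covered by the membership functions."""
    rules = [
        (tri(t_max, 0, 10, 20), 900),    # cool day -> heating load
        (tri(t_max, 15, 22, 30), 700),   # mild day -> base load
        (tri(t_max, 25, 35, 45), 1100),  # hot day  -> cooling load
    ]
    num = sum(w * load for w, load in rules)
    den = sum(w for w, _ in rules)
    return num / den

print(forecast_peak_load(22.0))  # 700.0 (fully 'mild')
```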

  10. Using Ontologies to Formalize Services Specifications in Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Breitman, Karin Koogan; Filho, Aluizio Haendchen; Haeusler, Edward Hermann

    2004-01-01

    One key issue in multi-agent systems (MAS) is their ability to interact and exchange information autonomously across applications. To secure agent interoperability, designers must rely on a communication protocol that allows software agents to exchange meaningful information. In this paper we propose using ontologies as such a communication protocol. Ontologies capture the semantics of the operations and services provided by agents, allowing interoperability and information exchange in a MAS. Ontologies are a formal, machine-processable representation that captures the semantics of a domain and supports deriving meaningful information by way of logical inference. In our proposal we use a formal knowledge representation language (OWL) that translates into Description Logics (a subset of first-order logic), thus eliminating ambiguities and providing a solid base for machine-based inference. The main contribution of this approach is to make the requirements explicit and centralize the specification in a single document (the ontology itself), while providing a formal, unambiguous representation that can be processed by automated inference machines.

  11. A rule-based approach to model checking of UML state machines

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In the paper a new approach to formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.
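
    A rule-based logical model of this flavor can be pictured as guarded transition rules over state variables, with safety questions answered by reachability analysis. A toy explicit-state sketch (a real checker such as nuXmv works symbolically; the state machine and its rules here are invented for illustration):

```python
def reachable(initial, rules):
    """Explicit-state reachability over a rule-based model:
    rules is a list of (guard, action) pairs on immutable states."""
    seen, frontier = {initial}, [initial]
    while frontier:
        state = frontier.pop()
        for guard, action in rules:
            if guard(state):
                nxt = action(state)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return seen

# toy controller states: (mode, door_open); the requirement is that
# ('moving', True) is unreachable, i.e. it never moves with the door open
rules = [
    (lambda s: s == ("idle", False), lambda s: ("moving", False)),  # start
    (lambda s: s == ("moving", False), lambda s: ("idle", False)),  # stop
    (lambda s: s[0] == "idle", lambda s: ("idle", True)),           # open door
    (lambda s: s == ("idle", True), lambda s: ("idle", False)),     # close door
]
print(("moving", True) in reachable(("idle", False), rules))  # False
```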

  12. Inseparability of science history and discovery

    NASA Astrophysics Data System (ADS)

    Herndon, J. M.

    2010-04-01

    Science is very much a logical progression through time. Progressing along a logical path of discovery is rather like following a path through the wilderness. Occasionally the path splits, presenting a choice; the correct logical interpretation leads to further progress, the wrong choice leads to confusion. By considering deeply the relevant science history, one might begin to recognize past faltering in the logical progression of observations and ideas and, perhaps then, to discover new, more precise understanding. The following specific examples of science faltering are described from a historical perspective: (1) Composition of the Earth's inner core; (2) Giant planet internal energy production; (3) Physical impossibility of Earth-core convection and Earth-mantle convection, and; (4) Thermonuclear ignition of stars. For each example, a revised logical progression is described, leading, respectively, to: (1) Understanding the endo-Earth's composition; (2) The concept of nuclear georeactor origin of geo- and planetary magnetic fields; (3) The invalidation and replacement of plate tectonics; and, (4) Understanding the basis for the observed distribution of luminous stars in galaxies. These revised logical progressions clearly show the inseparability of science history and discovery. A different and more fundamental approach to making scientific discoveries than the frequently discussed variants of the scientific method is this: An individual ponders and through tedious efforts arranges seemingly unrelated observations into a logical sequence in the mind so that causal relationships become evident and new understanding emerges, showing the path for new observations, for new experiments, for new theoretical considerations, and for new discoveries. Science history is rich in "seemingly unrelated observations" just waiting to be logically and causally related to reveal new discoveries.

  13. Continuous variables logic via coupled automata using a DNAzyme cascade with feedback

    PubMed Central

    Lilienthal, S.; Klein, M.; Orbach, R.; Willner, I.; Remacle, F.

    2017-01-01

    The concentration of molecules can be changed by chemical reactions and thereby offers a continuous readout. Yet computer architecture is cast in textbooks in terms of binary-valued, Boolean variables. To enable reactive chemical systems to compute we show how, using the Cox interpretation of probability theory, one can transcribe the equations of chemical kinetics as a sequence of coupled logic gates operating on continuous variables. It is discussed how the distinct chemical identity of a molecule allows us to create a common language for chemical kinetics and Boolean logic. Specifically, the logic AND operation is shown to be equivalent to a bimolecular process. The logic XOR operation represents chemical processes that take place concurrently. The values of the rate constants enter the logic scheme as inputs. By designing a reaction scheme with feedback we endow the logic gates with a built-in memory, because their output then depends on the input and also on the present state of the system. Technically such a logic machine is an automaton. We report an experimental realization of three such coupled automata using a DNAzyme multilayer signaling cascade. A simple model verifies analytically that our experimental scheme provides an integrator generating a power series that is third order in time. The model identifies two parameters that govern the kinetics and shows how the initial concentrations of the substrates are the coefficients in the power series. PMID:28507669
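
    Under the probabilistic reading sketched in the abstract, a continuous AND corresponds to a bimolecular process (a product of normalized concentrations) and a continuous XOR to concurrent, overlap-discounted pathways. A minimal numerical sketch; these gate definitions are illustrative assumptions, not the paper's kinetic equations:

```python
def cont_and(x, y):
    # bimolecular process: rate proportional to the product
    # of the two normalized concentrations
    return x * y

def cont_xor(x, y):
    # concurrent processes: either contributes, joint occurrence discounted
    return x + y - 2 * x * y

# inputs are concentrations normalized to [0, 1]
print(cont_and(0.75, 0.5))  # 0.375
print(cont_xor(0.75, 0.5))  # 0.5
```

    At the Boolean corners (inputs 0 or 1) both gates reduce to the familiar truth tables, which is what lets the continuous and binary descriptions share one language.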

  14. Some comments on Dr Iglesias's paper, 'In vitro fertilisation: the major issues'.

    PubMed Central

    Mill, J M

    1986-01-01

    In an article in an earlier edition of the Journal of Medical Ethics (1) Dr Iglesias bases her analysis upon the mediaeval interpretation of Platonic metaphysics and Aristotelian logic as given by Aquinas. Propositional forms are applied to the analysis of experience. This results in a very abstract analysis. The essential connection of events and their changing temporal relationships are ignored. The dichotomy between body and soul is a central concept. The unchanging elements in experience are assumed to be more real than the actual world of experienced process. Such a view makes the analysis of the temporal factors in experience impossible. Its abstractness is quite unsuitable for the analysis of the ontological structure and development of the neonate from fertilisation to birth. A N Whitehead made the notion of organism central to his philosophy. He refused to place human experience outside nature, or admit dualism. His philosophy of organism is an attempt to uncover the essential elements connecting human experience with the physical and biological sciences. Time, change and process are, in his view, more real than the static abstractions obtainable by the use of the fallacy of misplaced concreteness. Use of the latter negates the essential connectedness of events and the importance of temporality and change (2). In this paper I argue that the embryo, being an organism, is not analysable in terms of thinghood. It is a process. To apply Aristotelian logical concepts to it is to distort the real nature of the datum. PMID:3959039

  15. A computer program for the generation of logic networks from task chart data

    NASA Technical Reports Server (NTRS)

    Herbert, H. E.

    1980-01-01

    The Network Generation Program (NETGEN), which creates logic networks from task chart data is presented. NETGEN is written in CDC FORTRAN IV (Extended) and runs in a batch mode on the CDC 6000 and CYBER 170 series computers. Data is input via a two-card format and contains information regarding the specific tasks in a project. From this data, NETGEN constructs a logic network of related activities with each activity having unique predecessor and successor nodes, activity duration, descriptions, etc. NETGEN then prepares this data on two files that can be used in the Project Planning Analysis and Reporting System Batch Network Scheduling program and the EZPERT graphics program.

  16. Quantum dot-based local field imaging reveals plasmon-based interferometric logic in silver nanowire networks.

    PubMed

    Wei, Hong; Li, Zhipeng; Tian, Xiaorui; Wang, Zhuoxian; Cong, Fengzi; Liu, Ning; Zhang, Shunping; Nordlander, Peter; Halas, Naomi J; Xu, Hongxing

    2011-02-09

    We show that the local electric field distribution of propagating plasmons along silver nanowires can be imaged by coating the nanowires with a layer of quantum dots, held off the surface of the nanowire by a nanoscale dielectric spacer layer. In simple networks of silver nanowires with two optical inputs, control of the optical polarization and phase of the input fields directs the guided waves to a specific nanowire output. The QD-luminescent images of these structures reveal that a complete family of phase-dependent, interferometric logic functions can be performed on these simple networks. These results show the potential for plasmonic waveguides to support compact interferometric logic operations.

  17. Gene Function Hypotheses for the Campylobacter jejuni Glycome Generated by a Logic-Based Approach

    PubMed Central

    Sternberg, Michael J.E.; Tamaddoni-Nezhad, Alireza; Lesk, Victor I.; Kay, Emily; Hitchen, Paul G.; Cootes, Adrian; van Alphen, Lieke B.; Lamoureux, Marc P.; Jarrell, Harold C.; Rawlings, Christopher J.; Soo, Evelyn C.; Szymanski, Christine M.; Dell, Anne; Wren, Brendan W.; Muggleton, Stephen H.

    2013-01-01

    Increasingly, experimental data on biological systems are obtained from several sources and computational approaches are required to integrate this information and derive models for the function of the system. Here, we demonstrate the power of a logic-based machine learning approach to propose hypotheses for gene function integrating information from two diverse experimental approaches. Specifically, we use inductive logic programming that automatically proposes hypotheses explaining the empirical data with respect to logically encoded background knowledge. We study the capsular polysaccharide biosynthetic pathway of the major human gastrointestinal pathogen Campylobacter jejuni. We consider several key steps in the formation of capsular polysaccharide consisting of 15 genes of which 8 have assigned function, and we explore the extent to which functions can be hypothesised for the remaining 7. Two sources of experimental data provide the information for learning—the results of knockout experiments on the genes involved in capsule formation and the absence/presence of capsule genes in a multitude of strains of different serotypes. The machine learning uses the pathway structure as background knowledge. We propose assignments of specific genes to five previously unassigned reaction steps. For four of these steps, there was an unambiguous optimal assignment of gene to reaction, and to the fifth, there were three candidate genes. Several of these assignments were consistent with additional experimental results. We therefore show that the logic-based methodology provides a robust strategy to integrate results from different experimental approaches and propose hypotheses for the behaviour of a biological system. PMID:23103756

  18. Gene function hypotheses for the Campylobacter jejuni glycome generated by a logic-based approach.

    PubMed

    Sternberg, Michael J E; Tamaddoni-Nezhad, Alireza; Lesk, Victor I; Kay, Emily; Hitchen, Paul G; Cootes, Adrian; van Alphen, Lieke B; Lamoureux, Marc P; Jarrell, Harold C; Rawlings, Christopher J; Soo, Evelyn C; Szymanski, Christine M; Dell, Anne; Wren, Brendan W; Muggleton, Stephen H

    2013-01-09

    Increasingly, experimental data on biological systems are obtained from several sources and computational approaches are required to integrate this information and derive models for the function of the system. Here, we demonstrate the power of a logic-based machine learning approach to propose hypotheses for gene function integrating information from two diverse experimental approaches. Specifically, we use inductive logic programming that automatically proposes hypotheses explaining the empirical data with respect to logically encoded background knowledge. We study the capsular polysaccharide biosynthetic pathway of the major human gastrointestinal pathogen Campylobacter jejuni. We consider several key steps in the formation of capsular polysaccharide consisting of 15 genes of which 8 have assigned function, and we explore the extent to which functions can be hypothesised for the remaining 7. Two sources of experimental data provide the information for learning-the results of knockout experiments on the genes involved in capsule formation and the absence/presence of capsule genes in a multitude of strains of different serotypes. The machine learning uses the pathway structure as background knowledge. We propose assignments of specific genes to five previously unassigned reaction steps. For four of these steps, there was an unambiguous optimal assignment of gene to reaction, and to the fifth, there were three candidate genes. Several of these assignments were consistent with additional experimental results. We therefore show that the logic-based methodology provides a robust strategy to integrate results from different experimental approaches and propose hypotheses for the behaviour of a biological system. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Sameness and the self: philosophical and psychological considerations

    PubMed Central

    Klein, Stanley B.

    2014-01-01

    In this paper I examine the concept of cross-temporal personal identity (diachronicity). This particular form of identity has vexed theorists for centuries—e.g., how can a person maintain a belief in the sameness of self over time in the face of continual psychological and physical change? I first discuss various forms of the sameness relation and the criteria that justify their application. I then examine philosophical and psychological treatments of personal diachronicity (for example, Locke's psychological connectedness theory; the role of episodic memory) and find each lacking on logical grounds, empirical grounds or both. I conclude that to achieve a successful resolution of the issue of the self as a temporal continuant we need to draw a sharp distinction between the feeling of the sameness of one's self and the evidence marshaled in support of that feeling. PMID:24523707

  20. Modeling and performance analysis of QoS data

    NASA Astrophysics Data System (ADS)

    Strzeciwilk, Dariusz; Zuberek, Włodzimierz M.

    2016-09-01

    The article presents the results of modeling and analysis of data transmission performance in systems that support quality of service. Models are designed and tested taking into account a multiservice network architecture, i.e. one supporting the transmission of data belonging to different traffic classes. Traffic shaping mechanisms based on Priority Queuing were studied, both with an integrated data source and with various generated data sources. The basic problems of QoS-supporting architectures and queuing systems are discussed. Models based on Petri nets, supported by temporal logics, were designed and built. Simulation tools were used to verify the traffic shaping mechanisms with the applied queuing algorithms. It is shown that temporal models of Petri nets can be effectively used in modeling and analyzing the performance of computer networks.
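
    The Priority Queuing discipline studied above can be sketched as a small event-driven simulation; this is a generic non-preemptive PQ model, not the paper's Petri net formulation, and the packet traffic is invented for illustration:

```python
import heapq

def simulate_pq(arrivals):
    """Non-preemptive Priority Queuing: at each service completion the
    highest-priority (lowest number) waiting packet is served next.
    arrivals: list of (arrival_time, priority, service_time) tuples.
    Returns the (priority, arrival_time) service order."""
    arrivals = sorted(arrivals)
    waiting, t, order, i = [], 0.0, [], 0
    while i < len(arrivals) or waiting:
        if not waiting and t < arrivals[i][0]:
            t = arrivals[i][0]                 # server idles until next arrival
        while i < len(arrivals) and arrivals[i][0] <= t:
            a, p, s = arrivals[i]
            heapq.heappush(waiting, (p, a, s))  # heap orders by priority first
            i += 1
        p, a, s = heapq.heappop(waiting)
        t += s
        order.append((p, a))
    return order

# two low-priority packets queued at t=0; a high-priority packet at t=1
print(simulate_pq([(0, 2, 2.0), (0, 2, 2.0), (1, 1, 1.0)]))
# [(2, 0), (1, 1), (2, 0)] -- the late high-priority packet overtakes
```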

  1. Generating Concise Rules for Human Motion Retrieval

    NASA Astrophysics Data System (ADS)

    Mukai, Tomohiko; Wakisaka, Ken-Ichi; Kuriyama, Shigeru

    This paper proposes a method for retrieving human motion data with concise retrieval rules based on the spatio-temporal features of motion appearance. Our method first converts each motion clip into a clausal language that represents geometrical relations between body parts and their temporal relationships. A retrieval rule is then learned from a set of manually classified examples using inductive logic programming (ILP). ILP automatically discovers the essential rule in the same clausal form with a user-defined hypothesis-testing procedure. All motions are indexed using this clausal language, and the desired clips are retrieved by subsequence matching using the rule. Such rule-based retrieval offers reasonable performance, and the rule can be intuitively edited in the same language form. Consequently, our method enables efficient and flexible search over a large dataset with a simple query language.

  2. Training Signaling Pathway Maps to Biochemical Data with Constrained Fuzzy Logic: Quantitative Analysis of Liver Cell Responses to Inflammatory Stimuli

    PubMed Central

    Morris, Melody K.; Saez-Rodriguez, Julio; Clarke, David C.; Sorger, Peter K.; Lauffenburger, Douglas A.

    2011-01-01

    Predictive understanding of cell signaling network operation based on general prior knowledge but consistent with empirical data in a specific environmental context is a current challenge in computational biology. Recent work has demonstrated that Boolean logic can be used to create context-specific network models by training proteomic pathway maps to dedicated biochemical data; however, the Boolean formalism is restricted to characterizing protein species as either fully active or inactive. To advance beyond this limitation, we propose a novel form of fuzzy logic sufficiently flexible to model quantitative data but also sufficiently simple to efficiently construct models by training pathway maps on dedicated experimental measurements. Our new approach, termed constrained fuzzy logic (cFL), converts a prior knowledge network (obtained from literature or interactome databases) into a computable model that describes graded values of protein activation across multiple pathways. We train a cFL-converted network to experimental data describing hepatocytic protein activation by inflammatory cytokines and demonstrate the application of the resultant trained models for three important purposes: (a) generating experimentally testable biological hypotheses concerning pathway crosstalk, (b) establishing capability for quantitative prediction of protein activity, and (c) prediction and understanding of the cytokine release phenotypic response. Our methodology systematically and quantitatively trains a protein pathway map summarizing curated literature to context-specific biochemical data. This process generates a computable model yielding successful prediction of new test data and offering biological insight into complex datasets that are difficult to fully analyze by intuition alone. PMID:21408212
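
    The step from Boolean gates to graded activities in cFL can be pictured with a normalized transfer function in place of a truth table. A minimal sketch, assuming a Hill-type transfer function and a min-based AND combination; the parameter values are invented and this is not the trained model from the paper:

```python
def hill(x, k=0.5, n=3):
    """Normalized Hill-type transfer function: maps a graded input
    activity in [0, 1] to a graded output activity in [0, 1],
    scaled so that hill(1.0) == 1.0."""
    return x**n * (1 + k**n) / (k**n + x**n)

def cfl_and(a, b):
    # graded AND gate: the weaker upstream signal limits the output,
    # which is then shaped by the transfer function
    return hill(min(a, b))

print(cfl_and(1.0, 0.5))  # 0.5625 with these illustrative parameters
```

    A Boolean model could only report the downstream node as on or off; here a half-activated input yields a graded output that can be fit against quantitative proteomic measurements.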

  3. Programmable Logic Application Notes

    NASA Technical Reports Server (NTRS)

    Katz, Richard

    1998-01-01

    This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter's column will include some announcements and some recent radiation test results and evaluations of interest. Specifically, the following topics will be covered: the Military and Aerospace Applications of Programmable Devices and Technologies Conference to be held at GSFC in September, 1998, proton test results, and some total dose results.

  4. A specification of 3D manipulation in virtual environments

    NASA Technical Reports Server (NTRS)

    Su, S. Augustine; Furuta, Richard

    1994-01-01

    In this paper we discuss the modeling of three basic kinds of 3-D manipulations in the context of a logical hand device and our virtual panel architecture. The logical hand device is a useful software abstraction representing hands in virtual environments. The virtual panel architecture is the 3-D component of the 2-D window systems. Both of the abstractions are intended to form the foundation for adaptable 3-D manipulation.

  5. The Forensic Potential of Flash Memory

    DTIC Science & Technology

    2009-09-01

    limit range of 10 to 100 years before data is lost [12]. 5. Flash Memory Logical Structure The logical structure of flash memory from least to...area is not standardized and is manufacturer specific. This information will be used by the wear leveling algorithms and as such will be proprietary...memory cells, the manufacturers of the flash implement a wear leveling algorithm . In contrast, a magnetic disk in an overwrite operation will reuse the

  6. Trust Management and Security in Satellite Telecommand Processing

    DTIC Science & Technology

    2011-03-24

    include XREP, NICE, and P-Grid. These systems aggregate the perception of entities in the system to calculate a local reputation value for a specific...peripheral used is a Universal Asynchronous Receiver Transmitter (UART) which is connected to a Recommended Standard 232 (RS232) transceiver onboard [49...satellite, a logic analyzer was connected to monitor UART signals on the test board. The logic analyzer used for this testing was a USBee ZX module

  7. Neuropsychological functioning and brain energetics of drug resistant mesial temporal lobe epilepsy patients.

    PubMed

    Osório, Camila Moreira; Latini, Alexandra; Leal, Rodrigo Bainy; de Oliveira Thais, Maria Emília Rodrigues; Vascouto, Helena Dresch; Remor, Aline Pertile; Lopes, Mark William; Linhares, Marcelo Neves; Ben, Juliana; de Paula Martins, Roberta; Prediger, Rui Daniel; Hoeller, Alexandre Ademar; Markowitsch, Hans Joachim; Wolf, Peter; Lin, Kátia; Walz, Roger

    2017-12-01

    Interictal hypometabolism is commonly measured by 18-fluoro-deoxyglucose Positron Emission Tomography (FDG-PET) in the temporal lobe of patients with mesial temporal lobe epilepsy (MTLE-HS). Left temporal lobe interictal FDG-PET hypometabolism has been associated with verbal memory impairment, while right temporal lobe FDG-PET hypometabolism is associated with nonverbal memory impairment. The biochemical mechanisms involved in these findings remain unknown. In comparison to healthy controls (n=21), surgically treated patients with MTLE-HS (n=32, left side=17) had significantly lower scores in the Rey Auditory Verbal Learning Test (RAVLT retention and delayed), Logical Memory II (LMII), Boston Naming test (BNT), Letter Fluency and Category Fluency. We investigated whether enzymatic activities of the mitochondrial enzymes Complex I (C I), Complex II (C II), Complex IV (C IV) and Succinate Dehydrogenase (SDH) from the resected samples of the middle temporal neocortex (mTCx), amygdala (AMY) and hippocampus (HIP) were associated with performance in the RAVLT, LMII, BNT and fluency tests of our patients. After controlling for the side of hippocampus sclerosis, years of education, disease duration, antiepileptic treatment and seizure outcome after surgery, no independent associations were observed between the cognitive test scores and the analyzed mitochondrial enzymatic activities (p>0.37). Results indicate that the memory and language impairments observed in MTLE-HS patients are not strongly associated with the levels of mitochondrial CI, CII, SDH and C IV enzymatic activities in the temporal lobe structures ipsilateral to the HS lesion. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Reflections on discrimination and health in India.

    PubMed

    Srivatsan, R

    2015-01-01

    This is a speculative paper on the structure of caste-based discrimination in India. It sketches the field by a) proposing four empirical and historical examples of discrimination in different medical situations; b) suggesting an analytical framework composed of domain, register, temporality and intensity of discrimination; c) proposing that in the Indian historical context, discrimination masks itself, hiding its character behind the veneer of secular ideas; d) arguing that discrimination is not some unfortunate residue of backwardness in modern society that will go away, but is the force of social hierarchy transforming itself into a fully modern capitalist culture. The paper then arrives at the understanding that discrimination is pandemic across India. The conclusion suggests that in India today, we need proposals, hypotheses and arguments that help us establish the ethical framework for meaningful empirical research that sociological studies of medical ethics and the epidemiology of discrimination can pursue. Its method is that of logical and speculative argument based on experience, with examples of different forms of discrimination to clarify the point being made. No specific research was undertaken for this purpose since the paper is not empirically based.

  9. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification.

    PubMed

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for modeling and verification of time properties. Combining the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. In terms of the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can also be verified using existing formal tools. For the real-time specifications of the software system, we also proposed a generating algorithm for temporal logic formulas, which can automatically extract real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of the UAV flight control system.
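
    The paper's TLSC-to-formula algorithm is not given in the abstract; as a minimal sketch of the kind of real-time property such formulas express, the following checks a bounded-response requirement (every trigger is followed by a response within a deadline) over a discrete-time trace. The proposition names are hypothetical.

```python
def bounded_response(trace, trigger, response, deadline):
    """Check G(trigger -> F<=deadline response): every tick whose
    proposition set contains `trigger` must be followed, within
    `deadline` ticks (counting the same tick), by `response`.
    `trace` is a list of sets of atomic propositions, one per tick."""
    for i, props in enumerate(trace):
        if trigger in props:
            window = trace[i:i + deadline + 1]
            if not any(response in p for p in window):
                return False
    return True
```

    For example, bounded_response([{"cmd"}, set(), {"ack"}], "cmd", "ack", 2) holds, because the acknowledgment arrives two ticks after the command.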

  11. Severe anterograde amnesia with onset in childhood as a result of anoxic encephalopathy.

    PubMed

    Broman, M; Rose, A L; Hotson, G; Casey, C M

    1997-03-01

    Our patient (M.S.) had an abrupt onset of amnesia due to a respiratory arrest at the age of 8 years and has been followed by one of us (A.L.R.) for 19 years. A specially designed MRI study indicated that the neuroanatomical localization of his lesion is restricted to the hippocampal formation bilaterally. Comparison of M.S.'s present IQ and academic scores with earlier scores revealed that his literacy skills, certain basic language functions and vocabulary development were arrested by his memory disorder. In contrast, development of mathematical skill was less curtailed, and verbal and nonverbal logical abilities developed to adult levels. Neuropsychological examination at the age of 27 years elicited a pattern of memory deficits similar to those found in a case (H.M.) of known mesial temporal lobe damage in adulthood. The neuropsychological pattern revealed those aspects of cognitive development that do, and those that do not, require intact memory. The limitations to intellectual development imposed by severe amnesia in childhood are not pervasive, but rather, are limited to specific types of abilities.

  12. Technologies for Controlled, Local Delivery of siRNA

    PubMed Central

    Sarett, Samantha M.; Nelson, Christopher E.; Duvall, Craig L.

    2015-01-01

    The discovery of RNAi in the late 1990s unlocked a new realm of therapeutic possibilities by enabling potent and specific silencing of theoretically any desired genetic target. Better elucidation of the mechanism of action, the impact of chemical modifications that stabilize and reduce nonspecific effects of siRNA molecules, and the key design considerations for effective delivery systems has spurred progress toward developing clinically-successful siRNA therapies. A logical aim for initial siRNA translation is local therapies, as delivering siRNA directly to its site of action helps to ensure that a sufficient dose reaches the target tissue, lessens the potential for off-target side effects, and circumvents the substantial systemic delivery barriers. While topical siRNA delivery has progressed into numerous clinical trials, an enormous opportunity also exists to develop sustained-release, local delivery systems that enable both spatial and temporal control of gene silencing. This review focuses on material platforms that establish both localized and controlled gene silencing, with emphasis on the systems that show most promise for clinical translation. PMID:26476177

  13. In situ amplification of intracellular microRNA with MNAzyme nanodevices for multiplexed imaging, logic operation, and controlled drug release.

    PubMed

    Zhang, Penghui; He, Zhimei; Wang, Chen; Chen, Jiangning; Zhao, Jingjing; Zhu, Xuena; Li, Chen-Zhong; Min, Qianhao; Zhu, Jun-Jie

    2015-01-27

    MicroRNAs (miRNAs), as key regulators in gene expression networks, have participated in many biological processes, including cancer initiation, progression, and metastasis, indicative of potential diagnostic biomarkers and therapeutic targets. To tackle the low abundance of miRNAs in a single cell, we have developed programmable nanodevices with MNAzymes to realize stringent recognition and in situ amplification of intracellular miRNAs for multiplexed detection and controlled drug release. As a proof of concept, miR-21 and miR-145, respectively up- and down-expressed in most tumor tissues, were selected as endogenous cancer indicators and therapy triggers to test the efficacy of the photothermal nanodevices. The sequence programmability and specificity of MNAzyme motifs enabled the fluorescent turn-on probes not only to sensitively profile the distributions of miR-21/miR-145 in cell lysates of HeLa, HL-60, and NIH 3T3 (9632/0, 14147/0, 2047/421 copies per cell, respectively) but also to visualize trace amounts of miRNAs in a single cell, allowing logic operation for graded cancer risk assessment and dynamic monitoring of therapy response by confocal microscopy and flow cytometry. Furthermore, through general molecular design, the MNAzyme motifs could serve as three-dimensional gatekeepers to lock the doxorubicin inside the nanocarriers. The drug nanocarriers were exclusively internalized into the target tumor cells via aptamer-guided recognition and reopened by the endogenous miRNAs, where the drug release rates could be spatiotemporally controlled by the modulation of miRNA expression. Integrated with miRNA profiling techniques, the designed nanodevices can provide a general strategy for disease diagnosis, prognosis, and combination treatment with chemotherapy and gene therapy.

  14. Mapping and improving frequency, accuracy, and interpretation of land cover change: Classifying coastal Louisiana with 1990, 1993, 1996, and 1999 Landsat Thematic Mapper image data

    USGS Publications Warehouse

    Nelson, G.; Ramsey, Elijah W.; Rangoonwala, A.

    2005-01-01

    Landsat Thematic Mapper images and collateral data sources were used to classify the land cover of the Mermentau River Basin within the chenier coastal plain and the adjacent uplands of Louisiana, USA. Land-cover classes followed those of the National Oceanic and Atmospheric Administration's Coastal Change Analysis Program; however, classification methods needed to be developed to meet these national standards. Our first classification was limited to the Mermentau River Basin (MRB) in southcentral Louisiana, and the years 1990, 1993, and 1996. To overcome problems due to class spectral inseparability, spatial and spectral continuums, mixed land covers, and abnormal transitions, we separated the coastal area into regions of commonality and applied masks to specific land mixtures. Over the three years and 14 land-cover classes (aggregating the cultivated land and grassland, and water and floating vegetation classes), overall accuracies ranged from 82% to 90%. To enhance land-cover change interpretation, three indicators were introduced: Location Stability, Residence Stability, and Turnover. Implementing methods substantiated in the multiple-date MRB classification, we spatially extended the classification to the entire Louisiana coast and temporally extended the original 1990, 1993, 1996 classifications to 1999 (Figure 1). We also advanced the operational functionality of the classification and increased the credibility of change detection results. The increased operational functionality, which reduced required user input, was for the most part gained by implementing a classification logic based on forbidden transitions. The logic detected and corrected misclassifications and mostly alleviated the necessity of subregion separation prior to the classification. The new methods provided an improved ability for more timely detection and response to land-cover impact. © 2005 IEEE.
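
    The paper's transition table is not reproduced in the abstract; the sketch below illustrates the general idea of a forbidden-transition correction over a per-pixel sequence of dated class labels. The class names and the forbidden pairs are hypothetical.

```python
# Hypothetical land-cover classes and forbidden one-step transitions;
# the paper's actual transition rules are not reproduced here.
FORBIDDEN = {("water", "forest"), ("developed", "marsh")}

def correct_sequence(labels):
    """Walk a per-pixel sequence of dated class labels and suppress any
    forbidden transition by carrying the previous label forward, a
    simple stand-in for the paper's misclassification correction."""
    fixed = [labels[0]]
    for nxt in labels[1:]:
        fixed.append(fixed[-1] if (fixed[-1], nxt) in FORBIDDEN else nxt)
    return fixed
```

    A water pixel momentarily labeled forest is treated as a misclassification and held at water, while plausible transitions pass through unchanged.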

  15. SOME IS NOT ENOUGH: QUANTIFIER COMPREHENSION IN CORTICOBASAL SYNDROME AND BEHAVIORAL VARIANT FRONTOTEMPORAL DEMENTIA

    PubMed Central

    Morgan, Brianna; Gross, Rachel; Clark, Robin; Dreyfuss, Michael; Boller, Ashley; Camp, Emily; Liang, Tsao-Wei; Avants, Brian; McMillan, Corey; Grossman, Murray

    2011-01-01

    Quantifiers are very common in everyday speech, but we know little about their cognitive basis or neural representation. The present study examined comprehension of three classes of quantifiers that depend on different cognitive components in patients with focal neurodegenerative diseases. Patients evaluated the truth-value of a sentence containing a quantifier relative to a picture illustrating a small number of familiar objects, and performance was related to MRI grey matter atrophy using voxel-based morphometry. We found that patients with corticobasal syndrome (CBS) and posterior cortical atrophy (PCA) are significantly impaired in their comprehension of Cardinal Quantifiers (e.g. “At least three birds are on the branch”), due in part to their deficit in quantity knowledge. MRI analyses related this deficit to temporal-parietal atrophy found in CBS/PCA. We also found that patients with behavioral variant frontotemporal dementia (bvFTD) are significantly impaired in their comprehension of Logical Quantifiers (e.g. “Some of the birds are on the branch”), associated with a simple form of perceptual logic, and this correlated with their deficit on executive measures. This deficit was related to disease in rostral prefrontal cortex in bvFTD. These patients were also impaired in their comprehension of Majority Quantifiers (e.g. “At least half of the birds are on the branch”), and this too was correlated with their deficit on executive measures. This was related to disease in the basal ganglia interrupting a frontal-striatal loop critical for executive functioning. These findings suggest that a large-scale frontal-parietal neural network plays a crucial role in quantifier comprehension, and that comprehension of specific classes of quantifiers may be selectively impaired in patients with focal neurodegenerative conditions in these areas. PMID:21930136
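
    The three quantifier classes used in the study have direct formal counterparts. Assuming the usual set-theoretic semantics, a toy evaluation over a bird-on-branch scene looks like:

```python
def cardinal_at_least(k, items, pred):
    """Cardinal quantifier: 'At least k ... are ...'."""
    return sum(1 for x in items if pred(x)) >= k

def logical_some(items, pred):
    """Logical quantifier: 'Some ... are ...'."""
    return any(pred(x) for x in items)

def majority_at_least_half(items, pred):
    """Majority quantifier: 'At least half of ... are ...'."""
    return 2 * sum(1 for x in items if pred(x)) >= len(items)

# Scene: five birds, two of them on the branch.
birds = [True, True, False, False, False]
on_branch = lambda b: b
```

    The classes differ computationally: "some" needs only one witness, "at least three" needs exact counting, and "at least half" needs a count compared against the set size, mirroring the graded cognitive demands the study reports.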

  16. Short-term prediction of solar energy in Saudi Arabia using automated-design fuzzy logic systems

    PubMed Central

    2017-01-01

    Solar energy is considered one of the main sources of renewable energy in the near future. However, solar energy and other renewable energy sources have a drawback related to the difficulty of predicting their availability in the near future. This problem affects optimal exploitation of solar energy, especially in connection with other resources. Therefore, reliable solar energy prediction models are essential to solar energy management and economics. This paper presents work aimed at designing reliable models to predict the global horizontal irradiance (GHI) for the next day at 8 stations in Saudi Arabia. The designed models are based on computational intelligence methods of automated-design fuzzy logic systems. The fuzzy logic systems are designed and optimized with two models using fuzzy c-means clustering (FCM) and simulated annealing (SA) algorithms. The first model uses FCM based on the subtractive clustering algorithm to automatically design the predictor fuzzy rules from data. The second model uses FCM followed by a simulated annealing algorithm to enhance the prediction accuracy of the fuzzy logic system. The objective of the predictor is to accurately predict next-day global horizontal irradiance (GHI) using previous-day meteorological and solar radiation observations. The proposed models use observations of 10 variables of measured meteorological and solar radiation data to build the model. The experimentation and results of the prediction are detailed, where the root mean square error of the prediction was approximately 88% for the second model tuned by simulated annealing, compared to 79.75% accuracy using the first model. These results demonstrate a good modeling accuracy of the second model, despite the training and testing of the proposed models being carried out using spatially and temporally independent data. PMID:28806754

  17. Programmed optoelectronic time-pulse coded relational processor as base element for sorting neural networks

    NASA Astrophysics Data System (ADS)

    Krasilenko, Vladimir G.; Bardachenko, Vitaliy F.; Nikolsky, Alexander I.; Lazarev, Alexander A.

    2007-04-01

    In the paper we show that the biologically motivated concept of time-pulse encoding offers a number of advantages (a single methodological basis, universality, simplicity of tuning, training and programming, etc.) in the creation and design of sensor systems with parallel input-output and processing, and of 2D structures for hybrid and neuro-fuzzy neurocomputers of the next generations. We show the principles of construction of programmable relational optoelectronic time-pulse coded processors, based on continuous logic, order logic and temporal wave processes. We consider a structure that extracts the analog signal of a set grade (order) and sorts analog and time-pulse coded variables. We offer an optoelectronic realization of such base relational elements of order logic, consisting of time-pulse coded phototransformers (pulse-width and pulse-phase modulators) with direct and complementary outputs, a sorting network built from logical elements, and programmable commutation blocks. By simulation and experimental research we estimate the basic technical parameters of such base devices and the processors built on them: optical input signal power of 0.200-20 μW, processing time on the order of microseconds, supply voltage of 1.5-10 V, consumption power of hundreds of microwatts per element, extended functional possibilities, and training possibilities. We discuss some aspects of possible rules and principles for training and programmable tuning to a required function or relational operation, and the realization of hardware blocks for modifications of such processors. We show that, on the basis of such quasi-universal hardware blocks and flexible programmable tuning, it is possible to create sorting machines, neural networks and hybrid data-processing systems with untraditional numerical systems and picture operands.
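
    The sorting networks mentioned above are built from compare-exchange elements. A software model of a standard optimal 4-input sorting network (not the authors' specific optoelectronic design) is:

```python
# A standard optimal 4-input sorting network, expressed as a list of
# compare-exchange stages (pairs of wire indices).
NETWORK_4 = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]

def sort4(values):
    """Run four values through the network; each compare-exchange
    element swaps its pair of wires into ascending order."""
    v = list(values)
    for i, j in NETWORK_4:
        if v[i] > v[j]:
            v[i], v[j] = v[j], v[i]
    return v
```

    In the optoelectronic setting each compare-exchange element is a relational order-logic cell operating on time-pulse coded signals; the fixed, data-independent wiring is what makes the design attractive for parallel hardware.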

  18. Short-term prediction of solar energy in Saudi Arabia using automated-design fuzzy logic systems.

    PubMed

    Almaraashi, Majid

    2017-01-01

    Solar energy is considered one of the main sources of renewable energy in the near future. However, solar energy and other renewable energy sources have a drawback related to the difficulty of predicting their availability in the near future. This problem affects optimal exploitation of solar energy, especially in connection with other resources. Therefore, reliable solar energy prediction models are essential to solar energy management and economics. This paper presents work aimed at designing reliable models to predict the global horizontal irradiance (GHI) for the next day at 8 stations in Saudi Arabia. The designed models are based on computational intelligence methods of automated-design fuzzy logic systems. The fuzzy logic systems are designed and optimized with two models using fuzzy c-means clustering (FCM) and simulated annealing (SA) algorithms. The first model uses FCM based on the subtractive clustering algorithm to automatically design the predictor fuzzy rules from data. The second model uses FCM followed by a simulated annealing algorithm to enhance the prediction accuracy of the fuzzy logic system. The objective of the predictor is to accurately predict next-day global horizontal irradiance (GHI) using previous-day meteorological and solar radiation observations. The proposed models use observations of 10 variables of measured meteorological and solar radiation data to build the model. The experimentation and results of the prediction are detailed, where the root mean square error of the prediction was approximately 88% for the second model tuned by simulated annealing, compared to 79.75% accuracy using the first model. These results demonstrate a good modeling accuracy of the second model, despite the training and testing of the proposed models being carried out using spatially and temporally independent data.
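
    The FCM rule-design stage is omitted here; as a minimal sketch of the simulated-annealing tuning stage, the following anneals the parameters of a toy one-rule predictor against sample (today, tomorrow) GHI pairs. The model form, step size, and linear cooling schedule are illustrative assumptions, not the paper's.

```python
import math
import random

def rmse(params, data):
    """Error of a toy one-rule predictor: GHI_tomorrow ~ a * GHI_today + b."""
    a, b = params
    return math.sqrt(sum((a * x + b - y) ** 2 for x, y in data) / len(data))

def anneal(data, steps=2000, temp0=1.0, seed=0):
    """Simulated annealing over the two predictor parameters."""
    rng = random.Random(seed)
    cur = [1.0, 0.0]
    cur_err = rmse(cur, data)
    best, best_err = list(cur), cur_err
    for s in range(steps):
        temp = temp0 * (1 - s / steps) + 1e-9      # linear cooling
        cand = [p + rng.gauss(0, 0.1) for p in cur]
        err = rmse(cand, data)
        # accept improvements always, worse moves with Boltzmann probability
        if err < cur_err or rng.random() < math.exp((cur_err - err) / temp):
            cur, cur_err = cand, err
        if err < best_err:
            best, best_err = list(cand), err
    return best, best_err
```

    In the paper's second model the annealer would instead perturb the membership-function parameters of the FCM-derived fuzzy rules, but the accept/reject loop is the same.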

  19. The Priority Inversion Problem and Real-Time Symbolic Model Checking

    DTIC Science & Technology

    1993-04-23

    real time systems unpredictable in subtle ways. This makes it more difficult to implement and debug such systems. Our work discusses this problem and presents one possible solution. The solution is formalized and verified using temporal logic model checking techniques. In order to perform the verification, the BDD-based symbolic model checking algorithm given in previous works was extended to handle real-time properties using the bounded until operator. We believe that this algorithm, which is based on discrete time, is able to handle many real-time properties
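
    The bounded until operator over discrete time can be evaluated by k iterations of the standard backward fixpoint. A sketch with explicit Python sets standing in for the report's BDD-based symbolic representation (state space and labeling are hypothetical):

```python
def eu_bounded(states, succ, phi, psi, k):
    """States satisfying E[phi U<=k psi]: start from the psi-states and
    repeatedly add phi-states with a successor already in the set.
    `succ` maps each state to its set of successor states."""
    sat = {s for s in states if psi(s)}
    for _ in range(k):
        sat |= {s for s in states if phi(s) and succ[s] & sat}
    return sat
```

    With BDDs the same iteration is performed on characteristic functions rather than enumerated sets, which is what makes the symbolic algorithm scale.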

  20. Toward a Social Theory of Sexual Risk Behavior Among Men in the Armed Services: Understanding the Military Occupational Habitus

    DTIC Science & Technology

    2013-10-08

    make sense of and act on the risk of bodily harm with regard to their own sexual behaviors. We conclude by outlining our theoretical concept so that it...occupation. Pierre Bourdieu’s conception of the habitus is useful for developing a framework to ‘‘uncover the bodily and cultural logic of epidemiologically...imposes itself at the deepest level of the bodily dispositions through a particular way of regulating the use of time, the temporal distribution of

  1. The ANMLite Language and Logic for Specifying Planning Problems

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Siminiceanu, Radu I.; Munoz, Cesar A.

    2007-01-01

    We present the basic concepts of the ANMLite planning language. We discuss various aspects of specifying a plan in terms of constraints and checking the existence of a solution with the help of a model checker. The constructs of the ANMLite language have been kept as simple as possible in order to reduce complexity and simplify the verification problem. We illustrate the language with a specification of the space shuttle crew activity model that was constructed under the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project. The main purpose of this study was to explore the implications of choosing a robust logic behind the specification of constraints, rather than simply proposing a new planning language.

  2. Exploration of picture grammars, grammar learning, and inductive logic programming for image understanding

    NASA Astrophysics Data System (ADS)

    Ducksbury, P. G.; Kennedy, C.; Lock, Z.

    2003-09-01

    Grammars have been used for the formal specification of programming languages, and a number of commercial products now use grammars. However, these have tended to focus mainly on flow-control type applications. In this paper, we consider the potential use of picture grammars and inductive logic programming in generic image understanding applications, such as object recognition. A number of issues are considered, such as what type of grammar needs to be used, how to construct the grammar with its associated attributes, and difficulties encountered in parsing grammars, followed by issues of automatically learning grammars using a genetic algorithm. The concept of inductive logic programming is then introduced as a method that can overcome some of the earlier difficulties.
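
    The parsing difficulties mentioned can be illustrated with a CYK recognizer for a grammar in Chomsky normal form, here over strings of terminal symbols rather than picture primitives (an illustrative simplification of the picture-grammar setting):

```python
def cyk(word, grammar, start="S"):
    """CYK recognizer. `grammar` maps each nonterminal to a list of
    productions, each a 1-tuple (terminal) or 2-tuple (nonterminals)."""
    n = len(word)
    # table[i][length] = set of nonterminals deriving word[i:i+length]
    table = [[set() for _ in range(n + 1)] for _ in range(n)]
    for i, ch in enumerate(word):
        for lhs, prods in grammar.items():
            if (ch,) in prods:
                table[i][1].add(lhs)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            for split in range(1, span):
                for lhs, prods in grammar.items():
                    for p in prods:
                        if (len(p) == 2 and p[0] in table[i][split]
                                and p[1] in table[i + split][span - split]):
                            table[i][span].add(lhs)
    return start in table[0][n]
```

    The cubic cost of this table construction, multiplied across candidate grammars, is one source of the difficulty the paper notes in learning grammars with a genetic algorithm.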

  3. Swapnaushadhi: the embedded logic of dreams and medical innovation in Bengal.

    PubMed

    Mukharji, Projit Bihari

    2014-09-01

    Numerous medicines in South Asia have their origins in dreams. Deities, saints and other supernatural beings frequently appear in dreams to instruct dreamers about specific remedies, therapeutic techniques, modes of care etc. These therapies challenge available models of historicising dreams. Once we overcome these challenges and unearth the embedded logic of these dreams, we begin to discern in them a dynamic institution that enabled and sustained therapeutic change within a 'traditional' medical milieu.

  4. An Analysis of Categorical and Quantitative Methods for Planning Under Uncertainty

    PubMed Central

    Langlotz, Curtis P.; Shortliffe, Edward H.

    1988-01-01

    Decision theory and logical reasoning are both methods for representing and solving medical decision problems. We analyze the usefulness of these two approaches to medical therapy planning by establishing a simple correspondence between decision theory and non-monotonic logic, a formalization of categorical logical reasoning. The analysis indicates that categorical approaches to planning can be viewed as comprising two decision-theoretic concepts: probabilities (degrees of belief in planning hypotheses) and utilities (degrees of desirability of planning outcomes). We present and discuss examples of the following lessons from this decision-theoretic view of categorical (nonmonotonic) reasoning: (1) Decision theory and artificial intelligence techniques are intended to solve different components of the planning problem. (2) When considered in the context of planning under uncertainty, nonmonotonic logics do not retain the domain-independent characteristics of classical logical reasoning for planning under certainty. (3) Because certain nonmonotonic programming paradigms (e.g., frame-based inheritance, rule-based planning, protocol-based reminders) are inherently problem-specific, they may be inappropriate to employ in the solution of certain types of planning problems. We discuss how these conclusions affect several current medical informatics research issues, including the construction of “very large” medical knowledge bases.
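
    The correspondence described, in which a categorical default rule silently compresses both a probability and a utility, can be made concrete with a toy treatment-planning example (the numbers are hypothetical):

```python
def expected_utility(outcomes):
    """`outcomes`: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

# A categorical default ("typically, treat") hides both the probability
# of the exception and its cost; the decision-theoretic view makes
# both explicit so the trade-off can be examined.
plans = {
    "treat":   [(0.9, 10), (0.1, -50)],
    "observe": [(1.0, 0)],
}
best = max(plans, key=lambda a: expected_utility(plans[a]))
```

    Changing either the exception probability or the exception cost can flip the preferred plan, which is exactly the problem-specific sensitivity the paper argues nonmonotonic rules cannot express.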

  5. Metacognition and abstract reasoning.

    PubMed

    Markovits, Henry; Thompson, Valerie A; Brisson, Janie

    2015-05-01

    The nature of people's meta-representations of deductive reasoning is critical to understanding how people control their own reasoning processes. We conducted two studies to examine whether people have a metacognitive representation of abstract validity and whether familiarity alone acts as a separate metacognitive cue. In Study 1, participants were asked to make a series of (1) abstract conditional inferences, (2) concrete conditional inferences with premises having many potential alternative antecedents and thus specifically conducive to the production of responses consistent with conditional logic, or (3) concrete problems with premises having relatively few potential alternative antecedents. Participants gave confidence ratings after each inference. Results show that confidence ratings were positively correlated with logical performance on abstract problems and concrete problems with many potential alternatives, but not with concrete problems with content less conducive to normative responses. Confidence ratings were higher with few alternatives than for abstract content. Study 2 used a generation of contrary-to-fact alternatives task to improve levels of abstract logical performance. The resulting increase in logical performance was mirrored by increases in mean confidence ratings. Results provide evidence for a metacognitive representation based on logical validity, and show that familiarity acts as a separate metacognitive cue.

  6. Logical Fallacies and the Abuse of Climate Science: Fire, Water, and Ice

    NASA Astrophysics Data System (ADS)

    Gleick, P. H.

    2012-12-01

    Good policy without good science and analysis is unlikely. Good policy with bad science is even more unlikely. Unfortunately, there is a long history of abuse or misuse of science in fields with ideological, religious, or economically controversial policy implications, such as planetary physics during the time of Galileo, the evolution debate, or climate change. Common to these controversies are what are known as "logical fallacies" -- patterns of reasoning that are always -- or at least commonly -- wrong due to a flaw in the structure of the argument that renders the argument invalid. All scientists should understand the nature of logical fallacies in order to (1) avoid making mistakes and reaching unsupported conclusions, (2) help them understand and refute the flaws in arguments made by others, and (3) aid in communicating science to the public. This talk will present a series of logical fallacies often made in the climate science debate, including "arguments from ignorance," "arguments from error," "arguments from misinterpretation," and "cherry picking." Specific examples will be presented in the areas of temperature analysis, water resources, and ice dynamics, with a focus on selective use or misuse of data; "Argument from Error" is presented as an amusing example of a logical fallacy.

  7. Levels of conflict in reasoning modulate right lateral prefrontal cortex.

    PubMed

    Stollstorff, Melanie; Vartanian, Oshin; Goel, Vinod

    2012-01-05

    Right lateral prefrontal cortex (rlPFC) has previously been implicated in logical reasoning under conditions of conflict. A functional magnetic resonance imaging (fMRI) study was conducted to explore its role in conflict more precisely. Specifically, we distinguished between belief-logic conflict and belief-content conflict, and examined the role of rlPFC under each condition. The results demonstrated that a specific region of rlPFC is consistently activated under both types of conflict. Moreover, the results of a parametric analysis demonstrated that the same region was modulated by the level of conflict contained in reasoning arguments. This supports the idea that this specific region is engaged to resolve conflict, including during deductive reasoning. This article is part of a Special Issue entitled "The Cognitive Neuroscience of Thought". Copyright © 2011 Elsevier B.V. All rights reserved.

  8. LogiKit - assisting complex logic specification and implementation for embedded control systems

    NASA Astrophysics Data System (ADS)

    Diglio, A.; Nicolodi, B.

    2002-07-01

    LogiKit provides an overall lifecycle solution. It is a powerful software-engineering CASE toolkit for requirements specification, simulation, and documentation, and it also provides an automatic Ada software design, code, and unit-test generator.

  9. Parallel and Multivalued Logic by the Two-Dimensional Photon-Echo Response of a Rhodamine–DNA Complex

    PubMed Central

    2015-01-01

    Implementing parallel and multivalued logic operations at the molecular scale has the potential to improve the miniaturization and efficiency of a new generation of nanoscale computing devices. Two-dimensional photon-echo spectroscopy is capable of resolving dynamical pathways on electronic and vibrational molecular states. We experimentally demonstrate the implementation of molecular decision trees, logic operations where all possible values of inputs are processed in parallel and the outputs are read simultaneously, by probing the laser-induced dynamics of populations and coherences in a rhodamine dye mounted on a short DNA duplex. The inputs are provided by the bilinear interactions between the molecule and the laser pulses, and the output values are read from the two-dimensional molecular response at specific frequencies. Our results highlight how ultrafast dynamics between multiple molecular states induced by light–matter interactions can be used as an advantage for performing complex logic operations in parallel, operations that are faster than electrical switching. PMID:25984269

  10. Towards Resilient Critical Infrastructures: Application of Type-2 Fuzzy Logic in Embedded Network Security Cyber Sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ondrej Linda; Todd Vollmer; Jim Alves-Foss

    2011-08-01

    Resiliency and cyber security of modern critical infrastructures is becoming increasingly important with the growing number of threats in the cyber-environment. This paper proposes an extension to a previously developed fuzzy logic based anomaly detection network security cyber sensor via incorporating Type-2 Fuzzy Logic (T2 FL). In general, fuzzy logic provides a framework for system modeling in linguistic form capable of coping with imprecise and vague meanings of words. T2 FL is an extension of Type-1 FL which proved to be successful in modeling and minimizing the effects of various kinds of dynamic uncertainties. In this paper, T2 FL provides a basis for robust anomaly detection and cyber security state awareness. In addition, the proposed algorithm was specifically developed to comply with the constrained computational requirements of low-cost embedded network security cyber sensors. The performance of the system was evaluated on a set of network data recorded from an experimental cyber-security test-bed.

  11. Fluorescence intensity positivity classification of Hep-2 cells images using fuzzy logic

    NASA Astrophysics Data System (ADS)

    Sazali, Dayang Farzana Abang; Janier, Josefina Barnachea; May, Zazilah Bt.

    2014-10-01

    Indirect Immunofluorescence (IIF) is the accepted standard for antinuclear autoantibody (ANA) testing using HEp-2 cells to determine specific diseases. Various classifier algorithms have been proposed in previous works; however, there is still no validated standard for classifying fluorescence intensity. This paper presents the use of fuzzy logic to classify fluorescence intensity and to determine the positivity of HEp-2 cell serum samples. The algorithm involves image pre-processing by filtering noise and smoothing the image, converting the red, green, and blue (RGB) color space of images to the LAB color space (a luminosity layer and chromaticity layers "a" and "b"), extracting the mean values of the lightness and chromaticity "a" layers, and classifying them with a fuzzy logic algorithm based on the standard score ranges of ANA fluorescence intensity. On 100 data sets of positive and intermediate fluorescence intensity used for performance measurement, the fuzzy logic classifier obtained accuracies of 85% and 87% for the intermediate and positive classes, respectively.
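
    The final classification step can be sketched with simple triangular membership functions (the membership shapes, thresholds, and class layout below are hypothetical illustrations, not the paper's calibrated values):

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_intensity(mean_lightness):
    """Hypothetical fuzzy classifier over a normalised [0, 1] lightness value.
    The class with the larger membership degree wins."""
    memberships = {
        "intermediate": tri(mean_lightness, 0.1, 0.35, 0.6),
        "positive": tri(mean_lightness, 0.4, 0.75, 1.01),
    }
    return max(memberships, key=memberships.get), memberships

label, degrees = classify_intensity(0.8)  # a bright sample classifies as "positive"
```

    A real system would derive the membership breakpoints from the standard ANA score ranges rather than the invented constants used here.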

  12. Non Linear Programming (NLP) Formulation for Quantitative Modeling of Protein Signal Transduction Pathways

    PubMed Central

    Morris, Melody K.; Saez-Rodriguez, Julio; Lauffenburger, Douglas A.; Alexopoulos, Leonidas G.

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next and have led to the construction of models that simulate the cells response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models to cell specific data to result in quantitative pathway models of the specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) loosely constrained optimization problem due to lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell type specific pathways in normal and transformed hepatocytes using medium and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state of the art optimization algorithms. PMID:23226239

  13. Malleable architecture generator for FPGA computing

    NASA Astrophysics Data System (ADS)

    Gokhale, Maya; Kaba, James; Marks, Aaron; Kim, Jang

    1996-10-01

    The malleable architecture generator (MARGE) is a tool set that translates high-level parallel C to configuration bit streams for field-programmable logic based computing systems. MARGE creates an application-specific instruction set and generates the custom hardware components required to perform exactly those computations specified by the C program. In contrast to traditional fixed-instruction processors, MARGE's dynamic instruction set creation provides for efficient use of hardware resources. MARGE processes intermediate code in which each operation is annotated by the bit lengths of the operands. Each basic block (sequence of straight line code) is mapped into a single custom instruction which contains all the operations and logic inherent in the block. A synthesis phase maps the operations comprising the instructions into register transfer level structural components and control logic which have been optimized to exploit functional parallelism and function unit reuse. As a final stage, commercial technology-specific tools are used to generate configuration bit streams for the desired target hardware. Technology-specific pre-placed, pre-routed macro blocks are utilized to implement as much of the hardware as possible. MARGE currently supports the Xilinx-based Splash-2 reconfigurable accelerator and National Semiconductor's CLAy-based parallel accelerator, MAPA. The MARGE approach has been demonstrated on systolic applications such as DNA sequence comparison.

  14. Non Linear Programming (NLP) formulation for quantitative modeling of protein signal transduction pathways.

    PubMed

    Mitsos, Alexander; Melas, Ioannis N; Morris, Melody K; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Alexopoulos, Leonidas G

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next and have led to the construction of models that simulate the cells response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models to cell specific data to result in quantitative pathway models of the specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) loosely constrained optimization problem due to lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell type specific pathways in normal and transformed hepatocytes using medium and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state of the art optimization algorithms.
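
    The flavour of the optimization, fitting parameters of logic-style transfer functions to data, can be sketched with a direct search over a normalised Hill function (a toy stand-in for the paper's NLP formulation; the data and candidate grid are synthetic):

```python
def hill(x, k, n=2):
    """Normalised Hill transfer function, a common building block in
    logic-based pathway models (output in [0, 1))."""
    return x**n / (k**n + x**n)

def fit_k(xs, ys, candidates):
    """Least-squares fit of the activation threshold k by direct search,
    standing in for a nonlinear-programming solver."""
    def sse(k):
        return sum((hill(x, k) - y) ** 2 for x, y in zip(xs, ys))
    return min(candidates, key=sse)

# Synthetic dose-response data generated with k = 0.5, then recovered.
xs = [0.1, 0.25, 0.5, 1.0, 2.0]
ys = [hill(x, 0.5) for x in xs]
k_hat = fit_k(xs, ys, [i / 10 for i in range(1, 11)])
```

    In the paper's setting, every edge of the signaling network carries such a transfer function, and all parameters are fitted jointly against phosphoproteomic measurements, which is what makes a fast NLP formulation valuable.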

  15. A first hazard analysis of the Harrat Ash Shamah volcanic field, Syria-Jordan Borderline

    NASA Astrophysics Data System (ADS)

    Cagnan, Zehra; Akkar, Sinan; Moghimi, Saed

    2017-04-01

    The northernmost part of the Saudi Cenozoic Volcanic Fields, the 100,000 km2 Harrat Ash Shamah has hosted some of the most recent volcanic eruptions along the Syria-Jordan borderline. With the rapid growth of the cities in this region, exposure to any potential renewed volcanism has increased considerably. We present here a first-order probabilistic hazard analysis related to new vent formation and subsequent lava flow from Harrat Ash Shamah. The 733 visible eruption vent sites were utilized to develop a probability density function for new eruption sites using Gaussian kernel smoothing. This revealed a NNW-striking zone of high spatial hazard surrounding the cities of Amman and Irbid in Jordan. The temporal eruption recurrence rate is estimated to be approximately one vent per 3500 years, but the temporal record of the field is so poorly constrained that the lower and upper bounds for the recurrence interval are 70 yrs and 17,700 yrs, respectively. A Poisson temporal model is employed within the scope of this study. In order to treat the uncertainties associated with the spatio-temporal models as well as the size of the area affected by the lava flow, the logic tree approach is adopted. For the Syria-Jordan borderline, the spatial variation of volcanic hazard is computed, as well as the uncertainty associated with these estimates.
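
    Under the Poisson temporal model and the quoted recurrence rate, the probability of at least one new vent over a planning horizon follows directly (a sketch of the standard formula only, not the paper's full spatio-temporal hazard computation; the 100-year horizon is an invented example):

```python
import math

def poisson_prob_at_least_one(rate_per_year, horizon_years):
    """P(at least one event within the horizon) for a Poisson process
    with a constant annual recurrence rate."""
    return 1.0 - math.exp(-rate_per_year * horizon_years)

# Roughly one vent per 3500 years (the abstract's central estimate),
# evaluated over a hypothetical 100-year planning horizon.
p = poisson_prob_at_least_one(1 / 3500, 100)  # about 2.8%
```

    The logic-tree treatment in the paper would repeat this calculation for each branch (e.g. the 70-year and 17,700-year bounds) and weight the results.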

  16. Navigating the field of temporally framed care in the Danish home care sector.

    PubMed

    Tufte, Pernille; Dahl, Hanne Marlene

    2016-01-01

    The organisational and temporal framing of elderly care in Europe has changed in the wake of new public management reforms; with standardised care services, the strict measurement of time and work schedules have become central aspects of care work. The article investigates the crafting of care in this framing: how care workers approach the services specified in their rotas and navigate between needs, demands and opportunities in the daily performance of duties. Applying feminist theory on time and anthropological theory on social navigation, it examines the practice of home care work in two Danish municipalities. Data are derived predominantly from participant observation. The article identifies two overarching temporal dilemmas in different home care situations: one where process time prevails over clock time and another where the care workers balance the two. Focusing on how care workers respond to these dilemmas in practice, the article identifies various navigation tactics, including leaving time outside, individualised routinisation, working on different paths simultaneously and postponing tasks. By assessing care workers' performance in the temporal framing of work and focusing on care workers' mediation between different time logics, this study provides an in-depth perspective on the broader feminist literature on the dilemmas of care. © 2015 Foundation for the Sociology of Health & Illness.

  17. Engineering integrated digital circuits with allosteric ribozymes for scaling up molecular computation and diagnostics.

    PubMed

    Penchovsky, Robert

    2012-10-19

    Here we describe molecular implementations of integrated digital circuits, including a three-input AND logic gate, a two-input multiplexer, and a 1-to-2 decoder, using allosteric ribozymes. Furthermore, we demonstrate a multiplexer-decoder circuit. The ribozymes are designed to seek and destroy specific RNAs of a certain length by a fully computerized procedure. The algorithm can accurately predict one base substitution that alters the ribozyme's logic function. The ability to sense the length of RNA molecules enables single ribozymes to be used as platforms for multiple interactions. These ribozymes can work as integrated circuits with the functionality of up to five logic gates. The ribozyme design is universal since the allosteric and substrate domains can be altered to sense different RNAs. In addition, the ribozymes can specifically cleave RNA molecules with triplet-repeat expansions observed in genetic disorders such as oculopharyngeal muscular dystrophy. Therefore, the designer ribozymes can be employed for scaling up computing and diagnostic networks in the fields of molecular computing and diagnostics and RNA synthetic biology.

  18. The logic of selecting an appropriate map projection in a Decision Support System (DSS)

    USGS Publications Warehouse

    Finn, Michael P.; Usery, E. Lynn; Woodard, Laura N.; Yamamoto, Kristina H.

    2017-01-01

    There are undeniable practical consequences to consider when choosing an appropriate map projection for a specific region. Global, continental, and regional maps each cover such different portions of the globe that each type is affected differently by the distortions a projection transformation introduces in distance, direction, scale, and area. A Decision Support System (DSS) for Map Projections of Small Scale Data was previously developed to help select an appropriate projection. This paper reports on a tutorial to accompany that DSS. The DSS poses questions interactively, allowing the user to decide on the parameters, which in turn determine the logic path to a solution. The tutorial visually represents the path of logic taken to the recommended map projection derived from the parameters the user selects. It informs the DSS user about the pedigree of the projection and provides a basic explanation of the specific projection design. This information is provided by informational pop-ups and other aids.
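
    The interactive question-and-answer path can be imagined as a small decision function. The rules below are entirely hypothetical, loosely based on common cartographic guidance, and are not the DSS's actual rule base:

```python
def recommend_projection(extent, property_needed):
    """Toy decision path: pick a projection from the map extent and the
    property (equal-area, conformal, compromise) the user wants preserved."""
    if extent == "world":
        if property_needed == "equal-area":
            return "Mollweide"
        if property_needed == "compromise":
            return "Robinson"
        return "Mercator"                    # conformal world default
    if extent == "continental":
        if property_needed == "equal-area":
            return "Albers Equal Area"
        return "Lambert Conformal Conic"
    return "Transverse Mercator"             # regional default

choice = recommend_projection("continental", "equal-area")
```

    Each `if` corresponds to one interactive question in the DSS; tracing which branches fired is exactly the "path of logic" the tutorial visualizes.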

  19. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.

  20. Model Checking - My 27-Year Quest to Overcome the State Explosion Problem

    NASA Technical Reports Server (NTRS)

    Clarke, Ed

    2009-01-01

    Model Checking is an automatic verification technique for state-transition systems that are finite-state or that have finite-state abstractions. In the early 1980s, in a series of joint papers with my graduate students E.A. Emerson and A.P. Sistla, we proposed that Model Checking could be used for verifying concurrent systems and gave algorithms for this purpose. At roughly the same time, Joseph Sifakis and his student J.P. Queille at the University of Grenoble independently developed a similar technique. Model Checking has been used successfully to reason about computer hardware and communication protocols and is beginning to be used for verifying computer software. Specifications are written in temporal logic, which is particularly valuable for expressing concurrency properties. An intelligent, exhaustive search is used to determine if the specification is true or not. If the specification is not true, the Model Checker will produce a counterexample execution trace that shows why the specification does not hold. This feature is extremely useful for finding obscure errors in complex systems. The main disadvantage of Model Checking is the state-explosion problem, which can occur if the system under verification has many processes or complex data structures. Although the state-explosion problem is inevitable in the worst case, over the past 27 years considerable progress has been made on the problem for certain classes of state-transition systems that occur often in practice. In this talk, I will describe what Model Checking is, how it works, and the main techniques that have been developed for combating the state explosion problem.
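
    The exhaustive search with counterexample generation described above can be sketched as an explicit-state invariant check (a generic textbook illustration, not Clarke's algorithms; the toy model and its states are invented):

```python
from collections import deque

def check_invariant(init, next_states, invariant):
    """Breadth-first exhaustive search over a finite state space.
    Returns (True, None) if `invariant` holds in every reachable state,
    otherwise (False, trace) with a counterexample execution trace."""
    parent = {s: None for s in init}
    queue = deque(init)
    while queue:
        s = queue.popleft()
        if not invariant(s):
            trace = []               # reconstruct the path to the bad state
            while s is not None:
                trace.append(s)
                s = parent[s]
            return False, trace[::-1]
        for t in next_states(s):
            if t not in parent:      # visit each state only once
                parent[t] = s
                queue.append(t)
    return True, None

# Toy model: two processes, each idle(0) -> trying(1) -> critical(2),
# with no locking, so the mutual-exclusion invariant must fail.
def succ(state):
    for i in (0, 1):
        if state[i] < 2:
            yield tuple(state[j] + (1 if j == i else 0) for j in (0, 1))

ok, trace = check_invariant([(0, 0)], succ, lambda s: s != (2, 2))
```

    The returned trace is exactly the kind of counterexample execution the abstract highlights as useful for locating obscure errors.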

  1. Runtime Verification of C Programs

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2008-01-01

    We present in this paper a framework, RMOR, for monitoring the execution of C programs against state machines, expressed in a textual (nongraphical) format in files separate from the program. The state machine language has been inspired by a graphical state machine language, RCAT, recently developed at the Jet Propulsion Laboratory as an alternative to using Linear Temporal Logic (LTL) for requirements capture. Transitions between states are labeled with abstract event names and Boolean expressions over them. The abstract events are connected to code fragments using an aspect-oriented pointcut language similar to AspectJ's or AspectC's pointcut language. The system is implemented in the C analysis and transformation package CIL, and is programmed in OCaml, the implementation language of CIL. The work is closely related to the notion of stateful aspects within aspect-oriented programming, where pointcut languages are extended with temporal assertions over the execution trace.
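
    The monitoring idea can be sketched as a small event-driven state machine (a generic illustration of runtime monitoring, not RMOR's actual implementation; the states, events, and property are invented):

```python
class Monitor:
    """Minimal runtime monitor: a state machine driven by abstract event
    names, flagging a violation when it enters an error state."""
    def __init__(self, transitions, start, error_states):
        self.transitions = transitions    # (state, event) -> next state
        self.state = start
        self.error_states = error_states
        self.violated = False

    def emit(self, event):
        # Unmatched events leave the state unchanged (a self-loop).
        self.state = self.transitions.get((self.state, event), self.state)
        if self.state in self.error_states:
            self.violated = True

# Property: a resource must be acquired before it is released.
m = Monitor(
    transitions={("idle", "acquire"): "held",
                 ("held", "release"): "idle",
                 ("idle", "release"): "error"},
    start="idle",
    error_states={"error"},
)
for e in ["acquire", "release", "release"]:   # second release violates the property
    m.emit(e)
```

    In a framework like the one described, the `emit` calls would be woven into the target program at pointcut-selected code locations rather than called by hand.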

  2. Temporal Precedence Checking for Switched Models and its Application to a Parallel Landing Protocol

    NASA Technical Reports Server (NTRS)

    Duggirala, Parasara Sridhar; Wang, Le; Mitra, Sayan; Viswanathan, Mahesh; Munoz, Cesar A.

    2014-01-01

    This paper presents an algorithm for checking temporal precedence properties of nonlinear switched systems. This class of properties subsumes bounded safety and captures requirements about visiting a sequence of predicates within given time intervals. The algorithm handles nonlinear predicates that arise from dynamics-based predictions used in alerting protocols for state-of-the-art transportation systems. It is sound and complete for nonlinear switched systems that robustly satisfy the given property. The algorithm is implemented in the Compare Execute Check Engine (C2E2) using validated simulations. As a case study, a simplified model of an alerting system for closely spaced parallel runways is considered. The proposed approach is applied to this model to check safety properties of the alerting logic for different operating conditions such as initial velocities, bank angles, aircraft longitudinal separation, and runway separation.
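
    A temporal precedence property of this shape, visiting a sequence of predicates within given time windows, can be checked against a single sampled trace roughly as follows (a simplified sketch, not the C2E2 algorithm, which works on validated simulations rather than point samples; the descent scenario is invented):

```python
def check_precedence(trace, phases):
    """Check that a timed trace visits each predicate in order, with the
    first satisfying sample falling inside that phase's time window.
    trace: list of (time, state); phases: list of (predicate, t_lo, t_hi)."""
    i = 0
    for pred, t_lo, t_hi in phases:
        # Advance to the first sample satisfying this phase's predicate.
        while i < len(trace) and not pred(trace[i][1]):
            i += 1
        if i == len(trace):
            return False              # predicate never visited
        t = trace[i][0]
        if not (t_lo <= t <= t_hi):
            return False              # visited outside its window
    return True

# Toy descent: altitude must drop below 500 within [0, 5] s,
# then below 100 within [4, 10] s.
trace = [(0.0, 800), (2.0, 450), (6.0, 90), (8.0, 40)]
ok = check_precedence(trace, [(lambda a: a < 500, 0, 5),
                              (lambda a: a < 100, 4, 10)])
```

    A sound version for switched dynamics must additionally bound the behavior between samples, which is what validated simulation provides.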

  3. Support vector inductive logic programming outperforms the naive Bayes classifier and inductive logic programming for the classification of bioactive chemical compounds.

    PubMed

    Cannon, Edward O; Amini, Ata; Bender, Andreas; Sternberg, Michael J E; Muggleton, Stephen H; Glen, Robert C; Mitchell, John B O

    2007-05-01

    We investigate the classification performance of circular fingerprints in combination with the Naive Bayes Classifier (MP2D), Inductive Logic Programming (ILP) and Support Vector Inductive Logic Programming (SVILP) on a standard molecular benchmark dataset comprising 11 activity classes and about 102,000 structures. The Naive Bayes Classifier treats features independently, while ILP combines structural fragments to create new features with higher predictive power. SVILP is a very recently presented method which adds a support vector machine after common ILP procedures. The performance of the methods is evaluated via a number of statistical measures, namely recall, specificity, precision, F-measure, Matthews Correlation Coefficient, area under the Receiver Operating Characteristic (ROC) curve and enrichment factor (EF). According to the F-measure, which takes both recall and precision into account, SVILP is the superior method for seven of the 11 classes. The results show that the Bayes Classifier gives the best recall performance for eight of the 11 targets, but has a much lower precision, specificity and F-measure. The SVILP model on the other hand has the highest recall for only three of the 11 classes, but generally far superior specificity and precision. To evaluate the statistical significance of the SVILP superiority, we employ McNemar's test, which shows that SVILP performs significantly (p < 5%) better than both other methods for six out of 11 activity classes, while being superior with less significance for three of the remaining classes. While previously the Bayes Classifier was shown to perform very well in molecular classification studies, these results suggest that SVILP is able to extract additional knowledge from the data, thus improving classification results further.
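
    The statistical measures listed can all be computed from a binary confusion matrix with the standard formulas (a formula sketch only; the counts below are invented examples, not the paper's results):

```python
import math

def binary_metrics(tp, fp, tn, fn):
    """Recall, specificity, precision, F-measure, and Matthews Correlation
    Coefficient from binary confusion-matrix counts."""
    recall = tp / (tp + fn)
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    f_measure = 2 * precision * recall / (precision + recall)
    mcc = ((tp * tn - fp * fn) /
           math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
    return {"recall": recall, "specificity": specificity,
            "precision": precision, "F": f_measure, "MCC": mcc}

m = binary_metrics(tp=80, fp=10, tn=90, fn=20)
```

    The trade-off the abstract describes, Bayes winning on recall while SVILP wins on precision and specificity, is exactly what the F-measure (the harmonic mean of precision and recall) is designed to balance.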

  4. Four States, Four Projects, One Mission: Collectively Enhancing Mental and Behavioral Health Capacity Throughout the Gulf Coast.

    PubMed

    Langhinrichsen-Rohling, Jennifer; Osofsky, Howard; Osofsky, Joy; Rohrer, Glenn; Rehner, Timothy

    The 2010 Deepwater Horizon oil spill triggered numerous concerns regarding the health and well-being of citizens within the already vulnerable Gulf Coast region. Four Mental and Behavioral Health Capacity Projects (MBHCPs) united to form the Quad-State MBHCP component of the Gulf Region Health Outreach Program (GRHOP). Their shared mission was to increase mental and behavioral health (MBH) capacity within coastal counties of Louisiana, Mississippi, Alabama, and the Florida Panhandle. Our objectives are to describe strategies used to collectively enhance the impact of the 4 state-specific MBHCPs and to share lessons learned from a multistate collaborative flexibly designed to meet a shared mission. Archival materials were assessed. They included attendance sheets/notes from regularly scheduled group meetings, GRHOP quarterly and annual reports, and state-specific MBHCP logic models. Nationally available data on MBH services provided in project-relevant primary care sites were also examined. Three strategies were found to be effective facilitators of collective success: (i) reciprocal participation in the backbone organization (GRHOP); (ii) creation and comparison of state-specific MBHCP logic models and activities; and (iii) cross-fertilization among the MBHCP state-specific logic models, a unified Quad-State, and the GRHOP-wide logic model to generate additional synergistic endeavors and measurable outcomes. Examples of region-wide MBHCP success, such as uptake in integrated health services in health care clinics across the jurisdiction of investment, are presented. Isolated approaches to complex issues are, at times, ineffective. The Collective Impact (CI) model, with an emphasis on coordination among existing organizations, stakeholders, and the public, can serve as a guidepost to facilitate sustainable change even when used in a modified form.
Strategies discussed herein for maximizing the 5 prescribed CI conditions provide an important roadmap for how to interface among multidisciplinary projects seeking to address the same, large-scale public health problem.

  5. Generic algorithms for high performance scalable geocomputing

    NASA Astrophysics Data System (ADS)

    de Jong, Kor; Schmitz, Oliver; Karssenberg, Derek

    2016-04-01

    During the last decade, the characteristics of computing hardware have changed a lot. For example, instead of a single general purpose CPU core, personal computers nowadays contain multiple cores per CPU and often general purpose accelerators, like GPUs. Additionally, compute nodes are often grouped together to form clusters or a supercomputer, providing enormous amounts of compute power. For existing earth simulation models to be able to use modern hardware platforms, their compute intensive parts must be rewritten. This can be a major undertaking and may involve many technical challenges. Compute tasks must be distributed over CPU cores, offloaded to hardware accelerators, or distributed to different compute nodes. And ideally, all of this should be done in such a way that the compute task scales well with the hardware resources. This presents two challenges: 1) how to make good use of all the compute resources and 2) how to make these compute resources available for developers of simulation models, who may not (want to) have the required technical background for distributing compute tasks. The first challenge requires the use of specialized technology (e.g.: threads, OpenMP, MPI, OpenCL, CUDA). The second challenge requires the abstraction of the logic handling the distribution of compute tasks from the model-specific logic, hiding the technical details from the model developer. To assist the model developer, we are developing a C++ software library (called Fern) containing algorithms that can use all CPU cores available in a single compute node (distributing tasks over multiple compute nodes will be done at a later stage). The algorithms are grid-based (finite difference) and include local and spatial operations such as convolution filters. The algorithms handle distribution of the compute tasks to CPU cores internally. 
    In the resulting model, the low-level details of how this is done are separated from the model-specific logic representing the modeled system. This contrasts with practices in which code for distributing compute tasks is mixed with model-specific code, and it results in a more maintainable model. For flexibility and efficiency, the algorithms are configurable at compile-time with respect to the following aspects: data type, value type, no-data handling, input value domain handling, and output value range handling. This makes the algorithms usable in very different contexts, without the need for making intrusive changes to existing models when using them. Applications that benefit from using the Fern library include the construction of forward simulation models in (global) hydrology (e.g. PCR-GLOBWB (Van Beek et al. 2011)), ecology, geomorphology, or land use change (e.g. PLUC (Verstegen et al. 2014)) and manipulation of hyper-resolution land surface data such as digital elevation models and remote sensing data. Using the Fern library, we have also created an add-on to the PCRaster Python Framework (Karssenberg et al. 2010) allowing its users to speed up their spatio-temporal models, sometimes by changing just a single line of Python code in their model. In our presentation we will give an overview of the design of the algorithms, providing examples of different contexts where they can be used to replace existing sequential algorithms, including the PCRaster environmental modeling software (www.pcraster.eu). We will show how the algorithms can be configured to behave differently when necessary. References Karssenberg, D., Schmitz, O., Salamon, P., De Jong, K. and Bierkens, M.F.P., 2010, A software framework for construction of process-based stochastic spatio-temporal models and data assimilation. Environmental Modelling & Software, 25, pp. 489-502. Best Paper Award 2010: Software and Decision Support. Van Beek, L. P. H., Y. Wada, and M. F. P. Bierkens.
2011. Global monthly water stress: 1. Water balance and water availability. Water Resources Research. 47. Verstegen, J. A., D. Karssenberg, F. van der Hilst, and A. P. C. Faaij. 2014. Identifying a land use change cellular automaton by Bayesian data assimilation. Environmental Modelling & Software 53:121-136.
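
    The separation the abstract argues for, distribution logic inside the library and model-specific logic outside it, can be sketched as follows (a minimal Python illustration; Fern itself is a C++ library and its real algorithms, no-data handling, and scheduling are far richer):

```python
from concurrent.futures import ThreadPoolExecutor

def map_grid(op, grid, workers=4):
    """Apply a local (cell-wise) operation to a 2-D grid, distributing rows
    over a pool of worker threads.  The model developer supplies only `op`;
    the distribution logic stays hidden inside this function."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda row: [op(v) for v in row], grid))

# Model-specific logic: a hypothetical no-data-aware scaling operation.
NODATA = -9999
scale = lambda v: v if v == NODATA else v * 0.75

result = map_grid(scale, [[4.0, NODATA], [8.0, 2.0]])
```

    Swapping `map_grid` for a sequential loop, or for an MPI- or GPU-backed variant, would not touch the model-specific `scale` function, which is the maintainability argument the abstract makes.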

  6. Spatial Data Transfer Standard (SDTS), part 1 : logical specifications

    DOT National Transportation Integrated Search

    1997-11-20

    This document contains a specification of the Spatial Data Transfer Standard (SDTS), that will serve as a national spatial data transfer mechanism for the United States. As such it is designed to transfer a wide variety of data structures that are us...

  7. Rigorous Science: a How-To Guide

    PubMed Central

    Fang, Ferric C.

    2016-01-01

    ABSTRACT Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. PMID:27834205

  8. Rigorous Science: a How-To Guide.

    PubMed

    Casadevall, Arturo; Fang, Ferric C

    2016-11-08

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. Copyright © 2016 Casadevall and Fang.

  9. An Amphibious Ship-To-Shore Simulation for Use on an IBM PC (Personal Computer)

    DTIC Science & Technology

    1984-09-01

    [OCR-garbled abstract; only fragments are recoverable.] … research, for instance, will be geared toward a technically oriented person who is familiar with computers, programming and the associated logic. A … problem, often vaguely stated by the decision maker, into precise and operational terms [Ref. …, p. 51]. The analysis begins with specification of the …

  10. A Novel Triggerless Approach for Modeling Mass Wasting Susceptibility

    NASA Astrophysics Data System (ADS)

    Aly, M. H.; Rowden, K. W.

    2017-12-01

    Common approaches for modeling mass wasting susceptibility rely on using triggers, which are catalysts for failure, as critical inputs. Frequently used triggers include removal of the toe of a slope or vegetation and time correlated events such as seismicity or heavy precipitation. When temporal data are unavailable, correlating triggers with a particular mass wasting event (MWE) is futile. Meanwhile, geologic structures directly influence slope stability and are typically avoided in alternative modeling approaches. Depending on strata's dip direction, underlying geology can make a slope either stronger or weaker. To heuristically understand susceptibility and reliably infer risk, without being constrained by the previously mentioned limitations, a novel triggerless approach is conceived in this study. Core requisites include a digital elevation model and digitized geologic maps containing geologic formations delineated as polygons encompassing adequate distribution of structural attitudes. Tolerably simple geology composed of gently deformed, relatively flat-lying Carboniferous strata with minimal faulting or monoclines, ideal for applying this new triggerless approach, is found in the Boston Mountains, NW Arkansas, where 47 MWEs are documented. Two models are then created; one model has integrated Empirical Bayesian Kriging (EBK) and fuzzy logic, while the second model has employed a standard implementation of a weighted overlay. Statistical comparisons show that the first model has identified 83%, compared to only 28% for the latter model, of the failure events in categories ranging from moderate to very high susceptibility. These results demonstrate that the introduced triggerless approach is efficiently capable of modeling mass wasting susceptibility, by incorporating EBK and fuzzy logic, in areas lacking temporal datasets.

  11. Improving Cognitive Abilities and e-Inclusion in Children with Cerebral Palsy

    NASA Astrophysics Data System (ADS)

    Martinengo, Chiara; Curatelli, Francesco

    Besides overcoming the motor barriers to accessing computers and the Internet, ICT tools can provide very useful, and often necessary, support for the cognitive development of motor-impaired children with cerebral palsy. In fact, software tools for computation and communication allow teachers to put into effect, in a more complete and efficient way, the learning methods and the educational plans designed for the child. In the present article, after a brief analysis of the general objectives to be pursued to support learning in children with cerebral palsy, we consider some specific difficulties in the logical-linguistic and logical-mathematical fields, and we show how they can be overcome using general ICT tools and specifically implemented software programs.

  12. Floral Morphogenesis: Stochastic Explorations of a Gene Network Epigenetic Landscape

    PubMed Central

    Aldana, Maximino; Benítez, Mariana; Cortes-Poza, Yuriria; Espinosa-Soto, Carlos; Hartasánchez, Diego A.; Lotto, R. Beau; Malkin, David; Escalera Santos, Gerardo J.; Padilla-Longoria, Pablo

    2008-01-01

    In contrast to the classical view of development as a preprogrammed and deterministic process, recent studies have demonstrated that stochastic perturbations of highly non-linear systems may underlie the emergence and stability of biological patterns. Herein, we address the question of whether noise contributes to the generation of the stereotypical temporal pattern of gene expression during flower development. We modeled the regulatory network of organ identity genes in the Arabidopsis thaliana flower as a stochastic system. This network has previously been shown to converge to ten fixed-point attractors, each with gene expression arrays that characterize inflorescence cells and primordial cells of sepals, petals, stamens, and carpels. The network used is binary, and the logical rules that govern its dynamics are grounded in experimental evidence. We introduced different levels of uncertainty into the updating rules of the network. Interestingly, for noise levels of around 0.5–10%, the system exhibited a sequence of transitions among attractors that mimics the sequence of gene activation configurations observed in real flowers. We also implemented the gene regulatory network as a continuous system using the Glass model of differential equations, which can be considered a first approximation of kinetic-reaction equations but is not necessarily equivalent to the Boolean model. Interestingly, the Glass dynamics recover a temporal sequence of attractors that is qualitatively similar, although not identical, to that obtained using the Boolean model. Thus, time ordering in the emergence of cell-fate patterns is not an artifact of synchronous updating in the Boolean model. Therefore, our model provides a novel explanation for the emergence and robustness of the ubiquitous temporal pattern of floral organ specification. It also constitutes a new approach to understanding morphogenesis, providing predictions on the population dynamics of cells with different genetic configurations during development. PMID:18978941
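
    The stochastic Boolean updating scheme described in this record can be illustrated with a minimal sketch. The three-gene network, its rules, and the noise level below are invented placeholders, not the published Arabidopsis network; the only idea carried over is flipping each synchronously computed output with a small probability.

```python
import random

# Hypothetical 3-gene Boolean network; the real Arabidopsis model has more
# genes and experimentally grounded logical rules.
RULES = {
    "A": lambda s: s["B"] and not s["C"],
    "B": lambda s: s["A"] or s["C"],
    "C": lambda s: not s["A"],
}

def noisy_update(state, noise=0.05, rng=random):
    """Synchronously update all genes, flipping each computed output with
    probability `noise` (the paper explores levels of roughly 0.5-10%)."""
    new_state = {}
    for gene, rule in RULES.items():
        value = rule(state)
        if rng.random() < noise:
            value = not value  # stochastic perturbation of the logical rule
        new_state[gene] = value
    return new_state

# A short noisy trajectory; with noise > 0 the system can hop between
# attractors instead of staying in one fixed point.
state = {"A": True, "B": False, "C": True}
for _ in range(10):
    state = noisy_update(state, noise=0.05)
```

    With `noise=0.0` the update is the ordinary deterministic synchronous rule; the noise term is what allows transitions among attractors.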

  13. Dynamic Network-Based Epistasis Analysis: Boolean Examples

    PubMed Central

    Azpeitia, Eugenio; Benítez, Mariana; Padilla-Longoria, Pablo; Espinosa-Soto, Carlos; Alvarez-Buylla, Elena R.

    2011-01-01

    In this article we focus on how the hierarchical and single-path assumptions of epistasis analysis can bias the inference of gene regulatory networks. Here we emphasize the critical importance of dynamic analyses, and specifically illustrate the use of Boolean network models. Epistasis in a broad sense refers to gene interactions; however, as originally proposed by Bateson, epistasis is defined as the blocking of a particular allelic effect due to the effect of another allele at a different locus (herein, classical epistasis). Classical epistasis analysis has proven powerful and useful, allowing researchers to infer and assign directionality to gene interactions. As larger data sets become available, the analysis of classical epistasis is being complemented with computer science tools and systems biology approaches. We show that when the hierarchical and single-path assumptions are not met in classical epistasis analysis, access to relevant information and the correct inference of gene interaction topologies are hindered, and it becomes necessary to consider the temporal dynamics of gene interactions. The use of dynamical networks can overcome these limitations. We particularly focus on the use of Boolean networks that, like classical epistasis analysis, rely on logical formalisms, and hence can complement classical epistasis analysis and relax its assumptions. We develop a couple of theoretical examples and analyze them from a dynamic Boolean network model perspective. Boolean networks could help to guide additional experiments and discern among alternative regulatory schemes that would be impossible or difficult to infer without the elimination of these assumptions from classical epistasis analysis. We also use examples from the literature to show how a Boolean network-based approach has resolved ambiguities and guided epistasis analysis. Our article complements previous accounts not only by focusing on the implications of the hierarchical and single-path assumptions, but also by demonstrating the importance of considering temporal dynamics, specifically introducing the usefulness of Boolean network models and reviewing some key properties of network approaches. PMID:22645556

  14. Computation by symmetry operations in a structured model of the brain: Recognition of rotational invariance and time reversal

    NASA Astrophysics Data System (ADS)

    McGrann, John V.; Shaw, Gordon L.; Shenoy, Krishna V.; Leng, Xiaodan; Mathews, Robert B.

    1994-06-01

    Symmetries have long been recognized as a vital component of physical and biological systems. What we propose here is that symmetry operations are an important feature of higher brain function and result from the spatial and temporal modularity of the cortex. These symmetry operations arise naturally in the trion model of the cortex. The trion model is a highly structured mathematical realization of the Mountcastle organizational principle [Mountcastle, in The Mindful Brain (MIT, Cambridge, 1978)] in which the cortical column is the basic neural network of the cortex and is comprised of subunit minicolumns, which are idealized as trions with three levels of firing. A columnar network of a small number of trions has a large repertoire of quasistable, periodic spatial-temporal firing magic patterns (MP's), which can be excited. The MP's are related by specific symmetries: spatial rotation, parity, "spin" reversal, and time reversal, as well as other "global" symmetry operations in this abstract internal language of the brain. These MP's can be readily enhanced (as well as inherent categories of MP's) by only a small change in connection strengths via a Hebb learning rule. Learning introduces small breaking of the symmetries in the connectivities, which enables a symmetry in the patterns to be recognized in the Monte Carlo evolution of the MP's. Examples of the recognition of rotational invariance and of a time-reversed pattern are presented. We propose the possibility of building a logic device from the hardware implementation of a higher level architecture of trion cortical columns.

  15. The importance of context in logic model construction for a multi-site community-based Aboriginal driver licensing program.

    PubMed

    Cullen, Patricia; Clapham, Kathleen; Byrne, Jake; Hunter, Kate; Senserrick, Teresa; Keay, Lisa; Ivers, Rebecca

    2016-08-01

    Evidence indicates that Aboriginal people are underrepresented among driver licence holders in New South Wales, which has been attributed to licensing barriers for Aboriginal people. The Driving Change program was developed to provide culturally responsive licensing services that engage Aboriginal communities and build local capacity. This paper outlines the formative evaluation of the program, including logic model construction and exploration of contextual factors. Purposive sampling was used to identify key informants (n=12) from a consultative committee of key stakeholders and program staff. Semi-structured interviews were transcribed and thematically analysed. Data from the interviews informed development of the logic model. Participants demonstrated a high level of support for the program and reported that it filled an important gap. The program context revealed systemic barriers to licensing that were correspondingly targeted by specific program outputs in the logic model. Addressing the underlying assumptions of the program involved managing local capacity and support to strengthen implementation. This formative evaluation highlights the importance of exploring program context as a crucial first step in logic model construction. The consultation process assisted in clarifying program goals and ensuring that the program was responding to underlying systemic factors that contribute to inequitable licensing access for Aboriginal people. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Implementing biological logic gates using gold nanoparticles conjugated to fluorophores

    NASA Astrophysics Data System (ADS)

    Barnoy, Eran A.; Popovtzer, Rachela; Fixler, Dror

    2018-02-01

    We describe recent research in which we explored biologically relevant logic gates using gold nanoparticles (GNPs) conjugated to fluorophores and tracing the results remotely by time-domain fluorescence lifetime imaging microscopy (FLIM). GNPs have a well-known effect on nearby fluorophores in terms of their fluorescence intensity (FI - increase or decrease) as well as fluorescence lifetime (FLT). We have designed a few bio-switch systems in which the FLIM-detected fluorescence varies after biologically relevant stimulation. Some of our tools include fluorescein diacetate (FDA), which can be activated by either esterases or pH, peptide chains cleavable by caspase 3, and the polymer polyacrylic acid, which varies in size based on surrounding pH. After conjugating GNPs to chosen fluorophores, we have successfully demonstrated the logic gates NOT, AND, OR, NAND, NOR, and XOR by imaging different stages of activation. These logic gates have been demonstrated both in solutions and within cultured cells, thereby possibly opening the door for nanoparticulate in vivo smart detection. While these initial probes are mainly tools for intelligent detection systems, they lay the foundation for logic gates functioning in conjunction so as to lead to a form of in vivo biological computing, where the system would be able to release proper treatment options in specific situations without external influence.

  17. The institutional logic of integrated care: an ethnography of patient transitions.

    PubMed

    Shaw, James A; Kontos, Pia; Martin, Wendy; Victor, Christina

    2017-03-20

    Purpose The purpose of this paper is to use theories of institutional logics and institutional entrepreneurship to examine how and why macro-, meso-, and micro-level influences inter-relate in the implementation of integrated transitional care out of hospital in the English National Health Service. Design/methodology/approach The authors conducted an ethnographic case study of a hospital and surrounding services within a large urban centre in England. Specific methods included qualitative interviews with patients/caregivers, health/social care providers, and organizational leaders; observations of hospital transition planning meetings, community "hub" meetings, and other instances of transition planning; reviews of patient records; and analysis of key policy documents. Analysis was iterative and informed by theory on institutional logics and institutional entrepreneurship. Findings Organizational leaders at the meso-level of health and social care promoted a partnership logic of integrated care in response to conflicting institutional ideas found within a key macro-level policy enacted in 2003 (The Community Care (Delayed Discharges) Act). Through institutional entrepreneurship at the micro-level, the partnership logic became manifest in the form of relationship work among health and social care providers; they sought to build strong interpersonal relationships to enact more integrated transitional care. Originality/value This study has three key implications. First, efforts to promote integrated care should strategically include institutional entrepreneurs at the organizational and clinical levels. Second, integrated care initiatives should emphasize relationship-building among health and social care providers. Finally, theoretical development on institutional logics should further examine the role of interpersonal relationships in facilitating the "spread" of logics between macro-, meso-, and micro-level influences on inter-organizational change.

  18. Health literacy and logical inconsistencies in valuations of hypothetical health states: results from the Canadian EQ-5D-5L valuation study.

    PubMed

    Al Sayah, Fatima; Johnson, Jeffrey A; Ohinmaa, Arto; Xie, Feng; Bansback, Nick

    2017-06-01

    To examine the association of health literacy with logical inconsistencies in time trade-off valuations of hypothetical health states described by the EQ-5D-5L classification system. Data from the EQ-5D-5L Canadian Valuation study were used. Health literacy was assessed using the Brief Health Literacy Screen. A health state valuation was considered logically inconsistent if a respondent gave the same or lower value for a very mild health state compared to the value given to 55555, or gave the same or lower value for a very mild health state compared to the values assigned to the majority of the health states that are dominated by the very mild health state. The average age of respondents (N = 1209) was 48 (SD = 17) years, 45% were male, 7% reported inadequate health literacy, and 11% had a logical inconsistency. In adjusted analysis, participants with inadequate health literacy were 2.2 (95%CI: 1.2, 4.0; p = 0.014) times more likely to provide an inconsistent valuation compared to those with adequate health literacy. More specifically, those who had problems in "understanding written information" and "reading health information" were more likely to have a logical inconsistency compared to those who did not. However, lacking "confidence in completing medical forms" was not associated with logical inconsistencies. Health literacy was associated with logical inconsistencies in valuations of hypothetical health states described by the EQ-5D-5L classification system. Valuation studies should consider assessing health literacy, and explore better ways to introduce the valuation tasks or use simpler approaches of health preference elicitation for individuals with inadequate health literacy.
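
    The inconsistency criterion described in this record can be sketched as a simple check. This is an illustration under assumptions: health states are keyed by their five-digit EQ-5D-5L labels, a lower digit on each dimension is better, and the function names and the choice of "21111" as the very mild state are invented for the example.

```python
def dominates(state_a, state_b):
    """True if state_a is at least as good as state_b on every EQ-5D-5L
    dimension and strictly better on at least one (lower digit = better)."""
    a, b = [int(d) for d in state_a], [int(d) for d in state_b]
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def is_logically_inconsistent(values, mild_state="21111"):
    """Flag a respondent whose value for a very mild state is the same or
    lower than the value for 55555, or than the values of the majority of
    states dominated by the mild state."""
    mild_value = values[mild_state]
    if mild_value <= values["55555"]:
        return True
    dominated = [s for s in values if s != mild_state and dominates(mild_state, s)]
    worse_or_equal = [s for s in dominated if mild_value <= values[s]]
    return len(dominated) > 0 and len(worse_or_equal) > len(dominated) / 2
```

    A respondent valuing the mild state at 0.9 and 55555 at -0.2 passes the check; one valuing the mild state below 55555 is flagged.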

  19. Enhancing genomic laboratory reports from the patients' view: A qualitative analysis.

    PubMed

    Stuckey, Heather; Williams, Janet L; Fan, Audrey L; Rahm, Alanna Kulchak; Green, Jamie; Feldman, Lynn; Bonhag, Michele; Zallen, Doris T; Segal, Michael M; Williams, Marc S

    2015-10-01

    The purpose of this study was to develop a family genomic laboratory report designed to communicate genome sequencing results to parents of children who were participating in a whole genome sequencing clinical research study. Semi-structured interviews were conducted with parents of children who participated in a whole genome sequencing clinical research study to address the elements, language and format of a sample family-directed genome laboratory report. The qualitative interviews were followed by two focus groups aimed at evaluating example presentations of information about prognosis and next steps related to the whole genome sequencing result. Three themes emerged from the qualitative data: (i) Parents described a continual search for valid information and resources regarding their child's condition, a need that prior reports did not meet for parents; (ii) Parents believed that the Family Report would help facilitate communication with physicians and family members; and (iii) Parents identified specific items they appreciated in a genomics Family Report: simplicity of language, logical flow, visual appeal, information on what to expect in the future and recommended next steps. Parents affirmed their desire for a family genomic results report designed for their use and reference. They articulated the need for clear, easy to understand language that provided information with temporal detail and specific recommendations regarding relevant findings consistent with that available to clinicians. © 2015 Wiley Periodicals, Inc.

  20. Enhancing genomic laboratory reports from the patients' view: A qualitative analysis

    PubMed Central

    Stuckey, Heather; Fan, Audrey L.; Rahm, Alanna Kulchak; Green, Jamie; Feldman, Lynn; Bonhag, Michele; Zallen, Doris T.; Segal, Michael M.; Williams, Marc S.

    2015-01-01

    The purpose of this study was to develop a family genomic laboratory report designed to communicate genome sequencing results to parents of children who were participating in a whole genome sequencing clinical research study. Semi‐structured interviews were conducted with parents of children who participated in a whole genome sequencing clinical research study to address the elements, language and format of a sample family‐directed genome laboratory report. The qualitative interviews were followed by two focus groups aimed at evaluating example presentations of information about prognosis and next steps related to the whole genome sequencing result. Three themes emerged from the qualitative data: (i) Parents described a continual search for valid information and resources regarding their child's condition, a need that prior reports did not meet for parents; (ii) Parents believed that the Family Report would help facilitate communication with physicians and family members; and (iii) Parents identified specific items they appreciated in a genomics Family Report: simplicity of language, logical flow, visual appeal, information on what to expect in the future and recommended next steps. Parents affirmed their desire for a family genomic results report designed for their use and reference. They articulated the need for clear, easy to understand language that provided information with temporal detail and specific recommendations regarding relevant findings consistent with that available to clinicians. PMID:26086630

  1. A solution to the biodiversity paradox by logical deterministic cellular automata.

    PubMed

    Kalmykov, Lev V; Kalmykov, Vyacheslav L

    2015-06-01

    The paradox of biological diversity is a key problem of theoretical ecology. The paradox consists in the contradiction between the competitive exclusion principle and the observed biodiversity. The principle is important as a basis for ecological theory. Using a relatively simple model, we show a mechanism of indefinite coexistence of complete competitors that violates the known formulations of the competitive exclusion principle. This mechanism is based on timely recovery of limiting resources and their spatio-temporal allocation between competitors. Because of the limitations of black-box modeling, it has been difficult to formulate the exclusion principle correctly. Our white-box multiscale model of two-species competition is based on logical deterministic individual-based cellular automata. This approach provides automatic deductive inference on the basis of a system of axioms and gives direct insight into the mechanisms of the studied system. It is one of the most promising methods of artificial intelligence. We reformulate and generalize the competitive exclusion principle and explain why this formulation provides a solution to the biodiversity paradox. In addition, we propose a principle of competitive coexistence.
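
    A logical deterministic cellular automaton of the general kind described can be sketched minimally. This is not the authors' model: the one-dimensional ring, the one-generation lifespan, the tie-breaking rule, and the assumption that a cell's resource recovers within one generation are all invented for illustration. The sketch only shows how purely deterministic local rules can let two complete competitors persist together.

```python
def step(grid):
    """One synchronous, deterministic update of a ring of cells.
    0 = free cell with recovered resource; 1/2 = occupied by species 1/2."""
    n = len(grid)
    new = [0] * n
    for i, cell in enumerate(grid):
        left, right = grid[(i - 1) % n], grid[(i + 1) % n]
        if cell == 0:
            # A free cell is colonized only when exactly one species borders
            # it; ties stay free, allocating the resource spatially.
            neighbors = {left, right} & {1, 2}
            if len(neighbors) == 1:
                new[i] = neighbors.pop()
        # Occupied cells die after one generation; the resource in that
        # cell is assumed to recover in time for the next colonization.
    return new

grid = [1, 0, 0, 2, 0, 0]
for _ in range(4):
    grid = step(grid)
# On this ring the dynamics are period-2: both species persist indefinitely.
```

    Because every rule is deterministic, the whole trajectory follows mechanically from the axioms of the model, which is the "white-box" property the abstract emphasizes.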

  2. Spike processing with a graphene excitable laser

    PubMed Central

    Shastri, Bhavin J.; Nahmias, Mitchell A.; Tait, Alexander N.; Rodriguez, Alejandro W.; Wu, Ben; Prucnal, Paul R.

    2016-01-01

    Novel materials and devices in photonics have the potential to revolutionize optical information processing, beyond conventional binary-logic approaches. Laser systems offer a rich repertoire of useful dynamical behaviors, including the excitable dynamics also found in the time-resolved “spiking” of neurons. Spiking reconciles the expressiveness and efficiency of analog processing with the robustness and scalability of digital processing. We demonstrate a unified platform for spike processing with a graphene-coupled laser system. We show that this platform can simultaneously exhibit logic-level restoration, cascadability and input-output isolation—fundamental challenges in optical information processing. We also implement low-level spike-processing tasks that are critical for higher level processing: temporal pattern detection and stable recurrent memory. We study these properties in the context of a fiber laser system and also propose and simulate an analogous integrated device. The addition of graphene leads to a number of advantages which stem from its unique properties, including high absorption and fast carrier relaxation. These could lead to significant speed and efficiency improvements in unconventional laser processing devices, and ongoing research on graphene microfabrication promises compatibility with integrated laser platforms. PMID:26753897

  3. [Productive restructuring and the reallocation of work and employment: a survey of the "new" forms of social inequality].

    PubMed

    Marques, Ana Paula Pereira

    2013-06-01

    The scope of this paper is to question the inevitability of the processes of segmentation and increased precariousness of the relations of labor and employment, which are responsible for the introduction of "new" forms of social inequality that underpin the current model of development of economies and societies. It seeks to criticize the limits of global financial and economic logic, which constitute a "new spirit of capitalism," namely a kind of reverence for the natural order of things. It is therefore necessary to conduct an analytical survey of the ongoing changes in the labor market, accompanied by epistemological vigilance which makes it possible to see neoliberal (di)visions and dominant techno-deterministic theses in context. The enunciation of scenarios on the future of work will conclude this survey and will make it possible to draw attention to both the historical and temporal constraints and to the urgent need to unveil what is ideological and political in the prevailing logic of rationalization and processes to reinstate work and employment as a "central social experience" in contemporary times.

  4. Scripting Module for the Satellite Orbit Analysis Program (SOAP)

    NASA Technical Reports Server (NTRS)

    Carnright, Robert; Paget, Jim; Coggi, John; Stodden, David

    2008-01-01

    This add-on module to the SOAP software can perform changes to simulation objects based on the occurrence of specific conditions. This allows the software to encompass the simulation response of scheduled or physical events. Users can manipulate objects in the simulation environment under programmatic control. Inputs to the scripting module are Actions, Conditions, and the Script. Actions are arbitrary modifications to constructs such as Platform Objects (i.e., satellites), Sensor Objects (representing instruments or communication links), or Analysis Objects (user-defined logical or numeric variables). Examples of actions include changes to a satellite orbit (Δv), changing a sensor-pointing direction, and the manipulation of a numerical expression. Conditions represent the circumstances under which Actions are performed and can be couched in If-Then-Else logic, like performing a Δv at specific times or adding to the spacecraft power only when it is being illuminated by the Sun. The SOAP script represents the entire set of conditions being considered over a specific time interval. The output of the scripting module is a series of events, which are changes to objects at specific times. As the SOAP simulation clock runs forward, the scheduled events are performed. If the user sets the clock back in time, the events within that interval are automatically undone. This module offers an interface for defining scripts where the user does not have to remember the vocabulary of various keywords. Actions can be captured by employing the same user interface that is used to define the objects themselves. Conditions can be set to invoke Actions by selecting them from pull-down lists. Users define the script by selecting from the pool of defined conditions. Many space systems have to react to arbitrary events that can occur from scheduling or from the environment. For example, an instrument may cease to draw power when the area that it is tasked to observe is not in view. The contingency of the planetary body blocking the line of sight is a condition upon which the power being drawn is set to zero. It remains at zero until the observation objective is again in view. Computing the total power drawn by the instrument over a period of days or weeks can now take such factors into consideration. What makes the architecture especially powerful is that the scripting module can look ahead and behind in simulation time, and this temporal versatility can be leveraged in displays such as x-y plots. For example, a plot of a satellite's altitude as a function of time can take changes to the orbit into account.
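
    The Action/Condition/Script pattern and the clock-rewind behavior described above can be illustrated with a small sketch. This is a hypothetical reconstruction, not SOAP's actual architecture; all class, function, and attribute names are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Satellite:
    power: float = 100.0  # hypothetical state a script can modify

@dataclass
class Script:
    """Condition/Action pairs evaluated as the simulation clock advances;
    stepping the clock backward undoes events inside the rewound interval."""
    rules: list  # tuples of (condition(t, obj) -> bool, action(obj), undo(obj))
    log: list = field(default_factory=list)

    def step_to(self, time, obj):
        for cond, action, undo in self.rules:
            if cond(time, obj):
                action(obj)
                self.log.append((time, undo))  # remember how to undo the event

    def rewind_to(self, time, obj):
        while self.log and self.log[-1][0] > time:
            _, undo = self.log.pop()
            undo(obj)  # automatically undo events in the rewound interval

sat = Satellite()
# Condition: eclipse at t == 5; Action: power drops to zero; Undo restores it.
eclipse = (lambda t, s: t == 5,
           lambda s: setattr(s, "power", 0.0),
           lambda s: setattr(s, "power", 100.0))
script = Script(rules=[eclipse])
for t in range(10):
    script.step_to(t, sat)   # the event fires at t == 5
script.rewind_to(4, sat)     # clock set back: the event is undone
```

    The undo log is what makes the rewind automatic: every applied event records its inverse, so moving the clock backward replays those inverses in reverse order.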

  5. The influence of role-specific self-concept and sex-role identity on career choices in science

    NASA Astrophysics Data System (ADS)

    Baker, Dale R.

    Despite much effort on the part of educators the number of females who choose science careers remains low. This research focuses on two factors which may be influencing females in their choice of careers. These factors are role-specific self-concept in science and self perception in terms of stereotypical masculine and feminine characteristics. In addition logical ability and mathematics and science courses were also examined as factors in career choice. Females preferring science related careers and females preferring nontraditional careers such as police, military and trades were found to have a positive role-specific self-concept and a masculine perception of themselves. Females preferring traditional careers such as teacher or hairdresser had a poor role-specific self-concept and a more feminine perception of themselves. Males as a group were found to have a more positive role-specific self-concept than females. Logical ability was also related to a science career preference for both males and females. Males expected to take more higher level math courses than females, while females preferring science careers expected to take the most higher level science courses.

  6. States of Consciousness and State-Specific Sciences

    ERIC Educational Resources Information Center

    Tart, Charles T.

    1972-01-01

    Proposes the development of "state-specific sciences" to overcome the problems of scientifically studying altered states of consciousness induced by drugs or meditation from the paradigm of the ordinary consciousness state. The requirements of good observation, public nature of the observation, logical theorizing, and testing of theories by…

  7. Active control of flexible structures using a fuzzy logic algorithm

    NASA Astrophysics Data System (ADS)

    Cohen, Kelly; Weller, Tanchum; Ben-Asher, Joseph Z.

    2002-08-01

    This study deals with the development and application of an active control law for the vibration suppression of beam-like flexible structures experiencing transient disturbances. Collocated pairs of sensors/actuators provide active control of the structure. A design methodology for the closed-loop control algorithm based on fuzzy logic is proposed. First, the behavior of the open-loop system is observed. Then, the number and locations of collocated actuator/sensor pairs are selected. The proposed control law, which is based on the principles of passivity, commands the actuator to emulate the behavior of a dynamic vibration absorber. The absorber is tuned to a targeted frequency, whereas the damping coefficient of the dashpot is varied in a closed loop using a fuzzy logic based algorithm. This approach not only ensures the inherent stability associated with passive absorbers, but also circumvents the phenomenon of modal spillover. The developed controller is applied to the AFWAL/FIB 10 bar truss. Simulated results using MATLAB show that the closed-loop system exhibits fairly quick settling times and desirable performance, as well as robustness characteristics. To demonstrate the robustness of the control system to changes in the temporal dynamics of the flexible structure, the transient response to a considerably perturbed plant is simulated. The modal frequencies of the 10 bar truss were raised as well as lowered substantially, thereby significantly perturbing the natural frequencies of vibration. For these cases, too, the developed control law provides adequate settling times and rates of vibrational energy dissipation.

  8. Unique Non-Keplerian Orbit Vantage Locations for Sun-Earth Connection and Earth Science Vision Roadmaps

    NASA Technical Reports Server (NTRS)

    Folta, David; Young, Corissa; Ross, Adam

    2001-01-01

    The purpose of this investigation is to determine the feasibility of attaining and maintaining unique non-Keplerian orbit vantage locations in the Earth/Moon environment in order to obtain continuous scientific measurements. The principal difficulty associated with obtaining continuous measurements is the temporal nature of astrodynamics, i.e., classical orbits. This investigation demonstrates advanced trajectory designs to meet demanding science requirements that cannot be met following traditional orbital-mechanics logic. Examples of continuous observer missions addressed include Earth pole-sitters and unique vertical libration orbits that address the Sun-Earth Connection and Earth Science Vision roadmaps.

  9. Verification and Planning for Stochastic Processes with Asynchronous Events

    DTIC Science & Technology

    2005-01-01

    Massachusetts: The MIT Press. Bratley, Paul, Bennett L. Fox, and Linus E. Schrage. 1987. A Guide to Simulation. 2nd ed. Berlin: Springer. BIBLIOGRAPHY...〈π,τ〉 ∘ δ(te −∞) Here, δ(t − t0) is the Dirac delta function (Dirac 1927, p. 625) with the property that the integral of δ(t − t0) dt from −∞ to x is 0 for x < t0 and 1 for...no. 3: 207–226. Bernstein, Arthur and Paul K. Harter, Jr. 1981. Proving real-time properties of programs with temporal logic. In Proceedings of the

  10. Temporal and Spatial prediction of groundwater levels using Artificial Neural Networks, Fuzzy logic and Kriging interpolation.

    NASA Astrophysics Data System (ADS)

    Tapoglou, Evdokia; Karatzas, George P.; Trichakis, Ioannis C.; Varouchakis, Emmanouil A.

    2014-05-01

    The purpose of this study is to examine the use of Artificial Neural Networks (ANN) combined with the kriging interpolation method in order to simulate the hydraulic head both spatially and temporally. Initially, ANNs are used for the temporal simulation of the hydraulic head change. The results of the most appropriate ANNs, determined through a fuzzy logic system, are used as an input for the kriging algorithm, where the spatial simulation is conducted. The proposed algorithm is tested in an area located along the Isar River in Bayern, Germany, covering approximately 7800 km2. The available data extend over a time period from 1/11/2008 to 31/10/2012 (1460 days) and include the hydraulic head at 64 wells, temperature and rainfall at 7 weather stations, and surface water elevation at 5 monitoring stations. One feedforward ANN was trained for each of the 64 wells where hydraulic head data are available, using a backpropagation algorithm. The most appropriate input parameters for each well's ANN are determined considering their proximity to the measuring station, as well as their statistical characteristics. For rainfall, the data for two consecutive time lags from the best-correlated weather station, together with a third and fourth input from the second-best-correlated weather station, are used as inputs. The surface water monitoring stations with the three best correlations for each well are also used in every case. Finally, the temperature for the best-correlated weather station is used. Two different architectures are considered, and the one with the best results is used henceforward. The output of the ANNs corresponds to the hydraulic head change per time step. These predictions are used in the kriging interpolation algorithm. However, not all 64 simulated values should be used. 
The appropriate neighborhood for each prediction point is constructed based not only on the distance between known and prediction points, but also on the training and testing error of the ANN. Therefore, the neighborhood of each prediction point is the best available. Then, the appropriate variogram is determined by fitting the experimental variogram to a theoretical variogram model. Three models are examined: the linear, the exponential, and the power-law. Finally, the hydraulic head change is predicted for every grid cell and for every time step used. All the algorithms used were developed in Visual Basic .NET, while the visualization of the results was performed in MATLAB using the .NET COM Interoperability. The results are evaluated using leave-one-out cross-validation and various performance indicators. The best results were achieved by using ANNs with two hidden layers, consisting of 20 and 15 nodes respectively, and by using the power-law variogram with the fuzzy logic system.
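
    The spatial step described above rests on the experimental semivariogram, which the study fits to linear, exponential, or power-law models. A minimal sketch of the empirical semivariogram computation follows; the function name and fixed-width binning scheme are simplifying assumptions, not the study's implementation:

```python
from itertools import combinations

def experimental_variogram(points, values, bin_width, n_bins):
    """Empirical semivariogram: for each distance bin, average
    0.5 * (z_i - z_j)^2 over all sample pairs whose separation
    falls in that bin. Bins with no pairs yield None."""
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for i, j in combinations(range(len(points)), 2):
        (x1, y1), (x2, y2) = points[i], points[j]
        h = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
        b = int(h // bin_width)
        if b < n_bins:
            sums[b] += 0.5 * (values[i] - values[j]) ** 2
            counts[b] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]
```

    A theoretical model (e.g., the power-law form used in the study) would then be fitted to these binned values before computing the kriging weights.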

  11. Boolean and brain-inspired computing using spin-transfer torque devices

    NASA Astrophysics Data System (ADS)

    Fan, Deliang

    Several completely new approaches to information processing and data storage (such as spintronics, carbon nanotubes, graphene, TFETs, etc.) are emerging to address the time frame beyond the current Complementary Metal-Oxide-Semiconductor (CMOS) roadmap. High-speed magnetization switching of a nano-magnet due to current-induced spin-transfer torque (STT) has been demonstrated in recent experiments. Such STT devices can be explored for compact, low-power memory and logic design. In order to truly leverage STT-device-based computing, researchers require a rethinking of circuits, architectures, and computing models, since STT devices are unlikely to be drop-in replacements for CMOS. The potential of STT-device-based computing will be best realized by considering new computing models that are inherently suited to the characteristics of STT devices, and new applications that are enabled by their unique capabilities, thereby attaining performance that CMOS cannot achieve. The goal of this research is to conduct a synergistic exploration at the architecture, circuit, and device levels for Boolean and brain-inspired computing using nanoscale STT devices. Specifically, we first show that non-volatile STT devices can be used in designing configurable Boolean logic blocks. We propose a spin-memristor threshold logic (SMTL) gate design, where a memristive crossbar array is used to perform current-mode summation of binary inputs and a low-power current-mode spintronic threshold device carries out the energy-efficient threshold operation. Next, for brain-inspired computing, we have exploited different spin-transfer torque device structures that can implement hard-limiting and soft-limiting artificial neuron transfer functions, respectively. 
We apply such STT-based neurons (or 'spin-neurons') in various neural network architectures, such as hierarchical temporal memory and feed-forward neural networks, for performing "human-like" cognitive computing, which shows more than two orders of magnitude lower energy consumption compared to state-of-the-art CMOS implementations. Finally, we show that the dynamics of an injection-locked Spin Hall Effect Spin-Torque Oscillator (SHE-STO) cluster can be exploited as a robust multi-dimensional distance metric for associative computing, image/video analysis, etc. Our simulation results show that the proposed system architecture with injection-locked SHE-STOs and the associated CMOS interface circuits can be suitable for robust and energy-efficient associative computing and pattern matching.
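
    Abstracting away the device physics, the SMTL gate described above reduces to classic threshold logic: a weighted sum of binary inputs (modeling the summed crossbar currents) compared against a threshold (modeling the spintronic device's switching point). A minimal Boolean-level sketch with illustrative weights, not a device-level model:

```python
def threshold_gate(inputs, weights, threshold):
    """Abstract threshold logic: output 1 iff the weighted sum of binary
    inputs reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# The same gate structure is reprogrammed by changing weights/threshold:
def AND2(a, b):
    return threshold_gate([a, b], [1, 1], 2)

def OR2(a, b):
    return threshold_gate([a, b], [1, 1], 1)

def MAJ3(a, b, c):
    return threshold_gate([a, b, c], [1, 1, 1], 2)
```

    This configurability, one physical gate realizing many Boolean functions through its weights, is what motivates threshold logic as a fit for memristive crossbars.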

  12. Fuzzy logic control system to provide autonomous collision avoidance for Mars rover vehicle

    NASA Technical Reports Server (NTRS)

    Murphy, Michael G.

    1990-01-01

    NASA is currently involved with planning unmanned missions to Mars to investigate the terrain and process soil samples in advance of a manned mission. A key issue involved in unmanned surface exploration on Mars is that of supporting autonomous maneuvering since radio communication involves lengthy delays. It is anticipated that specific target locations will be designated for sample gathering. In maneuvering autonomously from a starting position to a target position, the rover will need to avoid a variety of obstacles such as boulders or troughs that may block the shortest path to the target. The physical integrity of the rover needs to be maintained while minimizing the time and distance required to attain the target position. Fuzzy logic lends itself well to building reliable control systems that function in the presence of uncertainty or ambiguity. The following major issues are discussed: (1) the nature of fuzzy logic control systems and software tools to implement them; (2) collision avoidance in the presence of fuzzy parameters; and (3) techniques for adaptation in fuzzy logic control systems.

  13. Logical recoding of S-R rules can reverse the effects of spatial S-R correspondence.

    PubMed

    Wühr, Peter; Biebl, Rupert

    2009-02-01

    Two experiments investigated competing explanations for the reversal of spatial stimulus-response (S-R) correspondence effects (i.e., Simon effects) with an incompatible S-R mapping on the relevant, nonspatial dimension. Competing explanations were based on generalized S-R rules (logical-recoding account) or referred to display-control arrangement correspondence or to S-S congruity. In Experiment 1, compatible responses to finger-name stimuli presented at left/right locations produced normal Simon effects, whereas incompatible responses to finger-name stimuli produced an inverted Simon effect. This finding supports the logical-recoding account. In Experiment 2, spatial S-R correspondence and color S-R correspondence were varied independently, and main effects of these variables were observed. The lack of an interaction between these variables, however, disconfirms a prediction of the display-control arrangement correspondence account. Together, the results provide converging evidence for the logical-recoding account. This account claims that participants derive generalized response selection rules (e.g., the identity or reversal rule) from specific S-R rules and inadvertently apply the generalized rules to the irrelevant (spatial) S-R dimension when selecting their response.

  14. Label-free logic modules and two-layer cascade based on stem-loop probes containing a G-quadruplex domain.

    PubMed

    Guo, Yahui; Cheng, Junjie; Wang, Jine; Zhou, Xiaodong; Hu, Jiming; Pei, Renjun

    2014-09-01

    A simple, versatile, and label-free DNA computing strategy was designed using toehold-mediated strand displacement and stem-loop probes. A full set of logic gates (YES, NOT, OR, NAND, AND, INHIBIT, NOR, XOR, XNOR) and a two-layer logic cascade were constructed. The probes contain a G-quadruplex domain, which was blocked or unfolded through inputs initiating strand displacement, and the clearly distinguishable light-up fluorescent signal of the G-quadruplex/NMM complex was used as the output readout. The inputs are disease-specific nucleotide sequences with potential for clinical diagnosis. The developed versatile computing system, based on our label-free and modular strategy, might be adapted to multi-target diagnosis through DNA hybridization and aptamer-target interaction. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Why is number word learning hard? Evidence from bilingual learners.

    PubMed

    Wagner, Katie; Kimura, Katherine; Cheung, Pierina; Barner, David

    2015-12-01

    Young children typically take between 18 months and 2 years to learn the meanings of number words. In the present study, we investigated this developmental trajectory in bilingual preschoolers to examine the relative contributions of two factors in number word learning: (1) the construction of numerical concepts, and (2) the mapping of language specific words onto these concepts. We found that children learn the meanings of small number words (i.e., one, two, and three) independently in each language, indicating that observed delays in learning these words are attributable to difficulties in mapping words to concepts. In contrast, children generally learned to accurately count larger sets (i.e., five or greater) simultaneously in their two languages, suggesting that the difficulty in learning to count is not tied to a specific language. We also replicated previous studies that found that children learn the counting procedure before they learn its logic - i.e., that for any natural number, n, the successor of n in the count list denotes the cardinality n+1. Consistent with past studies, we found that children's knowledge of successors is first acquired incrementally. In bilinguals, we found that this knowledge exhibits item-specific transfer between languages, suggesting that the logic of the positive integers may not be stored in a language-specific format. We conclude that delays in learning the meanings of small number words are mainly due to language-specific processes of mapping words to concepts, whereas the logic and procedures of counting appear to be learned in a format that is independent of a particular language and thus transfers rapidly from one language to the other in development. Copyright © 2015. Published by Elsevier Inc.

  16. Data Quality Objectives Supporting the Environmental Soil Monitoring Program for the Idaho National Laboratory Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haney, Thomas Jay

    This document describes the process used to develop data quality objectives for the Idaho National Laboratory (INL) Environmental Soil Monitoring Program in accordance with U.S. Environmental Protection Agency guidance. This document also develops and presents the logic that was used to determine the specific number of soil monitoring locations at the INL Site, at locations bordering the INL Site, and at locations in the surrounding regional area. The monitoring location logic follows the guidance from the U.S. Department of Energy for environmental surveillance of its facilities.

  17. The Design of Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, C. Duane; Humphreys, William M.; Fijany, Amir

    2002-01-01

    As transistor geometries are reduced, quantum effects begin to dominate device performance. At some point, transistors cease to have the properties that make them useful computational components. New computing elements must be developed in order to keep pace with Moore's Law. Quantum dot cellular automata (QCA) represent an alternative paradigm to transistor-based logic. QCA architectures that are robust to manufacturing tolerances and defects must be developed. We are developing software that allows the exploration of fault tolerant QCA gate architectures by automating the specification, simulation, analysis and documentation processes.

  18. An Integrated Specification and Verification Environment for Component-Based Architectures of Large-Scale Distributed Systems

    DTIC Science & Technology

    2009-05-26

    Interrupt HW Interrupt; DFS Dynamic Frequency Selection; TPC Transmit Power Control; MPX Hub; Power Supply; Init/Reset; A/D...values of several variables: from IN_0_DAT when the Mailbox forwards data supplied by Client 0, from OUT1DAT when the conditions on the ready flags are...logically implies ψ ∨ φ0, and also φ logically implies ψ ∨ φ; (b) if for all i we have that φi+1 is of the form φi ∨ ψ, then the chain

  19. Existing School Buildings: Incremental Seismic Retrofit Opportunities.

    ERIC Educational Resources Information Center

    Federal Emergency Management Agency, Washington, DC.

    The intent of this document is to provide technical guidance to school district facility managers for linking specific incremental seismic retrofit opportunities to specific maintenance and capital improvement projects. The linkages are based on logical affinities, such as technical fit, location of the work within the building, cost saving…

  20. Nanozyme-based bio-barcode assay for high sensitive and logic-controlled specific detection of multiple DNAs.

    PubMed

    Lin, Xiaodong; Liu, Yaqing; Tao, Zhanhui; Gao, Jinting; Deng, Jiankang; Yin, Jinjin; Wang, Shuo

    2017-08-15

    Since HCV and HIV share a common transmission path, highly sensitive detection of HIV and HCV genes is of significant importance for improving diagnosis accuracy and cure rates at an early stage for HIV-infected patients. In this investigation, a novel nanozyme-based bio-barcode fluorescence amplified assay is successfully developed for the simultaneous detection of HIV and HCV DNAs with excellent sensitivity under enzyme-free and label-free conditions. Here, bimetallic PtAu nanoparticles (PtAu NPs) present outstanding peroxidase-like activity and act as barcodes to catalyze the oxidation of the nonfluorescent substrate amplex red (AR) into fluorescent resorufin, generating a stable and sensitive "turn-on" fluorescent output signal; this is the first time such a nanozyme has been integrated with a bio-barcode strategy for fluorescence detection of DNA. Furthermore, the strategy presents excellent specificity and can distinguish single-base mismatched mutants from target DNA. Notably, a cascaded INHIBIT-OR logic gate is integrated with the biosensors for the first time to distinguish the individual target DNAs from each other under logic-function control, which holds great promise for the development of rapid and intelligent detection. Copyright © 2017. Published by Elsevier B.V.
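
    Read at the Boolean level, a cascaded INHIBIT-OR network of the kind described can flag the presence of exactly one of the two targets. The sketch below is an assumed truth-table reading of the sensing scheme only, not the molecular implementation:

```python
def INHIBIT(a, b):
    """Output 1 only when input a is present and the inhibitory input b is absent."""
    return int(a and not b)

def OR2(a, b):
    return int(a or b)

def cascade(target_a, target_b):
    """Cascaded INHIBIT-OR: fires when exactly one of the two target DNAs
    is present, so each target can be told apart from the other."""
    return OR2(INHIBIT(target_a, target_b), INHIBIT(target_b, target_a))
```

    The cascade output is thus high for either single target and low for both co-presence and absence, which is what lets the assay discriminate the individual DNAs.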

  1. Target-responsive DNA-capped nanocontainer used for fabricating universal detector and performing logic operations

    PubMed Central

    Wu, Li; Ren, Jinsong; Qu, Xiaogang

    2014-01-01

    Nucleic acids have become a powerful tool in nanotechnology because of their controllable, diverse conformational transitions and adaptable higher-order nanostructures. Using single-stranded DNA probes as pore-caps for various target recognition, here we present an ultrasensitive universal electrochemical detection system based on graphene and mesoporous silica, achieving sensitivity with all of the major classes of analytes while simultaneously realizing DNA logic gate operations. The concept is based on locking the pores and preventing the signal-reporter molecules from escaping via the target-induced conformational change of the tailored DNA caps. The coupling of a ‘waking-up’ gatekeeper with highly specific biochemical recognition is an innovative strategy for the detection of various targets, able to compete with classical methods that need expensive instrumentation and sophisticated experimental operations. The present study introduces a new electrochemical signal amplification concept and also adds a new dimension to the function of graphene-mesoporous material hybrids as multifunctional nanoscale logic devices. More importantly, the development of this approach could spur further advances in important areas, such as point-of-care diagnostics or detection of specific biological contaminations, and holds promise for use in field analysis. PMID:25249622

  2. Temporal resolution in individuals with neurological disorders

    PubMed Central

    Rabelo, Camila Maia; Weihing, Jeffrey A; Schochat, Eliane

    2015-01-01

    OBJECTIVE: Temporal processing refers to the ability of the central auditory nervous system to encode and detect subtle changes in acoustic signals. This study aims to investigate the temporal resolution ability of individuals with mesial temporal sclerosis and to determine the sensitivity and specificity of the gaps-in-noise test in identifying this type of lesion. METHOD: This prospective study investigated differences in temporal resolution between 30 individuals with normal hearing and without neurological lesions (G1) and 16 individuals with both normal hearing and mesial temporal sclerosis (G2). Test performances were compared, and the sensitivity and specificity were calculated. RESULTS: There was no significant difference in gap detection thresholds between the two groups, although G1 showed better average thresholds than G2 did. The sensitivity and specificity of the gaps-in-noise test for neurological lesions were 68% and 98%, respectively. CONCLUSIONS: Temporal resolution ability is compromised in individuals with neurological lesions caused by mesial temporal sclerosis. The gaps-in-noise test was shown to be a sensitive and specific measure of central auditory dysfunction in these patients. PMID:26375561
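
    For reference, the reported figures follow the standard definitions: sensitivity is the fraction of affected individuals the test flags, and specificity is the fraction of unaffected individuals it correctly clears. A small sketch with purely illustrative counts, not the study's raw data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN): fraction of true lesions detected.
    Specificity = TN / (TN + FP): fraction of healthy cases cleared."""
    return tp / (tp + fn), tn / (tn + fp)
```

    With the study's reported 68% sensitivity and 98% specificity, roughly a third of lesions go undetected while false alarms are rare.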

  3. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, are one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  4. Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness

    NASA Technical Reports Server (NTRS)

    Staats, Matt; Whalen, Michael W.; Heimdahl, Mats P. E.; Rajan, Ajitha

    2010-01-01

    In black-box testing, the tester creates a set of tests to exercise a system under test without regard to the internal structure of the system. Generally, no objective metric is used to measure the adequacy of black-box tests. In recent work, we have proposed three requirements coverage metrics, allowing testers to objectively measure the adequacy of a black-box test suite with respect to a set of requirements formalized as Linear Temporal Logic (LTL) properties. In this report, we evaluate the effectiveness of these coverage metrics with respect to fault finding. Specifically, we conduct an empirical study to investigate two questions: (1) do test suites satisfying a requirements coverage metric provide better fault finding than randomly generated test suites of approximately the same size?, and (2) do test suites satisfying a more rigorous requirements coverage metric provide better fault finding than test suites satisfying a less rigorous requirements coverage metric? Our results indicate (1) only one coverage metric proposed -- Unique First Cause (UFC) coverage -- is sufficiently rigorous to ensure test suites satisfying the metric outperform randomly generated test suites of similar size and (2) that test suites satisfying more rigorous coverage metrics provide better fault finding than test suites satisfying less rigorous coverage metrics.
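
    Requirements like those formalized here are LTL properties; over a finite test trace, common patterns such as the response property G(p -> F q) can be checked directly. The sketch below is an illustrative finite-trace approximation (full LTL semantics is defined over infinite traces, and the helper names are hypothetical):

```python
def holds_eventually(trace, pred):
    """Finite-trace F: pred holds at some state, now or later."""
    return any(pred(s) for s in trace)

def response(trace, p, q):
    """Finite-trace check of the LTL response pattern G(p -> F q):
    from every state satisfying p, some state at or after it must
    satisfy q."""
    return all(holds_eventually(trace[i:], q)
               for i, s in enumerate(trace) if p(s))
```

    A requirements coverage metric such as UFC then asks, roughly, whether the test suite exercises each atomic condition of such a formula in a way that demonstrably affects its outcome.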

  5. CRTC1 Nuclear Translocation Following Learning Modulates Memory Strength via Exchange of Chromatin Remodeling Complexes on the Fgf1 Gene.

    PubMed

    Uchida, Shusaku; Teubner, Brett J W; Hevi, Charles; Hara, Kumiko; Kobayashi, Ayumi; Dave, Rutu M; Shintaku, Tatsushi; Jaikhan, Pattaporn; Yamagata, Hirotaka; Suzuki, Takayoshi; Watanabe, Yoshifumi; Zakharenko, Stanislav S; Shumyatsky, Gleb P

    2017-01-10

    Memory is formed by synapse-to-nucleus communication that leads to regulation of gene transcription, but the identity and organizational logic of signaling pathways involved in this communication remain unclear. Here we find that the transcription cofactor CRTC1 is a critical determinant of sustained gene transcription and memory strength in the hippocampus. Following associative learning, synaptically localized CRTC1 is translocated to the nucleus and regulates Fgf1b transcription in an activity-dependent manner. After both weak and strong training, the HDAC3-N-CoR corepressor complex leaves the Fgf1b promoter and a complex involving the translocated CRTC1, phosphorylated CREB, and histone acetyltransferase CBP induces transient transcription. Strong training later substitutes KAT5 for CBP, a process that is dependent on CRTC1, but not on CREB phosphorylation. This in turn leads to long-lasting Fgf1b transcription and memory enhancement. Thus, memory strength relies on activity-dependent changes in chromatin and temporal regulation of gene transcription on specific CREB/CRTC1 gene targets. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  6. Logical enzyme triggered (LET) layer-by-layer nanocapsules for drug delivery system

    NASA Astrophysics Data System (ADS)

    Kelley, Marie-Michelle

    Breast cancer is the second leading cause of morbidity and mortality among women in the United States. Early detection and treatment methods have resulted in 100% 5-year survival rates for stage 0-I breast cancer. Unfortunately, the 5-year survival rate of metastatic breast cancer (stage IV) is reduced fivefold. The most challenging issue of metastatic breast cancer treatment is the ability to selectively target the adenoma and adenocarcinoma cells both in their location of origin and as they metastasize following initial treatment. Multilayer/Layer-by-Layer (LbL) nanocapsules have garnered vast interest as anticancer drug delivery systems due to their ability to be easily modified, their capacity to encapsulate a wide range of chemicals and proteins, and their improved pharmacokinetics. Multilayer nanocapsule formation requires the layering of oppositely charged polyelectrolytic polymers over a removable core nanoparticle. Our goal is to have programmable nanocapsules that degrade only after receiving and validating specific breast cancer biomarkers. The overall objective is to fabricate a novel programmable LbL nanocapsule with a specific logical system that will enhance functions pertinent to drug delivery systems. Our central hypothesis is that LbL technology coupled with extracellular matrix (ECM) protein substrates will result in a logical enzyme triggered LbL nanocapsule drug delivery system. This platform represents a novel approach toward a logically regulated nano-encapsulated cancer therapy that can selectively follow and deliver chemotherapeutics to cancer cells. The rationale for this project is to overcome a crucial limitation of existing drug delivery systems, whereby chemotherapeutics can be erroneously delivered to non-carcinogenic cells.

  7. The effect of output-input isolation on the scaling and energy consumption of all-spin logic devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Jiaxi; Haratipour, Nazila; Koester, Steven J., E-mail: skoester@umn.edu

    All-spin logic (ASL) is a novel approach for digital logic applications wherein spin is used as the state variable instead of charge. One of the challenges in realizing a practical ASL system is the need to ensure non-reciprocity, meaning that information flows from input to output, not vice versa. One approach described previously is to introduce an asymmetric ground contact; while this approach was shown to be effective, the optimal approach for achieving non-reciprocity in ASL remains unclear. In this study, we quantitatively analyze techniques to achieve non-reciprocity in ASL devices, and we specifically compare the effect of using an asymmetric ground position against dipole-coupled output/input isolation. For this analysis, we simulate the switching dynamics of multiple-stage logic devices with FePt and FePd perpendicular magnetic anisotropy materials using a combination of a matrix-based spin circuit model coupled to the Landau–Lifshitz–Gilbert equation. The dipole field is included in this model and can act both as a desirable means of coupling magnets and as a source of noise. The dynamic energy consumption has been calculated for these schemes as a function of input/output magnet separation, and the results show that a scheme that electrically isolates logic stages produces superior non-reciprocity, thus allowing both improved scaling and reduced energy consumption.
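
    The magnetization dynamics in such simulations are governed by the Landau–Lifshitz–Gilbert (LLG) equation. The toy macrospin integrator below illustrates only LLG damping toward an applied field, in normalized units; it is a sketch, not the paper's coupled spin-circuit model:

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def llg_step(m, h, alpha, dt):
    """One explicit Euler step of the LLG equation in normalized units:
    dm/dt = -[m x h + alpha * m x (m x h)] / (1 + alpha^2),
    followed by renormalization so |m| stays 1."""
    mxh = cross(m, h)
    mxmxh = cross(m, mxh)
    pre = dt / (1.0 + alpha * alpha)
    m2 = tuple(mi - pre * (t1 + alpha * t2)
               for mi, t1, t2 in zip(m, mxh, mxmxh))
    norm = sum(x * x for x in m2) ** 0.5
    return tuple(x / norm for x in m2)

def relax(m, h, alpha=0.1, dt=0.01, steps=20000):
    """Integrate until the magnetization has precessed and damped
    toward the effective field h."""
    for _ in range(steps):
        m = llg_step(m, h, alpha, dt)
    return m
```

    Starting along x with the field along z, the precession term circles m around z while the Gilbert damping term draws it toward alignment with the field.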

  8. Derivation of sorting programs

    NASA Technical Reports Server (NTRS)

    Varghese, Joseph; Loganantharaj, Rasiah

    1990-01-01

    Program synthesis for critical applications has become a viable alternative to program verification. Nested resolution and its extension are used to synthesize a set of sorting programs from their first-order logic specifications. A set of sorting programs, such as naive sort, merge sort, and insertion sort, was successfully synthesized starting from the same set of specifications.
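
    The first-order specification of sorting that such syntheses start from can be stated as a relation: the output is an ordered permutation of the input. A sketch of that specification as an executable predicate, together with one program (insertion sort) satisfying it; the function names are illustrative:

```python
from collections import Counter

def is_sorted(xs):
    """Ordering clause of the specification."""
    return all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1))

def is_permutation(xs, ys):
    """Permutation clause: same elements with the same multiplicities."""
    return Counter(xs) == Counter(ys)

def satisfies_sort_spec(inp, out):
    """The full first-order sorting relation: out is an ordered
    permutation of inp. Any synthesized sort must satisfy this."""
    return is_sorted(out) and is_permutation(inp, out)

def insertion_sort(xs):
    """One program meeting the specification (insertion sort)."""
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] <= x:
            i += 1
        out.insert(i, x)
    return out
```

    Naive sort, merge sort, and insertion sort are all distinct programs, yet each satisfies this single input/output relation, which is why one specification can yield a family of synthesized sorts.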

  9. Error Reporting Logic

    DTIC Science & Technology

    2008-06-01

    [14] Mark Weiser. Program slicing. Trans. Software Engineering, July 1984. ...entitled “Perpetually Available and Secure Information Systems”, the Software Industry Center at CMU and its sponsors, especially the Alfred P. Sloan...ERL In Acme, a software architect can choose to associate a handwritten error message to each specification. If the specification fails, for any

  10. Information coding with frequency of oscillations in Belousov-Zhabotinsky encapsulated disks

    NASA Astrophysics Data System (ADS)

    Gorecki, J.; Gorecka, J. N.; Adamatzky, Andrew

    2014-04-01

    Information processing with an excitable chemical medium, like the Belousov-Zhabotinsky (BZ) reaction, is typically based on information coding in the presence or absence of excitation pulses. Here we present a new concept of Boolean coding that can be applied to an oscillatory medium. A medium represents the logical TRUE state if a selected region oscillates with a high frequency. If the frequency falls below a specified value, it represents the logical FALSE state. We consider a medium composed of disks encapsulating an oscillatory mixture of reagents, as related to our recent experiments with lipid-coated BZ droplets. We demonstrate that by using specific geometrical arrangements of disks containing the oscillatory medium one can perform logical operations on variables coded in oscillation frequency. Realizations of a chemical signal diode and of a single-bit memory with oscillatory disks are also discussed.
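
    The frequency-based Boolean coding can be stated compactly: sample the selected region, estimate its oscillation frequency, and compare it against the threshold separating TRUE from FALSE. A sketch with an illustrative peak-counting frequency estimate (the estimator and threshold are assumptions, not the paper's method):

```python
import math

def count_peaks(signal):
    """Count positive local maxima, used here as oscillation events."""
    return sum(1 for i in range(1, len(signal) - 1)
               if signal[i] > signal[i - 1]
               and signal[i] > signal[i + 1]
               and signal[i] > 0)

def logic_value(signal, duration_s, threshold_hz):
    """TRUE iff the estimated oscillation frequency is at or above the
    threshold separating the two logic states."""
    freq = count_peaks(signal) / duration_s
    return freq >= threshold_hz
```

    For example, a region oscillating at 5 Hz reads as TRUE against a 4 Hz threshold and FALSE against an 8 Hz one, mirroring the high-frequency/low-frequency coding described above.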

  11. Hyperbranched Hybridization Chain Reaction for Triggered Signal Amplification and Concatenated Logic Circuits.

    PubMed

    Bi, Sai; Chen, Min; Jia, Xiaoqiang; Dong, Ying; Wang, Zonghua

    2015-07-06

    A hyper-branched hybridization chain reaction (HB-HCR) is presented herein, which consists of only six species that can metastably coexist until the introduction of an initiator DNA to trigger a cascade of hybridization events, leading to the self-sustained assembly of hyper-branched and nicked double-stranded DNA structures. The system can readily achieve ultrasensitive detection of target DNA. Moreover, the HB-HCR principle is successfully applied to construct three-input concatenated logic circuits with excellent specificity and extended to design a security-mimicking keypad lock system. Significantly, the HB-HCR-based keypad lock can alarm immediately if the "password" is incorrect. Overall, the proposed HB-HCR with high amplification efficiency is simple, homogeneous, fast, robust, and low-cost, and holds great promise in the development of biosensing, in the programmable assembly of DNA architectures, and in molecular logic operations. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Autonomous vehicle motion control, approximate maps, and fuzzy logic

    NASA Technical Reports Server (NTRS)

    Ruspini, Enrique H.

    1993-01-01

    Progress on research on the control of actions of autonomous mobile agents using fuzzy logic is presented. The innovations described encompass theoretical and applied developments. At the theoretical level, results of research leading to the combined utilization of conventional artificial planning techniques with fuzzy logic approaches for the control of local motion and perception actions are presented. Formulations of dynamic programming approaches to optimal control, in the context of the analysis of approximate models of the real world, are also examined, as is a new approach to goal conflict resolution that does not require the specification of numerical values representing relative goal importance. Applied developments include the introduction of the notion of an approximate map: a fuzzy relational database structure for the representation of vague and imprecise information about the robot's environment is proposed. The central notions of control point and control structure are also discussed.

  13. A remote sensing based vegetation classification logic for global land cover analysis

    USGS Publications Warehouse

    Running, Steven W.; Loveland, Thomas R.; Pierce, Lars L.; Nemani, R.R.; Hunt, E. Raymond

    1995-01-01

    This article proposes a simple new logic for classifying global vegetation. The critical features of this classification are that 1) it is based on simple, observable, unambiguous characteristics of vegetation structure that are important to ecosystem biogeochemistry and can be measured in the field for validation, 2) the structural characteristics are remotely sensible so that repeatable and efficient global reclassifications of existing vegetation will be possible, and 3) the defined vegetation classes directly translate into the biophysical parameters of interest by global climate and biogeochemical models. A first test of this logic for the continental United States is presented based on an existing 1 km AVHRR normalized difference vegetation index database. Procedures for solving critical remote sensing problems needed to implement the classification are discussed. Also, some inferences from this classification to advanced vegetation biophysical variables such as specific leaf area and photosynthetic capacity useful to global biogeochemical modeling are suggested.
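
    The classification logic keys on a few simple, observable structural attributes of vegetation. The decision sketch below is purely illustrative: the attribute names, the order of tests, and the class labels are assumptions for exposition, not the article's published key:

```python
def classify(perennial, woody, leaf_type, leaf_longevity):
    """Toy structural classification key. All attributes are assumed
    to be field-observable and remotely sensible, in the spirit of
    the proposed logic (labels are illustrative only)."""
    if not perennial:
        return "annual grass/crop"
    if not woody:
        return "perennial grassland"
    if leaf_type == "needle":
        return ("evergreen needleleaf forest" if leaf_longevity == "evergreen"
                else "deciduous needleleaf forest")
    return ("evergreen broadleaf forest" if leaf_longevity == "evergreen"
            else "deciduous broadleaf forest")
```

    Because each test depends only on structure rather than floristics, repeated satellite reclassification and direct translation of classes into biophysical model parameters become straightforward.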

  14. A Domain-Specific Language for Discrete Mathematics

    NASA Astrophysics Data System (ADS)

    Jha, Rohit; Samuel, Alfy; Pawar, Ashmee; Kiruthika, M.

    2013-05-01

    This paper discusses a Domain Specific Language (DSL) that has been developed to enable implementation of concepts of discrete mathematics. A library of data types and functions provides functionality which is frequently required by users. Covering the areas of Mathematical Logic, Set Theory, Functions, Graph Theory, Number Theory, Linear Algebra and Combinatorics, the language's syntax is close to the actual notation used in the specific fields.
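
    Although the paper's DSL syntax is not reproduced here, the flavor of mapping discrete-mathematics notation onto executable code can be sketched in a general-purpose language; the helper names below are illustrative, not the DSL's actual constructs.

```python
# Illustrative sketch (not the paper's DSL): helpers that mirror
# common discrete-mathematics notation.

def power_set(s):
    """P(S): the set of all subsets of S, as frozensets."""
    subsets = [frozenset()]
    for x in s:
        subsets += [sub | {x} for sub in subsets]
    return set(subsets)

def cartesian(a, b):
    """A x B: the set of all ordered pairs (x, y)."""
    return {(x, y) for x in a for y in b}

def implies(p, q):
    """Logical implication p -> q, i.e. (not p) or q."""
    return (not p) or q

assert len(power_set({1, 2, 3})) == 2 ** 3
assert cartesian({1}, {2}) == {(1, 2)}
assert implies(False, True) and not implies(True, False)
```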

  15. Temporal Processing Capacity in High-Level Visual Cortex Is Domain Specific.

    PubMed

    Stigliani, Anthony; Weiner, Kevin S; Grill-Spector, Kalanit

    2015-09-09

    Prevailing hierarchical models propose that temporal processing capacity--the amount of information that a brain region processes in a unit time--decreases at higher stages in the ventral stream regardless of domain. However, it is unknown if temporal processing capacities are domain general or domain specific in human high-level visual cortex. Using a novel fMRI paradigm, we measured temporal capacities of functional regions in high-level visual cortex. Contrary to hierarchical models, our data reveal domain-specific processing capacities as follows: (1) regions processing information from different domains have differential temporal capacities within each stage of the visual hierarchy and (2) domain-specific regions display the same temporal capacity regardless of their position in the processing hierarchy. In general, character-selective regions have the lowest capacity, face- and place-selective regions have an intermediate capacity, and body-selective regions have the highest capacity. Notably, domain-specific temporal processing capacities are not apparent in V1 and have perceptual implications. Behavioral testing revealed that the encoding capacity of body images is higher than that of characters, faces, and places, and there is a correspondence between peak encoding rates and cortical capacities for characters and bodies. The present evidence supports a model in which the natural statistics of temporal information in the visual world may affect domain-specific temporal processing and encoding capacities. These findings suggest that the functional organization of high-level visual cortex may be constrained by temporal characteristics of stimuli in the natural world, and this temporal capacity is a characteristic of domain-specific networks in high-level visual cortex. Significance statement: Visual stimuli bombard us at different rates every day. For example, words and scenes are typically stationary and vary at slow rates. 
In contrast, bodies are dynamic and typically change at faster rates. Using a novel fMRI paradigm, we measured temporal processing capacities of functional regions in human high-level visual cortex. Contrary to prevailing theories, we find that different regions have different processing capacities, which have behavioral implications. In general, character-selective regions have the lowest capacity, face- and place-selective regions have an intermediate capacity, and body-selective regions have the highest capacity. These results suggest that temporal processing capacity is a characteristic of domain-specific networks in high-level visual cortex and contributes to the segregation of cortical regions. Copyright © 2015 the authors 0270-6474/15/3512412-13$15.00/0.

  16. Temporal changes in the abundance, leaf growth and photosynthesis of three co-occurring Philippine seagrasses.

    PubMed

    Agawin, N S.R.; Duarte, C M.; Fortes, M D.; Uri, J S.; Vermaat, J E.

    2001-06-01

    The analysis of the temporal changes in shoot density, areal leaf biomass, leaf growth and parameters of the photosynthesis-irradiance relationship of three tropical seagrass species (Enhalus acoroides, Thalassia hemprichii and Cymodocea rotundata), co-existing in a shallow subtidal meadow in Cape Bolinao, Philippines, shows that species-specific traits are significant sources of temporal variability, and indicates that these seagrass species respond differently to a common environmental forcing. Species-specific differences are much less important as a source of variability in the temporal change in chlorophyll concentration of seagrass leaves. The results indicate that the temporal changes in photosynthetic performance of these seagrasses were driven mostly by environmental forcing and the species' specific responses to it, whereas the temporal change in their abundance and leaf growth was also controlled by other factors. The significant contribution of species-specific factors to the temporal changes in biomass, growth and photosynthetic performance of co-occurring seagrass species in Cape Bolinao should contribute to the maintenance of the multispecific, highly productive meadows characteristic of pristine coastal ecosystems in Southeast (SE) Asia.

  17. The FTA Method And A Possibility Of Its Application In The Area Of Road Freight Transport

    NASA Astrophysics Data System (ADS)

    Poliaková, Adela

    2015-06-01

    The Fault Tree process utilizes logic diagrams to portray and analyse potentially hazardous events. Three basic symbols (logic gates) are adequate for diagramming any fault tree. However, additional recently developed symbols can be used to reduce the time and effort required for analysis. A fault tree is a graphical representation of the relationship between certain specific events and the ultimate undesired event (2). This paper gives a basic description of the Fault Tree Analysis method and provides a practical view of its possible application to quality improvement in a road freight transport company.
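
    The gate logic underlying the method can be sketched numerically. Assuming independent basic events, an AND gate multiplies probabilities and an OR gate combines them; the tree and the numbers below are invented for illustration and are not taken from the paper's case study.

```python
# Minimal fault-tree evaluation, assuming independent basic events.
# Events and probabilities are hypothetical.

def and_gate(*probs):
    # P(all events occur) = product of the p_i
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):
    # P(at least one event occurs) = 1 - product of (1 - p_i)
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical top event "late delivery": a vehicle breakdown
# (worn brakes AND skipped maintenance) OR a routing failure
# (dispatch error OR traffic-data outage).
breakdown = and_gate(0.02, 0.05)   # ~0.001
routing = or_gate(0.01, 0.03)      # ~0.0397
top_event = or_gate(breakdown, routing)
```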

  18. An innovative approach for modeling and simulation of an automated industrial robotic arm operated electro-pneumatically

    NASA Astrophysics Data System (ADS)

    Popa, L.; Popa, V.

    2017-08-01

    The article is focused on modeling an automated industrial robotic arm operated electro-pneumatically and on simulating the robotic arm operation. The graphic language FBD (Function Block Diagram) is used to program the robotic arm on a Zelio Logic automation module. The innovative modeling and simulation procedures address specific problems regarding the development of a new type of technical product in the field of robotics. Thus, new applications were identified for a Programmable Logic Controller (PLC) as a specialized computer performing control functions with a variety of high levels of complexity.

  19. A high-speed on-chip pseudo-random binary sequence generator for multi-tone phase calibration

    NASA Astrophysics Data System (ADS)

    Gommé, Liesbeth; Vandersteen, Gerd; Rolain, Yves

    2011-07-01

    An on-chip reference generator is conceived by adopting the technique of decimating a pseudo-random binary sequence (PRBS) signal into parallel sequences. This is of great benefit when high-speed generation of PRBS and PRBS-derived signals is the objective. The design is implemented in standard CMOS logic available in commercial libraries, which provides the logic functions for the generator. The design allows the user to select the periodicity of the PRBS and the PRBS-derived signals. The characterization of the on-chip generator marks its performance and reveals promising specifications.
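
    The generation principle can be sketched in software: a PRBS comes from a linear-feedback shift register, and tapping it at several offsets yields the decimated parallel sequences. The PRBS7 polynomial used below (x^7 + x^6 + 1) is a common standard choice and is only an assumption about the chip described above.

```python
# Software sketch of a PRBS generator (Fibonacci LFSR). The PRBS7
# polynomial x^7 + x^6 + 1 is assumed here for illustration.

def prbs7(seed=0x7F, n=127):
    """Return n bits of a PRBS7 sequence (period 2**7 - 1 = 127)."""
    state = seed & 0x7F
    bits = []
    for _ in range(n):
        feedback = ((state >> 6) ^ (state >> 5)) & 1  # taps at stages 7 and 6
        bits.append(state & 1)
        state = ((state << 1) | feedback) & 0x7F
    return bits

seq = prbs7(n=254)
assert seq[:127] == seq[127:]   # maximal-length period of 127
assert sum(seq[:127]) == 64     # balanced: 64 ones, 63 zeros
```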

  20. A logical model of cooperating rule-based systems

    NASA Technical Reports Server (NTRS)

    Bailin, Sidney C.; Moore, John M.; Hilberg, Robert H.; Murphy, Elizabeth D.; Bahder, Shari A.

    1989-01-01

    A model is developed to assist in the planning, specification, development, and verification of space information systems involving distributed rule-based systems. The model is based on an analysis of possible uses of rule-based systems in control centers. This analysis is summarized as a data-flow model for a hypothetical intelligent control center. From this data-flow model, the logical model of cooperating rule-based systems is extracted. This model consists of four layers of increasing capability: (1) communicating agents, (2) belief-sharing knowledge sources, (3) goal-sharing interest areas, and (4) task-sharing job roles.

  1. Research in mathematical theory of computation. [computer programming applications

    NASA Technical Reports Server (NTRS)

    Mccarthy, J.

    1973-01-01

    Research progress in the following areas is reviewed: (1) a new version of the computer program LCF (logic for computable functions), including a facility to search for proofs automatically; (2) the description of the language PASCAL in terms both of LCF and of first order logic; (3) discussion of LISP semantics in LCF and an attempt to prove the correctness of the London compilers in a formal way; (4) design of both special-purpose and domain-independent proving procedures with program correctness specifically in mind; (5) design of languages for describing such proof procedures; and (6) the embedding of these ideas in the first order checker.

  2. Computing with volatile memristors: an application of non-pinched hysteresis

    NASA Astrophysics Data System (ADS)

    Pershin, Y. V.; Shevchenko, S. N.

    2017-02-01

    The possibility of in-memory computing with volatile memristive devices, namely, memristors requiring a power source to sustain their memory, is demonstrated theoretically. We have adopted a hysteretic graphene-based field emission structure as a prototype of a volatile memristor, which is characterized by a non-pinched hysteresis loop. A memristive model of the structure is developed and used to simulate a polymorphic circuit implementing stateful logic gates, such as the material implication. Specific regions of parameter space realizing useful logic functions are identified. Our results are applicable to other realizations of volatile memory devices, such as certain NEMS switches.
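
    At the logic level (device physics abstracted away), the material implication gate mentioned above computes q <- (NOT p) OR q, writing the result back into one of the memory devices; together with a FALSE operation it is functionally complete. A sketch:

```python
# Logic-level sketch of stateful material implication (IMPLY).
# Device physics is abstracted away; this only shows why the gate
# is computationally useful.

def imply(p, q):
    """Material implication: p IMP q = (not p) or q."""
    return (not p) or q

def nand_via_imply(p, q):
    # IMPLY plus FALSE is functionally complete:
    # NAND(p, q) = p IMP (q IMP False).
    return imply(p, imply(q, False))

for p in (False, True):
    for q in (False, True):
        assert nand_via_imply(p, q) == (not (p and q))
```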

  3. AgRISTARS: Foreign commodity production forecasting. Corn/soybean decision logic development and testing

    NASA Technical Reports Server (NTRS)

    Dailey, C. L.; Abotteen, K. M. (Principal Investigator)

    1980-01-01

    The development and testing of an analysis procedure intended to improve the consistency and objectivity of crop identification using Landsat data are described. The procedure was developed to identify corn and soybean crops in the U.S. corn belt region. The procedure consists of a series of decision points arranged in a tree-like structure, the branches of which lead an analyst to crop labels. The specific decision logic is designed to maximize the objectivity of the identification process and to promote the possibility of future automation. Significant results are summarized.
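
    The decision logic can be pictured as a short chain of tests; the features and thresholds below are invented for illustration and are not the procedure's actual decision points.

```python
# Hypothetical sketch of a tree-like crop-labeling procedure.
# Feature names and thresholds are invented, not from the report.

def label_pixel(greenup_week, peak_ndvi, senescence_week):
    if peak_ndvi < 0.4:
        return "non-crop"            # too little green biomass
    if greenup_week < 22:            # early green-up suggests corn
        return "corn" if senescence_week < 38 else "other crop"
    return "soybean" if peak_ndvi > 0.6 else "other crop"

assert label_pixel(20, 0.7, 36) == "corn"
assert label_pixel(25, 0.7, 40) == "soybean"
assert label_pixel(25, 0.3, 40) == "non-crop"
```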

  4. Sentiments analysis at conceptual level making use of the Narrative Knowledge Representation Language.

    PubMed

    Zarri, Gian Piero

    2014-10-01

    This paper illustrates some of the knowledge representation structures and inference procedures proper to a high-level, fully implemented conceptual language, NKRL (Narrative Knowledge Representation Language). The aim is to show how these tools can be used to deal, in a sentiment analysis/opinion mining context, with some common types of human (and non-human) "behaviors". These behaviors correspond, in particular, to the concrete, mutual relationships among human and non-human characters that can be expressed under the form of non-fictional and real-time "narratives" (i.e., as logically and temporally structured sequences of "elementary events"). Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Fuzzy Sets in Dynamic Adaptation of Parameters of a Bee Colony Optimization for Controlling the Trajectory of an Autonomous Mobile Robot

    PubMed Central

    Amador-Angulo, Leticia; Mendoza, Olivia; Castro, Juan R.; Rodríguez-Díaz, Antonio; Melin, Patricia; Castillo, Oscar

    2016-01-01

    A hybrid approach composed of different types of fuzzy systems, such as the Type-1 Fuzzy Logic System (T1FLS), Interval Type-2 Fuzzy Logic System (IT2FLS) and Generalized Type-2 Fuzzy Logic System (GT2FLS), for the dynamic adaptation of the alpha and beta parameters of a Bee Colony Optimization (BCO) algorithm is presented. The objective of the work is to focus on the BCO technique to find the optimal distribution of the membership functions in the design of fuzzy controllers. We use BCO specifically for tuning membership functions of the fuzzy controller for trajectory stability in an autonomous mobile robot. We add two types of perturbations in the model for the Generalized Type-2 Fuzzy Logic System to better analyze its behavior under uncertainty, and this shows better results when compared to the original BCO. We implemented various performance indices: ITAE, IAE, ISE, ITSE, RMSE and MSE, to measure the performance of the controller. The experimental results show better performance using GT2FLS than IT2FLS and T1FLS in the dynamic adaptation of the parameters for the BCO algorithm. PMID:27618062
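
    The kind of parameter BCO tunes can be sketched concretely: a type-1 triangular membership function is fully described by three vertices (a, b, c), and the optimizer searches over those vertices. The numbers below are illustrative only.

```python
# Triangular type-1 membership function; (a, b, c) are the
# parameters an optimizer such as BCO would tune. Illustrative only.

def tri_mf(x, a, b, c):
    """0 outside [a, c], rising to 1 at the apex b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Membership of a trajectory error of 0.25 in a set "small error":
assert tri_mf(0.25, 0.0, 0.5, 1.0) == 0.5
assert tri_mf(0.5, 0.0, 0.5, 1.0) == 1.0
```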

  6. LEGO-MM: LEarning structured model by probabilistic loGic Ontology tree for MultiMedia.

    PubMed

    Tang, Jinhui; Chang, Shiyu; Qi, Guo-Jun; Tian, Qi; Rui, Yong; Huang, Thomas S

    2016-09-22

    Recent advances in multimedia ontology have resulted in a number of concept models, e.g., LSCOM and Mediamill 101, which are accessible and public to other researchers. However, most current research effort still focuses on building new concepts from scratch, and very little work explores appropriate methods for constructing new concepts upon the existing models already in the warehouse. To address this issue, we propose a new framework in this paper, termed LEGO-MM, which can seamlessly integrate both the new target training examples and the existing primitive concept models to infer the more complex concept models. LEGO-MM treats the primitive concept models as Lego toys to potentially construct an unlimited vocabulary of new concepts. Specifically, we first formulate the logic operations to be the Lego connectors to combine existing concept models hierarchically in probabilistic logic ontology trees. Then, we incorporate new target training information simultaneously to efficiently disambiguate the underlying logic tree and correct the error propagation. Extensive experiments are conducted on a large vehicle-domain data set from ImageNet. The results demonstrate that LEGO-MM has significantly superior performance over existing state-of-the-art methods, which build new concept models from scratch.
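
    The core idea of combining primitive detectors with logic connectors can be sketched under a strong independence assumption (the paper's ontology-tree inference is richer); the concepts and scores below are invented.

```python
# Sketch under a strong independence assumption: primitive concept
# detectors output probabilities, and logic connectors combine them.
# Concept names and scores are invented.

def p_and(*ps):
    """Probability that all (independent) concepts are present."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    """Noisy-OR: probability that at least one concept is present."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_not(p):
    return 1.0 - p

# Composite concept "wheeled vehicle outdoors" from primitives:
p_car, p_truck, p_indoor = 0.8, 0.3, 0.1
p_new = p_and(p_or(p_car, p_truck), p_not(p_indoor))
assert abs(p_new - 0.774) < 1e-9
```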

  7. Light-Triggered Ternary Device and Inverter Based on Heterojunction of van der Waals Materials.

    PubMed

    Shim, Jaewoo; Jo, Seo-Hyeon; Kim, Minwoo; Song, Young Jae; Kim, Jeehwan; Park, Jin-Hong

    2017-06-27

    Multivalued logic (MVL) devices/circuits have received considerable attention because the binary logic used in current Si complementary metal-oxide-semiconductor (CMOS) technology cannot handle the predicted information throughputs and energy demands of the future. To realize MVL, the conventional transistor platform needs to be redesigned to have two or more distinctive threshold voltages (V_TH). Here, we report a finding: the photoinduced drain current in graphene/WSe2 heterojunction transistors unusually decreases with increasing gate voltage under illumination, which we refer to as the light-induced negative differential transconductance (L-NDT) phenomenon. We also prove that such an L-NDT phenomenon in specific bias ranges originates from a variable potential barrier at a graphene/WSe2 junction due to a gate-controllable graphene electrode. This finding allows us to conceive graphene/WSe2-based MVL logic circuits by using the I_D-V_G characteristics with two distinctive V_TH values. Based on this finding, we further demonstrate a light-triggered ternary inverter circuit with three stable logical states (ΔV_out of each state <0.05 V). Our study offers a pathway to substantialize MVL systems.

  8. Temporal Causality Analysis of Sentiment Change in a Cancer Survivor Network.

    PubMed

    Bui, Ngot; Yen, John; Honavar, Vasant

    2016-06-01

    Online health communities constitute a useful source of information and social support for patients. American Cancer Society's Cancer Survivor Network (CSN), a 173,000-member community, is the largest online network for cancer patients, survivors, and caregivers. A discussion thread in CSN is often initiated by a cancer survivor seeking support from other members of CSN. Discussion threads are multi-party conversations that often provide a source of social support, e.g., by bringing about a change of sentiment from negative to positive on the part of the thread originator. While previous studies regarding cancer survivors have shown that members of an online health community derive benefits from their participation in such communities, causal accounts of the factors that contribute to the observed benefits have been lacking. We introduce a novel framework to examine the temporal causality of sentiment dynamics in the CSN. We construct a Probabilistic Computation Tree Logic representation and a corresponding probabilistic Kripke structure to represent and reason about the changes in sentiments of posts in a thread over time. We use a sentiment classifier trained using machine learning on a set of posts manually tagged with sentiment labels to classify posts as expressing either positive or negative sentiment. We analyze the probabilistic Kripke structure to identify the prima facie causes of sentiment change on the part of the thread originators in the CSN forum and their significance. We find that the sentiment of replies appears to causally influence the sentiment of the thread originator. Our experiments also show that the conclusions are robust with respect to the choice of (i) the classification threshold of the sentiment classifier and (ii) the specific sentiment classifier used.
We also extend the basic framework for temporal causality analysis to incorporate the uncertainty in the states of the probabilistic Kripke structure resulting from the use of an imperfect state transducer (in our case, the sentiment classifier). Our analysis of temporal causality of CSN sentiment dynamics offers new insights that the designers, managers and moderators of an online community such as CSN can utilize to facilitate and enhance the interactions so as to better meet the social support needs of the CSN participants. The proposed methodology for analysis of temporal causality has broad applicability in a variety of settings where the dynamics of the underlying system can be modeled in terms of state variables that change in response to internal or external inputs.
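
    The prima facie test used above can be sketched directly: c is a prima facie cause of e if c precedes e and raises its probability, P(e | c) > P(e). The thread data below is invented for illustration.

```python
# Sketch of a prima facie causality check over sentiment-labeled
# threads; the labels and threads are invented for illustration.

def prima_facie(threads, cause, effect):
    """True if P(effect after cause) > P(effect), per thread."""
    n_cause = n_cause_then_effect = n_effect = 0
    for t in threads:
        if effect in t:
            n_effect += 1
        for i, lab in enumerate(t):
            if lab == cause:
                n_cause += 1
                if effect in t[i + 1:]:
                    n_cause_then_effect += 1
                break  # count only the first occurrence per thread
    p_e = n_effect / len(threads)
    p_e_given_c = n_cause_then_effect / n_cause if n_cause else 0.0
    return p_e_given_c > p_e

threads = [
    ["neg", "pos_reply", "pos"],
    ["neg", "pos_reply", "pos"],
    ["neg", "neg"],
    ["neg", "pos"],
]
assert prima_facie(threads, "pos_reply", "pos")   # replies raise P(pos)
assert not prima_facie(threads, "neg", "neg")
```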

  10. Cortical Brain Atrophy and Intra-Individual Variability in Neuropsychological Test Performance in HIV Disease

    PubMed Central

    HINES, Lindsay J.; MILLER, Eric N.; HINKIN, Charles H.; ALGER, Jeffery R.; BARKER, Peter; GOODKIN, Karl; MARTIN, Eileen M.; MARUCA, Victoria; RAGIN, Ann; SACKTOR, Ned; SANDERS, Joanne; SELNES, Ola; BECKER, James T.

    2015-01-01

    Objective To characterize the relationship between dispersion-based intra-individual variability (IIVd) in neuropsychological test performance and brain volume among HIV seropositive and seronegative men and to determine the effects of cardiovascular risk and HIV infection on this relationship. Methods Magnetic Resonance Imaging (MRI) was used to acquire high-resolution neuroanatomic data from 147 men age 50 and over, including 80 HIV seropositive (HIV+) and 67 seronegative controls (HIV−) in this cross-sectional cohort study. Voxel Based Morphometry was used to derive volumetric measurements at the level of the individual voxel. These brain structure maps were analyzed using Statistical Parametric Mapping (SPM2). IIVd was measured by computing intra-individual standard deviations (ISDs) from the standardized performance scores of five neuropsychological tests: Wechsler Memory Scale-III Visual Reproduction I and II, Logical Memory I and II, Wechsler Adult Intelligence Scale-III Letter Number Sequencing. Results Total gray matter (GM) volume was inversely associated with IIVd. Among all subjects, IIVd-related GM atrophy was observed primarily in: 1) the inferior frontal gyrus bilaterally, the left inferior temporal gyrus extending to the supramarginal gyrus, spanning the lateral sulcus; 2) the right superior parietal lobule and intraparietal sulcus; and, 3) dorsal/ventral regions of the posterior section of the transverse temporal gyrus. HIV status, biological, and cardiovascular disease (CVD) variables were not linked to IIVd-related GM atrophy. Conclusions IIVd in neuropsychological test performance may be a sensitive marker of cortical integrity in older adults, regardless of HIV infection status or CVD risk factors, and degree of intra-individual variability links with volume loss in specific cortical regions, independent of mean-level performance on neuropsychological tests. PMID:26303224

  11. Theory of mind broad and narrow: reasoning about social exchange engages ToM areas, precautionary reasoning does not.

    PubMed

    Ermer, Elsa; Guerin, Scott A; Cosmides, Leda; Tooby, John; Miller, Michael B

    2006-01-01

    Baron-Cohen (1995) proposed that the theory of mind (ToM) inference system evolved to promote strategic social interaction. Social exchange--a form of co-operation for mutual benefit--involves strategic social interaction and requires ToM inferences about the contents of other individuals' mental states, especially their desires, goals, and intentions. There are behavioral and neuropsychological dissociations between reasoning about social exchange and reasoning about equivalent problems tapping other, more general content domains. It has therefore been proposed that social exchange behavior is regulated by social contract algorithms: a domain-specific inference system that is functionally specialized for reasoning about social exchange. We report an fMRI study using the Wason selection task that provides further support for this hypothesis. Precautionary rules share so many properties with social exchange rules--they are conditional, deontic, and involve subjective utilities--that most reasoning theories claim they are processed by the same neurocomputational machinery. Nevertheless, neuroimaging shows that reasoning about social exchange activates brain areas not activated by reasoning about precautionary rules, and vice versa. As predicted, neural correlates of ToM (anterior and posterior temporal cortex) were activated when subjects interpreted social exchange rules, but not precautionary rules (where ToM inferences are unnecessary). We argue that the interaction between ToM and social contract algorithms can be reciprocal: social contract algorithms require ToM inferences, but their functional logic also allows ToM inferences to be made. 
By considering interactions between ToM in the narrower sense (belief-desire reasoning) and all the social inference systems that create the logic of human social interaction--ones that enable as well as use inferences about the content of mental states--a broader conception of ToM may emerge: a computational model embodying a Theory of Human Nature (ToHN).

  12. The engineering of cybernetic systems

    NASA Astrophysics Data System (ADS)

    Fry, Robert L.

    2002-05-01

    This tutorial develops a logical basis for the engineering of systems that operate cybernetically. The term cybernetic system has a clear quantitative definition. It is a system that dynamically matches acquired information to selected actions relative to a computational issue that defines the essential purpose of the system or machine. This notion requires that information and control be further quantified. The logic of questions and assertions as developed by Cox provides one means of doing this. The design and operation of cybernetic systems can be understood by contrasting these kinds of systems with communication systems and information theory as developed by Shannon. The joint logic of questions and assertions can be seen to underlie and be common to both information theory as applied to the design of discrete communication systems and to a theory of discrete general systems. The joint logic captures a natural complementarity between systems that transmit and receive information and those that acquire and act on it. Specific comparisons and contrasts are made between the source rate and channel capacity of a communication system and the acquisition rate and control capacity of a general system. An overview is provided of the joint logic of questions and assertions and the ties that this logic has to both conventional information theory and to a general theory of systems. I-diagrams, the interrogative complement of Venn diagrams, are described as providing valuable reasoning tools. An initial framework is suggested for the design of cybernetic systems. Two examples are given to illustrate this framework as applied to discrete cybernetic systems. These examples include a predator-prey problem as illustrated through "The Dog Chrysippus Pursuing its Prey," and the derivation of a single-neuron system that operates cybernetically and is biologically plausible. Future areas of research are highlighted which require development for a mature engineering framework.

  13. G4-FETs as Universal and Programmable Logic Gates

    NASA Technical Reports Server (NTRS)

    Johnson, Travis; Fijany, Amir; Mojarradi, Mohammad; Vatan, Farrokh; Toomarian, Nikzad; Kolawa, Elizabeth; Cristoloveanu, Sorin; Blalock, Benjamin

    2007-01-01

    An analysis of a patented generic silicon-on-insulator (SOI) electronic device called a G4-FET has revealed that the device could be designed to function as a universal and programmable logic gate. The universality and programmability could be exploited to design logic circuits containing fewer discrete components than are required for conventional transistor-based circuits performing the same logic functions. A G4-FET is a combination of a junction field-effect transistor (JFET) and a metal oxide/semiconductor field-effect transistor (MOSFET) superimposed in a single silicon island and can therefore be regarded as two transistors sharing the same body. A G4-FET can also be regarded as a single transistor having four gates: two side junction-based gates, a top MOS gate, and a back gate activated by biasing of the SOI substrate. Each of these gates can be used to control the conduction characteristics of the transistor; this possibility creates new options for designing analog, radio-frequency, mixed-signal, and digital circuitry. With proper choice of the specific dimensions for the gates, channels, and ancillary features of the generic G4-FET, the device could be made to function as a three-input, one-output logic gate. As illustrated by the truth table in the top part of the figure, the behavior of this logic gate would be the inverse (the NOT) of that of a majority gate. In other words, the device would function as a NOT-majority gate. By simply adding an inverter, one could obtain a majority gate. In contrast, to construct a majority gate in conventional complementary metal oxide/semiconductor (CMOS) circuitry, one would need four three-input AND gates and a four-input OR gate, altogether containing 32 transistors.
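
    The gate behavior described above is easy to sketch: a three-input NOT-majority gate outputs the inverse of the majority vote, and adding an inverter recovers the majority gate.

```python
# Truth-table sketch of the three-input NOT-majority gate and the
# majority gate obtained from it with one inverter.

def not_majority(a, b, c):
    return int(not (a + b + c >= 2))

def majority(a, b, c):
    return 1 - not_majority(a, b, c)  # NOT-majority plus an inverter

for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert majority(a, b, c) == int(a + b + c >= 2)
```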

  14. Development of an intelligent system for cooling rate and fill control in GMAW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Einerson, C.J.; Smartt, H.B.; Johnson, J.A.

    1992-09-01

    A control strategy for gas metal arc welding (GMAW) is developed in which the welding system detects certain existing conditions and adjusts the process in accordance with pre-specified rules. This strategy is used to control the reinforcement and weld bead centerline cooling rate during welding. Relationships between heat and mass transfer rates to the base metal and the required electrode speed and welding speed for specific open circuit voltages are taught to an artificial neural network. Control rules are programmed into a fuzzy logic system. Traditional control of the GMAW process is based on the use of explicit welding procedures detailing allowable parameter ranges on a pass-by-pass basis for a given weld. The present work is an exploration of a completely different approach to welding control. In this work the objectives are to produce welds having desired weld bead reinforcements while maintaining the weld bead centerline cooling rate at preselected values. The need for this specific control is related to fabrication requirements for specific types of pressure vessels. The control strategy involves measuring the weld joint transverse cross-sectional area ahead of the welding torch and the weld bead centerline cooling rate behind the weld pool, both by means of video (2), calculating the required process parameters necessary to obtain the needed heat and mass transfer rates (in appropriate dimensions) by means of an artificial neural network, and controlling the heat transfer rate by means of a fuzzy logic controller (3). The result is a welding machine that senses the welding conditions and responds to those conditions on the basis of logical rules, as opposed to producing a weld based on a specific procedure.
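
    The fuzzy control step described above can be sketched as a zero-order Sugeno rule base: the measured cooling-rate error is fuzzified, each rule fires to a degree, and a crisp correction to the heat input is the weighted average of the rule outputs. The rule base and numbers are invented, not the authors' controller.

```python
# Hypothetical zero-order Sugeno fuzzy step for cooling-rate
# control; membership vertices and output corrections are invented.

def tri(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def heat_correction(error):
    """error = measured cooling rate minus the preselected value."""
    rules = [
        (tri(error, -2.0, -1.0, 0.0), +0.2),  # cooling too fast -> add heat
        (tri(error, -1.0,  0.0, 1.0),  0.0),  # on target -> no change
        (tri(error,  0.0,  1.0, 2.0), -0.2),  # cooling too slow -> cut heat
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

assert abs(heat_correction(-1.0) - 0.2) < 1e-12  # firmly "too fast"
assert heat_correction(0.0) == 0.0
```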

  15. Development of an intelligent system for cooling rate and fill control in GMAW. [Gas Metal Arc Welding (GMAW)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Einerson, C.J.; Smartt, H.B.; Johnson, J.A.

    1992-01-01

    A control strategy for gas metal arc welding (GMAW) is developed in which the welding system detects certain existing conditions and adjusts the process in accordance with pre-specified rules. This strategy is used to control the reinforcement and weld bead centerline cooling rate during welding. Relationships between heat and mass transfer rates to the base metal and the required electrode speed and welding speed for specific open circuit voltages are taught to an artificial neural network. Control rules are programmed into a fuzzy logic system. Traditional control of the GMAW process is based on the use of explicit welding procedures detailing allowable parameter ranges on a pass-by-pass basis for a given weld. The present work is an exploration of a completely different approach to welding control. In this work the objectives are to produce welds having desired weld bead reinforcements while maintaining the weld bead centerline cooling rate at preselected values. The need for this specific control is related to fabrication requirements for specific types of pressure vessels. The control strategy involves measuring the weld joint transverse cross-sectional area ahead of the welding torch and the weld bead centerline cooling rate behind the weld pool, both by means of video (2), calculating the required process parameters necessary to obtain the needed heat and mass transfer rates (in appropriate dimensions) by means of an artificial neural network, and controlling the heat transfer rate by means of a fuzzy logic controller (3). The result is a welding machine that senses the welding conditions and responds to those conditions on the basis of logical rules, as opposed to producing a weld based on a specific procedure.

  16. Program Instrumentation and Trace Analysis

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Goldberg, Allen; Filman, Robert; Rosu, Grigore; Koga, Dennis (Technical Monitor)

    2002-01-01

    Several attempts have been made recently to apply techniques such as model checking and theorem proving to the analysis of programs. This can be seen as part of a current trend to analyze real software systems instead of just their designs. This includes our own effort to develop a model checker for Java, the Java PathFinder 1, one of the very first of its kind in 1998. However, model checking cannot handle very large programs without some kind of abstraction of the program. This paper describes a complementary, scalable technique for handling such large programs. Our interest is focused on the observation part of the equation: how much information can be extracted about a program from observing a single execution trace? Our intention is to develop a technology that can be applied automatically to large, full-size applications, with minimal modification to the code. We present a tool, Java PathExplorer (JPaX), for exploring execution traces of Java programs. The tool prioritizes scalability over completeness, and is directed towards detecting errors in programs, not towards proving correctness. One core element in JPaX is an instrumentation package that allows one to instrument Java bytecode files to log various events when executed. The instrumentation is driven by a user-provided script that specifies what information to log. Examples of instructions that such a script can contain are: 'report name and arguments of all called methods defined in class C, together with a timestamp'; 'report all updates to all variables'; and 'report all acquisitions and releases of locks'. In more complex instructions one can specify that certain expressions should be evaluated and even that certain code should be executed under various conditions.
The instrumentation package can hence be seen as implementing Aspect-Oriented Programming for Java, in the sense that one can add functionality to a Java program without explicitly changing the code of the original program: instead, one writes an aspect and compiles it into the original program using the instrumentation. Another core element of JPaX is an observation package that supports the analysis of the generated event stream. Two kinds of analysis are currently supported. In temporal analysis, the execution trace is evaluated against formulae written in temporal logic. We have implemented a temporal logic evaluator on finite traces using the Maude rewriting system from SRI International, USA. Temporal logic is defined in Maude by giving its syntax as a signature and its semantics as rewrite equations. The resulting semantics is extremely efficient and can handle event streams of hundreds of millions of events in a few minutes. Furthermore, the implementation is very succinct. The second form of event stream analysis supported is error pattern analysis, where an execution trace is analyzed using various error detection algorithms that can identify error-prone programming practices that may potentially lead to errors in other executions. Two such algorithms focusing on concurrency errors have been implemented in JPaX, one for deadlocks and the other for data races. It is important to note that a deadlock or data race does not need to actually occur in order for its potential to be detected with these algorithms. This is what makes them very scalable in practice. The data race algorithm implemented is the Eraser algorithm from Compaq, adapted to Java. The tool is currently being applied to a code base for controlling a spacecraft by the developers of that software in order to evaluate its applicability.
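    Finite-trace temporal analysis of the kind described above can be sketched in a few lines. The formula encoding (nested tuples) and operator set here are our own simplification for illustration, not JPaX's Maude-based implementation:

```python
# Minimal sketch of evaluating temporal-logic formulas over a finite
# execution trace. Each trace position is the set of events logged at
# that step; formulas are nested tuples (an illustrative encoding).

def holds(formula, trace, i=0):
    """Evaluate a formula at position i of a finite trace."""
    op = formula[0]
    if op == "prop":                       # atomic proposition: event present?
        return formula[1] in trace[i]
    if op == "not":
        return not holds(formula[1], trace, i)
    if op == "and":
        return holds(formula[1], trace, i) and holds(formula[2], trace, i)
    if op == "always":                     # [] f : f at every remaining position
        return all(holds(formula[1], trace, j) for j in range(i, len(trace)))
    if op == "eventually":                 # <> f : f at some remaining position
        return any(holds(formula[1], trace, j) for j in range(i, len(trace)))
    raise ValueError(f"unknown operator: {op}")

# Example trace of logged lock events.
trace = [{"acquire"}, {"work"}, {"release"}]
# "Whenever an acquire occurs, a release eventually follows."
safe = ("always", ("not", ("and", ("prop", "acquire"),
                                   ("not", ("eventually", ("prop", "release"))))))
print(holds(safe, trace))  # True: the acquire at step 0 is followed by a release
```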

  17. 78 FR 45026 - Revisions to the Export Administration Regulations (EAR): Control of Military Electronic...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-25

    ... bear no direct correlation to military-specific applications in accordance with the stated intention... logical correlation to the way that discrete microwave transistors and MMIC technologies actually work...

  18. Self-assembling software generator

    DOEpatents

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.

  19. Logic programming to predict cell fate patterns and retrodict genotypes in organogenesis.

    PubMed

    Hall, Benjamin A; Jackson, Ethan; Hajnal, Alex; Fisher, Jasmin

    2014-09-06

    Caenorhabditis elegans vulval development is a paradigm system for understanding cell differentiation in the process of organogenesis. Through temporal and spatial controls, the fate pattern of six cells is determined by the competition of the LET-23 and the Notch signalling pathways. Modelling cell fate determination in vulval development using state-based models, coupled with formal analysis techniques, has been established as a powerful approach in predicting the outcome of combinations of mutations. However, computing the outcomes of complex and highly concurrent models can become prohibitive. Here, we show how logic programs derived from state machines describing the differentiation of C. elegans vulval precursor cells can increase the speed of prediction by four orders of magnitude relative to previous approaches. Moreover, this increase in speed allows us to infer, or 'retrodict', compatible genomes from cell fate patterns. We exploit this technique to predict highly variable cell fate patterns resulting from dig-1 reduced-function mutations and let-23 mosaics. In addition to the new insights offered, we propose our technique as a platform for aiding the design and analysis of experimental data. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  20. Eleven fetal echocardiographic planes using 4-dimensional ultrasound with spatio-temporal image correlation (STIC): a logical approach to fetal heart volume analysis.

    PubMed

    Jantarasaengaram, Surasak; Vairojanavong, Kittipong

    2010-09-15

    Theoretically, a cross-sectional image of any cardiac plane can be obtained from a STIC fetal heart volume dataset. We described a method to display 11 fetal echocardiographic planes from STIC volumes. Fetal heart volume datasets were acquired by transverse acquisition from 200 normal fetuses at 15 to 40 weeks of gestation. Analysis of the volume datasets using the described technique to display 11 echocardiographic planes in the multiplanar display mode was performed offline. Volume datasets from 18 fetuses were excluded due to poor image resolution. The mean visualization rates for all echocardiographic planes for fetuses at 15-17, 18-22, 23-27, 28-32 and 33-40 weeks of gestation were 85.6% (range 45.2-96.8%, N = 31), 92.9% (range 64.0-100%, N = 64), 93.4% (range 51.4-100%, N = 37), 88.7% (range 54.5-100%, N = 33) and 81.8% (range 23.5-100%, N = 17), respectively. Overall, the applied technique can favorably display the pertinent echocardiographic planes. The presented method provides a logical approach to exploring fetal heart volumes.

  1. Sensory-motor problems in Autism

    PubMed Central

    Whyatt, Caroline; Craig, Cathy

    2013-01-01

    Despite being largely characterized as a social and cognitive disorder, strong evidence indicates the presence of significant sensory-motor problems in Autism Spectrum Disorder (ASD). This paper outlines our progression from initial, broad assessment using the Movement Assessment Battery for Children (M-ABC2) to subsequent targeted kinematic assessment. In particular, pronounced ASD impairment seen in the broad categories of manual dexterity and ball skills was found to be rooted in specific difficulties on isolated tasks, which were translated into focused experimental assessment. Kinematic results from both subsequent studies highlight impaired use of perception-action coupling to guide, adapt and tailor movement to task demands, resulting in inflexible and rigid motor profiles. In particular, difficulties with temporal adaptation are shown, with “hyperdexterity” witnessed in ballistic movement profiles, often at the cost of spatial accuracy and task performance. By linearly progressing from a standardized assessment tool to targeted kinematic assessment, clear and defined links are drawn between measurable difficulties and the underlying sensory-motor assessment. Results are specifically viewed in light of perception-action coupling and its role in early infant development, suggesting that rather than being a “secondary”-level impairment, sensory-motor problems may be fundamental in the progression of ASD. This logical and systematic process thus allows further insight into the potential root of observable motor problems in ASD; a vital step if underlying motor problems are to be considered a fundamental aspect of autism and to provide a route to non-invasive preliminary diagnosis. PMID:23882194

  2. Temporal Lobe and Frontal-Subcortical Dissociations in Non-Demented Parkinson's Disease with Verbal Memory Impairment.

    PubMed

    Tanner, Jared J; Mareci, Thomas H; Okun, Michael S; Bowers, Dawn; Libon, David J; Price, Catherine C

    2015-01-01

    The current investigation examined verbal memory in idiopathic, non-demented Parkinson's disease and the significance of the left entorhinal cortex and left entorhinal-retrosplenial region connections (via the temporal cingulum) for memory impairment in Parkinson's disease. Forty non-demented Parkinson's disease patients and forty non-Parkinson's disease controls completed two verbal memory tests--a wordlist measure (Philadelphia repeatable Verbal Memory Test) and a story measure (Logical Memory). All participants received T1-weighted and diffusion magnetic resonance imaging (3T; Siemens) sequences. Left entorhinal volume and left entorhinal-retrosplenial connectivity (temporal cingulum edge weight) were the primary imaging variables of interest, with frontal lobe thickness and subcortical structure volumes as dissociating variables. Individuals with Parkinson's disease showed worse verbal memory and smaller entorhinal volumes, but did not differ in entorhinal-retrosplenial connectivity. For Parkinson's disease, entorhinal-retrosplenial edge weight had the strongest associations with verbal memory. A subset of Parkinson's disease patients (23%) had deficits (z-scores < -1.5) across both memory measures. Relative to non-impaired Parkinson's peers, this memory-impaired group had smaller entorhinal volumes. Although entorhinal cortex volume was significantly reduced in Parkinson's disease patients relative to non-Parkinson's peers, only white matter connections associated with the entorhinal cortex were significantly associated with verbal memory performance in our sample. There was also no suggestion of a contribution from frontal-subcortical gray or frontal white matter regions. These findings argue for additional investigation into medial temporal lobe gray and white matter connectivity for understanding memory in Parkinson's disease.

  3. Bio-inspired nano-sensor-enhanced CNN visual computer.

    PubMed

    Porod, Wolfgang; Werblin, Frank; Chua, Leon O; Roska, Tamas; Rodriguez-Vazquez, Angel; Roska, Botond; Fay, Patrick; Bernstein, Gary H; Huang, Yih-Fang; Csurgay, Arpad I

    2004-05-01

    Nanotechnology opens new ways to utilize recent discoveries in biological image processing by translating the underlying functional concepts into the design of CNN (cellular neural/nonlinear network)-based systems incorporating nanoelectronic devices. There is a natural intersection joining studies of retinal processing, the spatio-temporal nonlinear dynamics embodied in CNN, and the possibility of miniaturizing the technology through nanotechnology. This intersection serves as the springboard for our multidisciplinary project. Biological feature and motion detectors map directly onto the spatio-temporal dynamics of CNN for target recognition, image stabilization, and tracking. The neural interactions underlying color processing will drive the development of nanoscale multispectral sensor arrays for image fusion. Implementing such nanoscale sensors on a CNN platform will allow the implementation of device feedback control, a hallmark of biological sensory systems. These biologically inspired CNN subroutines are incorporated into the new world of analog-and-logic algorithms and software, which also contains many other active-wave computing mechanisms, including nature-inspired (physics and chemistry) as well as sophisticated PDE-based spatio-temporal algorithms. Our goal is to design and develop several miniature prototype devices for target detection, navigation, tracking, and robotics. This paper presents an example illustrating the synergies emerging from the convergence of nanotechnology, biotechnology, and information and cognitive science.

  4. Dynamically protected cat-qubits: a new paradigm for universal quantum computation

    NASA Astrophysics Data System (ADS)

    Mirrahimi, Mazyar; Leghtas, Zaki; Albert, Victor V.; Touzard, Steven; Schoelkopf, Robert J.; Jiang, Liang; Devoret, Michel H.

    2014-04-01

    We present a new hardware-efficient paradigm for universal quantum computation which is based on encoding, protecting and manipulating quantum information in a quantum harmonic oscillator. This proposal exploits multi-photon driven dissipative processes to encode quantum information in logical bases composed of Schrödinger cat states. More precisely, we consider two schemes. In a first scheme, a two-photon driven dissipative process is used to stabilize a logical qubit basis of two-component Schrödinger cat states. While such a scheme ensures a protection of the logical qubit against the photon dephasing errors, the prominent error channel of single-photon loss induces bit-flip type errors that cannot be corrected. Therefore, we consider a second scheme based on a four-photon driven dissipative process which leads to the choice of four-component Schrödinger cat states as the logical qubit. Such a logical qubit can be protected against single-photon loss by continuous photon number parity measurements. Next, applying some specific Hamiltonians, we provide a set of universal quantum gates on the encoded qubits of each of the two schemes. In particular, we illustrate how these operations can be rendered fault-tolerant with respect to various decoherence channels of participating quantum systems. Finally, we also propose experimental schemes based on quantum superconducting circuits and inspired by methods used in Josephson parametric amplification, which should allow one to achieve these driven dissipative processes along with the Hamiltonians ensuring the universal operations in an efficient manner.

  5. Decision support and disease management: a logic engineering approach.

    PubMed

    Fox, J; Thomson, R

    1998-12-01

    This paper describes the development and application of PROforma, a unified technology for clinical decision support and disease management. Work leading to the implementation of PROforma has been carried out in a series of projects funded by European agencies over the past 13 years. The work has been based on logic engineering, a distinct design and development methodology that combines concepts from knowledge engineering, logic programming, and software engineering. Several of the projects have used the approach to demonstrate a wide range of applications in primary and specialist care and clinical research. Concurrent academic research projects have provided a sound theoretical basis for the safety-critical elements of the methodology. The principal technical results of the work are the PROforma logic language for defining clinical processes and an associated suite of software tools for delivering applications, such as decision support and disease management procedures. The language supports four standard objects (decisions, plans, actions, and enquiries), each of which has an intuitive meaning with well-understood logical semantics. The development toolset includes a powerful visual programming environment for composing applications from these standard components, for verifying consistency and completeness of the resulting specification and for delivering stand-alone or embeddable applications. Tools and applications that have resulted from the work are described and illustrated, with examples from specialist cancer care and primary care. The results of a number of evaluation activities are included to illustrate the utility of the technology.

  6. Elements of orbit-determination theory - Textbook

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.

    1971-01-01

    Text applies to solution of various optimization problems. Concepts are logically introduced and refinements and complexities for computerized numerical solutions are avoided. Specific topics and essential equivalence of several different approaches to various aspects of the problem are given.

  7. Advanced Computing Architectures for Cognitive Processing

    DTIC Science & Technology

    2009-07-01

    Figure 9: Logic diagram, smart block-based neuron... Figure 21: Naive Grid Potential Kernel... processing would be helpful for Air Force systems acquisition. Specific cognitive processing approaches addressed herein include global information grid

  8. Freight data architecture business process, logical data model, and physical data model.

    DOT National Transportation Integrated Search

    2014-09-01

    This document summarizes the study team's efforts to establish data-sharing partnerships and relay the lessons learned. In addition, it provides information on a prototype freight data architecture and supporting description and specifications ...

  9. Fuzzy Logic Based Anomaly Detection for Embedded Network Security Cyber Sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ondrej Linda; Todd Vollmer; Jason Wright

    Resiliency and security in critical infrastructure control systems in the modern world of cyber terrorism constitute a relevant concern. Developing a network security system specifically tailored to the requirements of such critical assets is of primary importance. This paper proposes a novel learning algorithm for an anomaly-based network security cyber sensor together with its hardware implementation. The presented learning algorithm constructs a fuzzy logic rule-based model of normal network behavior. Individual fuzzy rules are extracted directly from the stream of incoming packets using an online clustering algorithm. This learning algorithm was specifically developed to comply with the constrained computational requirements of low-cost embedded network security cyber sensors. The performance of the system was evaluated on a set of network data recorded from an experimental test-bed mimicking the environment of a critical infrastructure control system.
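    The online-clustering idea behind such a sensor can be sketched as follows. The radius threshold, Gaussian membership, and class/method names are illustrative assumptions, not the paper's actual algorithm:

```python
# Minimal sketch of online clustering for anomaly scoring: cluster
# centers are learned from a stream of feature vectors, and a fuzzy
# membership in "normal" yields an anomaly score.
import math

class OnlineClusterModel:
    def __init__(self, radius=2.0):
        self.radius = radius      # max distance before spawning a new cluster
        self.centers = []         # learned cluster centers (feature vectors)

    @staticmethod
    def _dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def learn(self, packet_features):
        """Assign the sample to the nearest cluster, or create a new one."""
        if not self.centers or min(self._dist(c, packet_features)
                                   for c in self.centers) > self.radius:
            self.centers.append(list(packet_features))

    def anomaly_score(self, packet_features):
        """Fuzzy membership in 'normal': near 0 close to a cluster, near 1 far away."""
        if not self.centers:
            return 1.0
        d = min(self._dist(c, packet_features) for c in self.centers)
        membership = math.exp(-(d / self.radius) ** 2)
        return 1.0 - membership   # high score = anomalous

model = OnlineClusterModel(radius=2.0)
for sample in [(1.0, 1.0), (1.2, 0.9), (5.0, 5.0)]:   # training traffic
    model.learn(sample)
print(model.anomaly_score((1.1, 1.0)))    # near learned traffic: low score
print(model.anomaly_score((20.0, 20.0)))  # far from anything seen: near 1.0
```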

  10. [A functional analysis of healthcare auditors' skills in Venezuela, 2008].

    PubMed

    Chirinos-Muñoz, Mónica S

    2010-10-01

    Using functional analysis to identify the basic, working, specific and generic skills and values which a health service auditor must have. The functional analysis technique was implemented with 10 experts, identifying specific, basic and generic skills and values by means of deductive logic. A functional map was obtained which started by establishing a key purpose based on improving healthcare and service quality, from which three key functions emerged. The main functions and skills units were then broken down into the competency elements defining what a health service auditor is able to do. This functional map (following functional analysis methodology) shows in detail the simple and complex tasks which a healthcare auditor should apply in the workplace, adopting a forward management approach for improving healthcare and health service quality. This methodology, based on logical-deductive reasoning, provides consensual expert information validating each element of the overall skill set.

  11. The fundamental downscaling limit of field effect transistors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mamaluy, Denis, E-mail: mamaluy@sandia.gov; Gao, Xujiao

    2015-05-11

    We predict that within the next 15 years a fundamental down-scaling limit for CMOS technology and other Field-Effect Transistors (FETs) will be reached. Specifically, we show that at room temperature all FETs, irrespective of their channel material, will start experiencing unacceptable levels of thermally induced errors around 5-nm gate lengths. These findings were confirmed by performing quantum mechanical transport simulations for a variety of 6-, 5-, and 4-nm gate length Si devices, optimized to satisfy the high-performance logic specifications of the ITRS. Different channel materials and wafer/channel orientations have also been studied; it is found that altering channel, source, and drain materials achieves only an insignificant increase in switching energy, which overall cannot sufficiently delay the approaching downscaling limit. Alternative possibilities are discussed to continue the increase of logic element densities for room temperature operation below the said limit.

  12. The fundamental downscaling limit of field effect transistors

    DOE PAGES

    Mamaluy, Denis; Gao, Xujiao

    2015-05-12

    We predict that within the next 15 years a fundamental down-scaling limit for CMOS technology and other Field-Effect Transistors (FETs) will be reached. Specifically, we show that at room temperature all FETs, irrespective of their channel material, will start experiencing unacceptable levels of thermally induced errors around 5-nm gate lengths. These findings were confirmed by performing quantum mechanical transport simulations for a variety of 6-, 5-, and 4-nm gate length Si devices, optimized to satisfy the high-performance logic specifications of the ITRS. Different channel materials and wafer/channel orientations have also been studied; it is found that altering channel, source, and drain materials achieves only an insignificant increase in switching energy, which overall cannot sufficiently delay the approaching downscaling limit. Alternative possibilities are discussed to continue the increase of logic element densities for room temperature operation below the said limit.

  13. Space platform expendables resupply concept definition study. Volume 3: Work breakdown structure and work breakdown structure dictionary

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The work breakdown structure (WBS) for the Space Platform Expendables Resupply Concept Definition Study is described. The WBS consists of a list of WBS elements, a dictionary of element definitions, and an element logic diagram. The list and logic diagram identify the interrelationships of the elements. The dictionary defines the types of work that may be represented by or be classified under each specific element. The Space Platform Expendable Resupply WBS was selected mainly to support the program planning, scheduling, and costing performed in the programmatics task (task 3). The WBS is neither a statement-of-work nor a work authorization document. Rather, it is a framework around which to define requirements, plan effort, assign responsibilities, allocate and control resources, and report progress, expenditures, technical performance, and schedule performance. The WBS element definitions are independent of make-or-buy decisions, organizational structure, and activity locations unless exceptions are specifically stated.

  14. Fuzzy efficiency optimization of AC induction motors

    NASA Technical Reports Server (NTRS)

    Jani, Yashvant; Sousa, Gilberto; Turner, Wayne; Spiegel, Ron; Chappell, Jeff

    1993-01-01

    This paper describes the early stages of work to implement a fuzzy logic controller to optimize the efficiency of AC induction motor/adjustable speed drive (ASD) systems running at less than optimal speed and torque conditions. In this paper, the process by which the membership functions of the controller were tuned is discussed and a controller which operates on frequency as well as voltage is proposed. The membership functions for this dual-variable controller are sketched. Additional topics include an approach for applying fuzzy logic to motor current control, which can be used with vector-controlled drives. Incorporation of a fuzzy controller as an application-specific integrated circuit (ASIC) microchip is planned.

  15. Intelligent control of a multi-degree-of freedom reaction compensating platform system using fuzzy logic

    NASA Technical Reports Server (NTRS)

    Choi, Benjamin B.; Lawrence, Charles; Lin, Yueh-Jaw

    1994-01-01

    This paper presents the development of a general-purpose fuzzy logic (FL) control methodology for isolating space-based devices from external vibratory disturbances. In accordance with the desired performance specifications, a full investigation of FL controller development was carried out under different scenarios, such as variations in the passive reaction-compensating components and in the external disturbance load. It was shown that the proposed FL controller is robust, in that the FL-controlled system closely follows the prespecified ideal reference model. The comparative study also reveals that the FL-controlled system achieves a significant improvement in vibration reduction over passive systems.

  16. A demonstration of CMOS VLSI circuit prototyping in support of the site facility using the 1.2 micron standard cell library developed by National Security Agency

    NASA Technical Reports Server (NTRS)

    Smith, Edwyn D.

    1991-01-01

    Two silicon CMOS application specific integrated circuits (ASICs), a data generation chip, and a data checker chip were designed. The conversion of the data generator circuitry into a pair of CMOS ASIC chips using the 1.2 micron standard cell library is documented. The logic design of the data checker is discussed. The functions of the control circuitry are described. An accurate estimate of timing relationships is essential to make sure that the logic design performs correctly under practical conditions. Timing and delay information are examined.

  17. Aptamer-Binding Directed DNA Origami Pattern for Logic Gates.

    PubMed

    Yang, Jing; Jiang, Shuoxing; Liu, Xiangrong; Pan, Linqiang; Zhang, Cheng

    2016-12-14

    In this study, an aptamer-substrate strategy is introduced to control programmable DNA origami patterns. Combining DNA aptamer-substrate binding with DNAzyme cutting, small DNA tiles were specifically controlled to fill into a predesigned DNA origami frame. Here, a set of DNA logic gates (OR, YES, and AND) is implemented in response to the stimuli of adenosine triphosphate (ATP) and cocaine. The experimental results are confirmed by AFM imaging and time-dependent fluorescence changes, demonstrating that the geometric patterns are regulated in a controllable and programmable manner. Our approach provides a new platform for engineering programmable origami nanopatterns and constructing complex DNA nanodevices.
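    The three gate behaviors reported for the origami system can be stated abstractly as Boolean functions of the two stimuli. This models only the logic level, not the aptamer chemistry:

```python
# Abstract truth-table view of the reported gates, with ATP and cocaine
# as Boolean inputs. Function names are ours, for illustration only.
def gate_or(atp: bool, cocaine: bool) -> bool:
    return atp or cocaine          # pattern forms if either stimulus is present

def gate_and(atp: bool, cocaine: bool) -> bool:
    return atp and cocaine         # pattern forms only if both are present

def gate_yes(atp: bool) -> bool:
    return atp                     # single-input YES gate passes its input

for atp in (False, True):
    for cocaine in (False, True):
        print(atp, cocaine, "->", gate_or(atp, cocaine), gate_and(atp, cocaine))
```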

  18. Interruption as a test of the user-computer interface

    NASA Technical Reports Server (NTRS)

    Kreifeldt, J. G.; Mccarthy, M. E.

    1981-01-01

    In order to study the effects different logic systems might have on interrupted operation, an algebraic-notation calculator and a reverse Polish notation (RPN) calculator were compared when trained users were interrupted during problem entry. The RPN calculator showed markedly superior resistance to interruption effects compared to the algebraic-notation calculator, although no significant differences were found when the users were not interrupted. Causes of and possible remedies for interruption effects are speculated upon. It is proposed that, because interruption is such a common occurrence, it be incorporated into comparative evaluation tests of different logic systems and control/display systems, and that interruption resistance be adopted as a specific design criterion.

  19. Beyond "Brown": Empirical Research on Diverse Learners with or At-Risk for Specific Learning Disabilities from 1994-2012

    ERIC Educational Resources Information Center

    Trent, Stanley C.; Drivers, Melissa; Rodriguez, Diane; Oh, Kevin; Stewart, Shavon; Kea, Cathy; Artiles, Alfredo; Hull, Michael

    2014-01-01

    We conducted a literature review to determine the presence of culturally and linguistically diverse (CLD) learners in research on specific learning disabilities (SLD) from 1994-2012. We believed that disaggregation of results by category might identify nuances that will guide future policies, research, and practice. We deemed it logical to begin…

  20. Freight Transportation Energy Use : Volume 2. Methodology and Program Documentation.

    DOT National Transportation Integrated Search

    1978-07-01

    The structure and logic of the transportation network model component of the TSC Freight Energy Model are presented. The model assigns given origin-destination commodity flows to specific transport modes and routes, thereby determining the traffic lo...

  1. ITS logical architecture : volume II, process specifications.

    DOT National Transportation Integrated Search

    1981-07-01

    Author's abstract: This report identifies 24 critical issues related to pedestrian and bicycle facilities and programs, summarizes the state-of-the-art on each issue as it is contained in the published literature, and provides a concise commentary on...

  2. Robust Satisficing Decision Making for Unmanned Aerial Vehicle Complex Missions under Severe Uncertainty

    PubMed Central

    Ji, Xiaoting; Niu, Yifeng; Shen, Lincheng

    2016-01-01

    This paper presents a robust satisficing decision-making method for Unmanned Aerial Vehicles (UAVs) executing complex missions in an uncertain environment. Motivated by info-gap decision theory, we formulate this problem as a novel robust satisficing optimization problem whose objective is to maximize robustness while satisfying the desired mission requirements. Specifically, a new info-gap based Markov Decision Process (IMDP) is constructed to abstract the uncertain UAV system and specify the complex mission requirements with Linear Temporal Logic (LTL). A robust satisficing policy is obtained to maximize the robustness to the uncertain IMDP while ensuring a desired probability of satisfying the LTL specifications. To this end, we propose a two-stage robust satisficing solution strategy which consists of the construction of a product IMDP and the generation of a robust satisficing policy. In the first stage, a product IMDP is constructed by combining the IMDP with an automaton representing the LTL specifications. In the second stage, an algorithm based on robust dynamic programming is proposed to generate a robust satisficing policy, and an associated robustness evaluation algorithm is presented to evaluate the robustness. Finally, through Monte Carlo simulation, the effectiveness of our algorithms is demonstrated on a UAV search mission under severe uncertainty, showing that the resulting policy maximizes robustness while reaching the desired performance level. Furthermore, comparison with other robust decision-making methods shows that our policy can tolerate higher uncertainty while still guaranteeing the desired performance level, indicating that the proposed method is more effective in real applications. PMID:27835670
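
    The robust dynamic-programming stage described above hinges on a worst-case choice of transition probabilities inside the IMDP's intervals. The abstract gives no code, so the function below (`worst_case_distribution`, an illustrative name) is only a minimal sketch of the standard greedy backup used for interval MDPs, not the authors' exact algorithm: sort successors by value and push the spare probability mass onto the lowest-value ones first.

```python
def worst_case_distribution(lowers, uppers, values):
    """Greedy pessimistic backup for an interval MDP: among all
    distributions p with lowers[i] <= p[i] <= uppers[i] and sum(p) == 1,
    return one minimizing the expected value sum(p[i] * values[i]).
    Assumes the intervals admit at least one valid distribution."""
    p = list(lowers)
    slack = 1.0 - sum(lowers)          # probability mass still to assign
    # Give the spare mass to the lowest-value successors first.
    for i in sorted(range(len(values)), key=lambda i: values[i]):
        extra = min(uppers[i] - lowers[i], slack)
        p[i] += extra
        slack -= extra
    return p

# Two successors, both with probability interval [0.2, 0.8]:
# the adversary shifts mass onto the low-value successor.
p = worst_case_distribution([0.2, 0.2], [0.8, 0.8], [0.0, 10.0])
# p is approximately [0.8, 0.2]; worst-case expected value 2.0
```

    Repeating this backup for every state-action pair of the product IMDP yields the robust value iteration that the two-stage strategy relies on.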

  3. Robust Satisficing Decision Making for Unmanned Aerial Vehicle Complex Missions under Severe Uncertainty.

    PubMed

    Ji, Xiaoting; Niu, Yifeng; Shen, Lincheng

    2016-01-01

    This paper presents a robust satisficing decision-making method for Unmanned Aerial Vehicles (UAVs) executing complex missions in an uncertain environment. Motivated by info-gap decision theory, we formulate this problem as a novel robust satisficing optimization problem whose objective is to maximize robustness while satisfying the desired mission requirements. Specifically, a new info-gap based Markov Decision Process (IMDP) is constructed to abstract the uncertain UAV system and specify the complex mission requirements with Linear Temporal Logic (LTL). A robust satisficing policy is obtained to maximize the robustness to the uncertain IMDP while ensuring a desired probability of satisfying the LTL specifications. To this end, we propose a two-stage robust satisficing solution strategy which consists of the construction of a product IMDP and the generation of a robust satisficing policy. In the first stage, a product IMDP is constructed by combining the IMDP with an automaton representing the LTL specifications. In the second stage, an algorithm based on robust dynamic programming is proposed to generate a robust satisficing policy, and an associated robustness evaluation algorithm is presented to evaluate the robustness. Finally, through Monte Carlo simulation, the effectiveness of our algorithms is demonstrated on a UAV search mission under severe uncertainty, showing that the resulting policy maximizes robustness while reaching the desired performance level. Furthermore, comparison with other robust decision-making methods shows that our policy can tolerate higher uncertainty while still guaranteeing the desired performance level, indicating that the proposed method is more effective in real applications.

  4. Advanced Platform Systems Technology study. Volume 4: Technology advancement program plan

    NASA Technical Reports Server (NTRS)

    1983-01-01

    An overview study of the major technology definition tasks and subtasks, along with their interfaces and interrelationships, is presented. Although not specifically indicated in the diagram, iterations were required at many steps to finalize the results. The development of the integrated technology advancement plan was initiated by using the results of the previous two tasks, i.e., the trade studies and the preliminary cost and schedule estimates for the selected technologies. Descriptions of the development of each viable technology advancement were drawn from the trade studies. Additionally, a logic flow diagram depicting the steps in developing each technology element was developed, along with descriptions of each of the major elements. Next, major elements of the logic flow diagrams were time-phased, which allowed the definition of a technology development schedule consistent with the space station program schedule where possible. Schedules show the major milestones, including the tests required as described in the logic flow diagrams.

  5. Installing logic-gate responses to a variety of biological substances in supramolecular hydrogel-enzyme hybrids.

    PubMed

    Ikeda, Masato; Tanida, Tatsuya; Yoshii, Tatsuyuki; Kurotani, Kazuya; Onogi, Shoji; Urayama, Kenji; Hamachi, Itaru

    2014-06-01

    Soft materials that exhibit stimuli-responsive behaviour under aqueous conditions (such as supramolecular hydrogels composed of self-assembled nanofibres) have many potential biological applications. However, designing a macroscopic response to structurally complex biochemical stimuli in these materials still remains a challenge. Here we show that redox-responsive peptide-based hydrogels have the ability to encapsulate enzymes and still retain their activities. Moreover, cooperative coupling of enzymatic reactions with the gel response enables us to construct unique stimuli-responsive soft materials capable of sensing a variety of disease-related biomarkers. The programmable gel-sol response (even to biological samples) is visible to the naked eye. Furthermore, we built Boolean logic gates (OR and AND) into the hydrogel-enzyme hybrid materials, which were able to sense simultaneously plural specific biochemicals and execute a controlled drug release in accordance with the logic operation. The intelligent soft materials that we have developed may prove valuable in future medical diagnostics or treatments.

  6. [Documenting a rehabilitation program using a logic model: an advantage to the assessment process].

    PubMed

    Poncet, Frédérique; Swaine, Bonnie; Pradat-Diehl, Pascale

    2017-03-06

    The cognitive and behavioral disorders that follow brain injury can result in severe limitations of activities and restrictions of participation. An interdisciplinary rehabilitation program was developed in physical medicine and rehabilitation at the Pitié-Salpêtrière Hospital, Paris, France. Clinicians believe this program decreases activity limitations and improves participation in patients. However, the program's effectiveness had never been assessed, and to assess it we first had to define and describe the program. Rehabilitation programs are holistic and thus complex, which makes them difficult to describe. Therefore, to facilitate the evaluation of complex programs, including those for rehabilitation, we illustrate the use of a theoretical logic model, as proposed by Champagne, through the documentation of a specific complex and interdisciplinary rehabilitation program. Through participatory/collaborative research, the rehabilitation program was analyzed using three "submodels" of the logic model of intervention: the causal model, the intervention model and the program theory model. This should facilitate the evaluation of programs, including those for rehabilitation.

  7. Installing logic-gate responses to a variety of biological substances in supramolecular hydrogel-enzyme hybrids

    NASA Astrophysics Data System (ADS)

    Ikeda, Masato; Tanida, Tatsuya; Yoshii, Tatsuyuki; Kurotani, Kazuya; Onogi, Shoji; Urayama, Kenji; Hamachi, Itaru

    2014-06-01

    Soft materials that exhibit stimuli-responsive behaviour under aqueous conditions (such as supramolecular hydrogels composed of self-assembled nanofibres) have many potential biological applications. However, designing a macroscopic response to structurally complex biochemical stimuli in these materials still remains a challenge. Here we show that redox-responsive peptide-based hydrogels have the ability to encapsulate enzymes and still retain their activities. Moreover, cooperative coupling of enzymatic reactions with the gel response enables us to construct unique stimuli-responsive soft materials capable of sensing a variety of disease-related biomarkers. The programmable gel-sol response (even to biological samples) is visible to the naked eye. Furthermore, we built Boolean logic gates (OR and AND) into the hydrogel-enzyme hybrid materials, which were able to sense simultaneously plural specific biochemicals and execute a controlled drug release in accordance with the logic operation. The intelligent soft materials that we have developed may prove valuable in future medical diagnostics or treatments.

  8. Digital Poetry: A Narrow Relation between Poetics and the Codes of the Computational Logic

    NASA Astrophysics Data System (ADS)

    Laurentiz, Silvia

    The project "Percorrendo Escrituras" (Walking Through Writings) has been developed at the ECA-USP Fine Arts Department. In summary, it studies different structures of digital information that share the same universe and generate a new aesthetic condition. The aim is to explore the expressive possibilities of the computer through its algorithmic functions and other specific properties. It is a practical, theoretical and interdisciplinary project in which the study of evolutionary programming languages, logic and mathematics leads us to poetic experimentation. The focus of this research is digital poetry, which starts from the poetics of permutational combinations and culminates in dynamic, complex systems that are autonomous, multi-user and interactive, built through agent generation, derivation, filtering and emergent patterns. This lecture will present artworks that use mechanisms introduced by cybernetics and the notion of system in digital poetry, demonstrating the close relationship between poetics and the codes of computational logic.

  9. Fuzzy-logic based strategy for validation of multiplex methods: example with qualitative GMO assays.

    PubMed

    Bellocchi, Gianni; Bertholet, Vincent; Hamels, Sandrine; Moens, W; Remacle, José; Van den Eede, Guy

    2010-02-01

    This paper illustrates the advantages that a fuzzy-based aggregation method could bring to the validation of a multiplex method for GMO detection (DualChip GMO kit, Eppendorf). Guidelines for the validation of chemical, biochemical, pharmaceutical and genetic methods have been developed, and ad hoc validation statistics are available and routinely used for in-house and inter-laboratory testing and decision-making. Fuzzy logic allows the information obtained by independent validation statistics to be summarised into one synthetic indicator of overall method performance. The microarray technology, introduced for the simultaneous identification of multiple GMOs, poses specific validation issues (patterns of performance for a variety of GMOs at different concentrations). A fuzzy-based indicator for overall evaluation is illustrated in this paper and applied to validation data for different genetically modified elements, and observations are drawn from the analytical results. The fuzzy-logic based rules were shown to improve the interpretation of results and to facilitate the overall evaluation of the multiplex method.
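
    The abstract does not give the aggregation rules, but the general idea of a fuzzy synthetic indicator over validation statistics can be sketched as follows. The function names, thresholds and weighted-average combination below are illustrative assumptions, not the DualChip validation protocol.

```python
def ramp(x, lo, hi):
    """Linear membership in 'acceptable': 0 at or below lo, 1 at or above hi."""
    return min(1.0, max(0.0, (x - lo) / (hi - lo)))

def overall_indicator(metrics, thresholds, weights=None):
    """Collapse several validation statistics into one synthetic
    score via a weighted average of their fuzzy memberships
    (weights and thresholds here are hypothetical choices)."""
    names = list(metrics)
    mus = {k: ramp(metrics[k], *thresholds[k]) for k in names}
    if weights is None:
        weights = {k: 1.0 for k in names}
    total = sum(weights[k] for k in names)
    return sum(weights[k] * mus[k] for k in names) / total

score = overall_indicator(
    metrics={"sensitivity": 0.98, "specificity": 0.875},
    thresholds={"sensitivity": (0.80, 0.95), "specificity": (0.80, 0.95)},
)
# sensitivity is fully acceptable (1.0), specificity halfway (0.5) -> score 0.75
```

    A single number like this makes it easy to compare performance patterns across many GMO elements and concentrations, which is the validation problem the paper addresses.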

  10. Fuzzy logic and neural networks in artificial intelligence and pattern recognition

    NASA Astrophysics Data System (ADS)

    Sanchez, Elie

    1991-10-01

    With the use of fuzzy logic techniques, neural computing can be integrated into symbolic reasoning to solve complex real-world problems. In fact, artificial neural networks, expert systems, and fuzzy logic systems, in the context of approximate reasoning, share common features and techniques. A model of Fuzzy Connectionist Expert System is introduced, in which an artificial neural network is designed to construct the knowledge base of an expert system from training examples (this model can also be used for the specification of rules in fuzzy logic control). Two types of weights are associated with the synaptic connections in an AND-OR structure: primary linguistic weights, interpreted as labels of fuzzy sets, and secondary numerical weights. Cell activation is computed through min-max fuzzy equations of the weights. Learning consists of finding the (numerical) weights and the network topology. This feedforward network is described and first illustrated in a biomedical application (medical diagnosis assistance from inflammatory-syndromes/protein profiles). Then it is shown how this methodology can be utilized for handwritten pattern recognition (characters play the role of diagnoses): in a fuzzy neuron describing a number, for example, the linguistic weights represent fuzzy sets on cross-detecting lines and the numerical weights reflect the importance (or weakness) of connections between cross-detecting lines and characters.
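
    The min-max activation in an AND-OR structure can be made concrete with a small sketch. `fuzzy_neuron` and `and_or_layer` are hypothetical names, and only the secondary numerical weights are modeled here, not the linguistic ones.

```python
def fuzzy_neuron(inputs, weights):
    """Max-min activation: OR (max) over inputs of the AND (min) of
    each input with its numerical weight; all values lie in [0, 1]."""
    return max(min(w, x) for w, x in zip(weights, inputs))

def and_or_layer(inputs, weight_rows):
    """AND-OR structure: each row ANDs (min) its weighted inputs,
    then the layer ORs (max) the row results."""
    return max(min(min(w, x) for w, x in zip(row, inputs))
               for row in weight_rows)

a = fuzzy_neuron([0.2, 0.9], [0.7, 0.5])                 # -> 0.5
b = and_or_layer([0.2, 0.9], [[0.7, 0.1], [0.3, 0.8]])   # -> 0.2
```

    Because activations are pure min/max compositions, a strong input can never raise the output above its own weight, which is what lets the weights act as soft rule strengths.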

  11. The Nature of Quantum Truth: Logic, Set Theory, & Mathematics in the Context of Quantum Theory

    NASA Astrophysics Data System (ADS)

    Frey, Kimberly

    The purpose of this dissertation is to construct a radically new type of mathematics whose underlying logic differs from the ordinary classical logic used in standard mathematics, and which we feel may be more natural for applications in quantum mechanics. Specifically, we begin by constructing a first order quantum logic, the development of which closely parallels that of ordinary (classical) first order logic --- the essential differences are in the nature of the logical axioms, which, in our construction, are motivated by quantum theory. After showing that the axiomatic first order logic we develop is sound and complete (with respect to a particular class of models), this logic is then used as a foundation on which to build (axiomatic) mathematical systems --- and we refer to the resulting new mathematics as "quantum mathematics." As noted above, the hope is that this form of mathematics is more natural than classical mathematics for the description of quantum systems, and will enable us to address some foundational aspects of quantum theory which are still troublesome --- e.g. the measurement problem --- as well as possibly even inform our thinking about quantum gravity. After constructing the underlying logic, we investigate properties of several mathematical systems --- e.g. axiom systems for abstract algebras, group theory, linear algebra, etc. --- in the presence of this quantum logic. In the process, we demonstrate that the resulting quantum mathematical systems have some strange, but very interesting features, which indicates a richness in the structure of mathematics that is classically inaccessible. Moreover, some of these features do indeed suggest possible applications to foundational questions in quantum theory. We continue our investigation of quantum mathematics by constructing an axiomatic quantum set theory, which we show satisfies certain desirable criteria. 
Ultimately, we hope that such a set theory will lead to a foundation for quantum mathematics in a sense which parallels the foundational role of classical set theory in classical mathematics. One immediate application of the quantum set theory we develop is to provide a foundation on which to construct quantum natural numbers, which are the quantum analog of the classical counting numbers. It turns out that in a special class of models, there exists a 1-1 correspondence between the quantum natural numbers and bounded observables in quantum theory whose eigenvalues are (ordinary) natural numbers. This 1-1 correspondence is remarkably satisfying, and not only gives us great confidence in our quantum set theory, but indicates the naturalness of such models for quantum theory itself. We go on to develop a Peano-like arithmetic for these new "numbers," as well as consider some of its consequences. Finally, we conclude by summarizing our results, and discussing directions for future work.

  12. Flight Guidance System Requirements Specification

    NASA Technical Reports Server (NTRS)

    Miller, Steven P.; Tribble, Alan C.; Carlson, Timothy M.; Danielson, Eric J.

    2003-01-01

    This report describes a requirements specification, written in the RSML-e language, for the mode logic of a Flight Guidance System of a typical regional jet aircraft. This model was created as one of the first steps in a five-year project sponsored by the NASA Langley Research Center, Rockwell Collins Inc., and the Critical Systems Research Group of the University of Minnesota to develop new methods and tools to improve the safety of avionics designs. This model will be used to demonstrate the application of a variety of methods and techniques, including safety analysis of system and subsystem requirements, verification of key properties using theorem provers and model checkers, identification of potential sources of mode confusion in system designs, partitioning of applications based on the criticality of system hazards, and autogeneration of avionics-quality code. While this model is representative of the mode logic of a typical regional jet aircraft, it does not describe an actual or planned product. Several aspects of a full Flight Guidance System, such as recovery from failed sensors, have been omitted, and no claims are made regarding the accuracy or completeness of this specification.

  13. An Investigation of Quantum Dot Super Lattice Use in Nonvolatile Memory and Transistors

    NASA Astrophysics Data System (ADS)

    Mirdha, P.; Parthasarathy, B.; Kondo, J.; Chan, P.-Y.; Heller, E.; Jain, F. C.

    2018-02-01

    Site-specific self-assembled colloidal quantum dots (QDs) deposit in two layers only on a p-type substrate to form a QD superlattice (QDSL). The QDSL structure has been integrated into the floating gate of a nonvolatile memory component and has demonstrated promising results in multi-bit storage, ease of fabrication, and memory retention. Additionally, multi-valued logic devices and circuits demonstrating ternary and quaternary logic have been created using QDSL structures. With the increasing use of site-specific self-assembled QDSLs, a fundamental understanding of the charge storage capability, self-assembly on specific surfaces, uniform distribution, and mini-band formation of silicon and germanium QDSLs is needed for successful implementation in devices. In this work, we investigate the differences in electron charge storage by building metal-oxide-semiconductor (MOS) capacitors and using capacitance-voltage measurements to quantify the storage capabilities. The self-assembly process and distribution density of the QDSL are characterized by atomic force microscopy (AFM) measurements on line samples. Additionally, we present a summary of the theoretical density of states in each of the QDSLs.

  14. Representations of temporal information in short-term memory: Are they modality-specific?

    PubMed

    Bratzke, Daniel; Quinn, Katrina R; Ulrich, Rolf; Bausenhart, Karin M

    2016-10-01

    Rattat and Picard (2012) reported that the coding of temporal information in short-term memory is modality-specific, that is, temporal information received via the visual (auditory) modality is stored as a visual (auditory) code. This conclusion was supported by modality-specific interference effects on visual and auditory duration discrimination, which were induced by secondary tasks (visual tracking or articulatory suppression), presented during a retention interval. The present study assessed the stability of these modality-specific interference effects. Our study did not replicate the selective interference pattern but rather indicated that articulatory suppression not only impairs short-term memory for auditory but also for visual durations. This result pattern supports a crossmodal or an abstract view of temporal encoding. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Topographical gradients of semantics and phonology revealed by temporal lobe stimulation.

    PubMed

    Miozzo, Michele; Williams, Alicia C; McKhann, Guy M; Hamberger, Marla J

    2017-02-01

    Word retrieval is a fundamental component of oral communication, and it is well established that this function is supported by left temporal cortex. Nevertheless, the specific temporal areas mediating word retrieval and the particular linguistic processes these regions support have not been well delineated. Toward this end, we analyzed over 1000 naming errors induced by left temporal cortical stimulation in epilepsy surgery patients. Errors were primarily semantic (lemon → "pear"), phonological (horn → "corn"), non-responses, and delayed responses (correct responses after a delay), and each error type appeared predominantly in a specific region: semantic errors in mid-middle temporal gyrus (TG), phonological errors and delayed responses in middle and posterior superior TG, and non-responses in anterior inferior TG. To the extent that semantic errors, phonological errors and delayed responses reflect disruptions in different processes, our results imply topographical specialization of semantic and phonological processing. Specifically, results revealed an inferior-to-superior gradient, with more superior regions associated with phonological processing. Further, errors were increasingly semantically related to targets toward posterior temporal cortex. We speculate that detailed semantic input is needed to support phonological retrieval, and thus, the specificity of semantic input increases progressively toward posterior temporal regions implicated in phonological processing. Hum Brain Mapp 38:688-703, 2017. © 2016 Wiley Periodicals, Inc.

  16. Algorithms and theory for the design and programming of industrial control systems materialized with PLC's

    NASA Astrophysics Data System (ADS)

    Montoya Villena, Rafael

    In keeping with its title, the general objective of the Thesis is to develop a clear, simple and systematic methodology for programming PLC-type devices. With this aim in mind, we use the following elements. Codification of all variable types: this element is very important, since it allows us to work with little information; the necessary rules are given to codify all types of phrases produced in industrial processes. An algorithm that describes process evolution, called the process D.F.: this is one of the most important contributions, since, together with the information codification, it allows us to represent the process evolution graphically and with any design theory used. Theory selection: evidently, some design method is needed to obtain the logic equations; for this particular case we use binodal theory, an ideal theory for wired technologies, since it yields highly reduced schemas for relatively simple automatisms, which means a minimum number of components. User program outline algorithm (D.F.P.): this is another necessary contribution, and perhaps the most important one, because the logic equations resulting from binodal theory are compatible with the process evolution if a wired technology is used, whether electric, electronic, pneumatic, etc. The performance characteristics of PLC devices, on the other hand, mean that the order of the program instructions determines whether the automatism is validated, as we have shown in articles and lectures at both national and international congresses. Therefore, we codify the information concerning the process to be automated, graphically represent its temporal evolution and, applying binodal theory and the D.F.P. (previously adapted), obtain logic equations compatible both with the process to be automated and with the device in which they will be implemented (a PLC in our case).

  17. A general approach for developing system-specific functions to score protein-ligand docked complexes using support vector inductive logic programming.

    PubMed

    Amini, Ata; Shrimpton, Paul J; Muggleton, Stephen H; Sternberg, Michael J E

    2007-12-01

    Despite the increased recent use of protein-ligand and protein-protein docking in the drug discovery process due to the increases in computational power, the difficulty of accurately ranking the binding affinities of a series of ligands or a series of proteins docked to a protein receptor remains largely unsolved. This problem is of major concern in lead optimization procedures and has led to the development of scoring functions tailored to rank the binding affinities of a series of ligands to a specific system. However, such methods can take a long time to develop and their transferability to other systems remains open to question. Here we demonstrate that, given a suitable amount of background information, a new approach using support vector inductive logic programming (SVILP) can be used to produce system-specific scoring functions. Inductive logic programming (ILP) learns logic-based rules for a given dataset that can be used to describe properties of each member of the set in a qualitative manner. By combining ILP with support vector machine regression, a quantitative set of rules can be obtained. SVILP has previously been used in a biological context to examine datasets containing a series of singular molecular structures and properties. Here we describe the use of SVILP to produce binding affinity predictions of a series of ligands to a particular protein. We also for the first time examine the applicability of SVILP techniques to datasets consisting of protein-ligand complexes. Our results show that SVILP performs comparably with other state-of-the-art methods on five protein-ligand systems as judged by similar cross-validated squares of their correlation coefficients. A McNemar test comparing SVILP to CoMFA and CoMSIA across the five systems indicates our method to be significantly better on one occasion.
The ability to graphically display and understand the SVILP-produced rules is demonstrated, and this feature of ILP can be used to derive hypotheses for future ligand design in lead optimization procedures. The approach can readily be extended to evaluate the binding affinities of a series of protein-protein complexes. (c) 2007 Wiley-Liss, Inc.

  18. An adaptive staircase procedure for the E-Prime programming environment.

    PubMed

    Hairston, W David; Maldjian, Joseph A

    2009-01-01

    Many studies need to determine a subject's threshold for a given task. This can be achieved efficiently using an adaptive staircase procedure. While the logic and algorithms for staircases have been well established, the few pre-programmed routines currently available to researchers require at least moderate programming experience to integrate into new paradigms and experimental settings. Here, we describe a freely distributed routine developed for the E-Prime programming environment that can be easily integrated into any experimental protocol with only a basic understanding of E-Prime. An example experiment (visual temporal-order-judgment task) where subjects report the order of occurrence of two circles illustrates the behavior and consistency of the routine.
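
    The routine itself is distributed for E-Prime, but the underlying staircase logic can be sketched in a few lines. The 2-down-1-up rule and the parameters below are a common textbook choice (converging near the 70.7%-correct point), not necessarily the exact rule the authors implemented.

```python
class Staircase:
    """Minimal 2-down-1-up adaptive staircase: the level goes down
    after two consecutive correct responses and up after any error."""

    def __init__(self, start, step):
        self.level = start
        self.step = step
        self.streak = 0        # consecutive correct responses
        self.direction = 0     # +1 going up, -1 going down, 0 at start
        self.reversals = []    # levels at which the direction flipped

    def update(self, correct):
        if correct:
            self.streak += 1
            if self.streak == 2:
                self._move(-1)
        else:
            self._move(+1)

    def _move(self, d):
        if self.direction and d != self.direction:
            self.reversals.append(self.level)
        self.direction = d
        self.level += d * self.step
        self.streak = 0

    def threshold(self):
        """Estimate the threshold as the mean of the last reversals."""
        tail = self.reversals[-6:]
        return sum(tail) / len(tail)

# Simulate a deterministic observer whose true threshold is 4.3:
# the staircase settles onto the grid points straddling it (4 and 5).
sc = Staircase(start=10.0, step=1.0)
for _ in range(60):
    sc.update(correct=(sc.level >= 4.3))
# sc.threshold() comes out at 4.5, between the two straddling levels
```

    Real experiments would replace the deterministic observer with the subject's responses and typically stop after a fixed number of reversals rather than a fixed number of trials.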

  19. Efficient Translation of LTL Formulae into Büchi Automata

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Lerda, Flavio

    2001-01-01

    Model checking is a fully automated technique for checking that a system satisfies a set of required properties. With explicit-state model checkers, properties are typically defined in linear-time temporal logic (LTL) and are translated into Büchi automata in order to be checked. This report presents how we have combined and improved existing techniques to obtain an efficient LTL-to-Büchi-automata translator. In particular, we optimize the core of existing tableau-based approaches to generate significantly smaller automata. Our approach has been implemented and is being released as part of the Java PathFinder software (JPF), an explicit-state model checker under development at the NASA Ames Research Center.
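
    The report concerns translating LTL into Büchi automata over infinite behaviours. As a much simpler illustration of what an LTL formula asserts, the sketch below evaluates formulas over a *finite* trace (finite-trace semantics: `X` is false at the last position, and `G`/`F`/`U` range over the remaining suffix). It is not the tableau-based construction the report describes.

```python
def holds(formula, trace, i=0):
    """Evaluate an LTL formula at position i of a finite trace,
    where the trace is a list of sets of atomic propositions."""
    op = formula[0]
    if op == 'ap':
        return formula[1] in trace[i]
    if op == 'not':
        return not holds(formula[1], trace, i)
    if op == 'and':
        return holds(formula[1], trace, i) and holds(formula[2], trace, i)
    if op == 'or':
        return holds(formula[1], trace, i) or holds(formula[2], trace, i)
    if op == 'X':   # next: false at the last position of a finite trace
        return i + 1 < len(trace) and holds(formula[1], trace, i + 1)
    if op == 'F':   # eventually
        return any(holds(formula[1], trace, k) for k in range(i, len(trace)))
    if op == 'G':   # always
        return all(holds(formula[1], trace, k) for k in range(i, len(trace)))
    if op == 'U':   # until: phi holds until some position where psi holds
        return any(holds(formula[2], trace, k) and
                   all(holds(formula[1], trace, j) for j in range(i, k))
                   for k in range(i, len(trace)))
    raise ValueError("unknown operator: %r" % op)

trace = [{'p'}, {'p'}, {'q'}]
# "p until q" holds on this trace; "always p" does not (last state lacks p).
```

    A Büchi-automaton translator such as the one in the report generalizes exactly these semantics to infinite words, where `G` and `U` require acceptance conditions rather than a simple suffix scan.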

  20. Causation and Validation of Nursing Diagnoses: A Middle Range Theory.

    PubMed

    de Oliveira Lopes, Marcos Venícios; da Silva, Viviane Martins; Herdman, T Heather

    2017-01-01

    PURPOSE: To describe a predictive middle range theory (MRT) that provides a process for the validation and incorporation of nursing diagnoses in clinical practice. METHODS: Literature review. FINDINGS: The MRT includes definitions, a pictorial scheme, propositions, causal relationships, and translation to nursing practice. CONCLUSIONS: The MRT can be a useful alternative for education, research, and the translation of this knowledge into practice. This MRT can assist clinicians in understanding clinical reasoning, based on temporal logic and spectral interaction among the elements of nursing classifications. In turn, this understanding will improve the use and accuracy of nursing diagnosis, a critical component of the nursing process that forms a basis for nursing practice standards worldwide. © 2015 NANDA International, Inc.

  1. WITH: a system to write clinical trials using XML and RDBMS.

    PubMed Central

    Fazi, Paola; Luzi, Daniela; Manco, Mariarosaria; Ricci, Fabrizio L.; Toffoli, Giovanni; Vignetti, Marco

    2002-01-01

    The paper illustrates the WITH (Write on Internet clinical Trials in Haematology) system, which supports the writing of a clinical trial (CT) document. The requirements of this system were defined by analysing the writing process of a CT and then modelling the content of its sections together with their logical and temporal relationships. The WITH system allows: a) editing the document text; b) re-using the text; and c) facilitating cooperation and collaborative writing. It is based on the XML mark-up language and on an RDBMS. This choice guarantees: a) process standardisation; b) process management; c) efficient delivery of information-based tasks; and d) an explicit focus on process design. PMID:12463823

  2. MicroRNA Functions in Osteogenesis and Dysfunctions in Osteoporosis

    PubMed Central

    van Wijnen, Andre J.; van de Peppel, Jeroen; van Leeuwen, Johannes P.; Lian, Jane B.; Stein, Gary S.; Westendorf, Jennifer J.; Oursler, Merry-Jo; Sampen, Hee-Jeong Im; Taipaleenmaki, Hanna; Hesse, Eric; Riester, Scott; Kakar, Sanjeev

    2013-01-01

    MicroRNAs (miRNAs) are critical post-transcriptional regulators of gene expression that control osteoblast mediated bone formation and osteoclast-related bone remodelling. Deregulation of miRNA mediated mechanisms is emerging as an important pathological factor in bone degeneration (e.g., osteoporosis) and other bone-related diseases. MiRNAs are intriguing regulatory molecules that are networked with cell signaling pathways and intricate transcriptional programs through ingenuous circuits with remarkably simple logic. This overview examines key principles by which miRNAs control differentiation of osteoblasts as they evolve from mesenchymal stromal cells during osteogenesis, or of osteoclasts as they originate from monocytic precursors in the hematopoietic lineage during osteoclastogenesis. Of particular note are miRNAs that are temporally up-regulated during osteoblastogenesis (e.g., miR-218) or osteoclastogenesis (e.g., miR-148a). Each miRNA stimulates differentiation by suppressing inhibitory signalling pathways (‘double-negative’ regulation). The excitement surrounding miRNAs in bone biology stems from the prominent effects that individual miRNAs can have on biological transitions during differentiation of skeletal cells and correlations of miRNA dysfunction with bone diseases. MiRNAs have significant clinical potential which is reflected by their versatility as disease-specific biomarkers and their promise as therapeutic agents to ameliorate or reverse bone tissue degeneration. PMID:23605904

  3. A Logic Model for Community Engagement within the CTSA Consortium: Can We Measure What We Model?

    PubMed Central

    Eder, Milton Mickey; Carter-Edwards, Lori; Hurd, Thelma C.; Rumala, Bernice B.; Wallerstein, Nina

    2013-01-01

    The Clinical Translation Science Award (CTSA) initiative calls upon academic health centers to engage communities around a clinical research relationship measured ultimately in terms of public health. Among a few initiatives involving university accountability for advancing public interests, a small CTSA workgroup devised a community engagement (CE) logic model that organizes common activities within a university-community infrastructure to facilitate community engagement in research. While the model focuses on the range of institutional CE inputs, it purposefully does not include an approach for assessing how community engagement influences research implementation and outcomes. Rather, with communities and individuals beginning to transition into new research roles, this article emphasizes studying community engagement through specific relationship types and assessing how expanded research teams contribute to the full spectrum of translational science. The authors propose a typology consisting of three relationship types—engagement, collaboration and shared leadership—to provide a foundation for investigating community–academic contributions to the new CTSA research paradigm. The typology shifts attention from specific community–academic activities and, instead, encourages analyses focused on measuring the strength of relationships through variables like synergy and trust. The collaborative study of CE relationships will inform an understanding of CTSA infrastructure development in support of translational research and its goal, which is expressed in the logic model: better science, better answers, better population health. PMID:23752038

  4. The role of multisensory interplay in enabling temporal expectations.

    PubMed

    Ball, Felix; Michels, Lara E; Thiele, Carsten; Noesselt, Toemme

    2018-01-01

    Temporal regularities can guide our attention to focus on a particular moment in time and to be especially vigilant just then. Previous research provided evidence for the influence of temporal expectation on perceptual processing in unisensory auditory, visual, and tactile contexts. However, in real life we are often exposed to a complex and continuous stream of multisensory events. Here we tested - in a series of experiments - whether temporal expectations can enhance perception in multisensory contexts and whether this enhancement differs from enhancements in unisensory contexts. Our discrimination paradigm contained near-threshold targets (subject-specific 75% discrimination accuracy) embedded in a sequence of distractors. The likelihood of target occurrence (early or late) was manipulated block-wise. Furthermore, we tested whether spatial and modality-specific target uncertainty (i.e. predictable vs. unpredictable target position or modality) would affect temporal expectation (TE) measured with perceptual sensitivity (d′) and response times (RT). In all our experiments, hidden temporal regularities improved performance for expected multisensory targets. Moreover, multisensory performance was unaffected by spatial and modality-specific uncertainty, whereas unisensory TE effects on d′ but not RT were modulated by spatial and modality-specific uncertainty. Additionally, the size of the temporal expectation effect, i.e. the increase in perceptual sensitivity and decrease of RT, scaled linearly with the likelihood of expected targets. Finally, temporal expectation effects were unaffected by varying target position within the stream. Together, our results strongly suggest that participants quickly adapt to novel temporal contexts, that they benefit from multisensory (relative to unisensory) stimulation and that multisensory benefits are maximal if the stimulus-driven uncertainty is highest. We propose that enhanced informational content (i.e. multisensory stimulation) enables the robust extraction of temporal regularities which in turn boost (uni-)sensory representations. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Annotating spatio-temporal datasets for meaningful analysis in the Web

    NASA Astrophysics Data System (ADS)

    Stasch, Christoph; Pebesma, Edzer; Scheider, Simon

    2014-05-01

    More and more environmental datasets that vary in space and time are available in the Web. This comes along with the advantage that the data can be used for purposes other than originally foreseen, but also with the danger that users may apply inappropriate analysis procedures due to a lack of important assumptions made during the data collection process. In order to guide towards a meaningful (statistical) analysis of spatio-temporal datasets available in the Web, we have developed a Higher-Order-Logic formalism that captures some relevant assumptions in our previous work [1]. It allows one to prove meaningful spatial prediction and aggregation in a semi-automated fashion. In this poster presentation, we will present a concept for annotating spatio-temporal datasets available in the Web with concepts defined in our formalism. To this end, we have defined a subset of the formalism as a Web Ontology Language (OWL) pattern. It captures the distinction between the different spatio-temporal variable types, i.e. point patterns, fields, lattices and trajectories, which in turn determine whether a particular dataset can be interpolated or aggregated in a meaningful way using a certain procedure. The actual annotations that link spatio-temporal datasets with the concepts in the ontology pattern are provided as Linked Data. In order to allow data producers to add the annotations to their datasets, we have implemented a Web portal that uses a triple store at the backend to store the annotations and to make them available in the Linked Data cloud. Furthermore, we have implemented functions in the statistical environment R to retrieve the RDF annotations and, based on these annotations, to support a stronger typing of spatio-temporal datatypes guiding towards a meaningful analysis in R. [1] Stasch, C., Scheider, S., Pebesma, E., Kuhn, W. (2014): "Meaningful spatial prediction and aggregation", Environmental Modelling & Software, 51, 149-165.

  6. PERCLOS: A Valid Psychophysiological Measure of Alertness As Assessed by Psychomotor Vigilance

    DOT National Transportation Integrated Search

    2002-04-01

    The Logical Architecture is based on a Computer Aided Systems Engineering (CASE) model of the requirements for the flow of data and control through the various functions included in Intelligent Transportation Systems (ITS). Process Specifications pro...

  7. [Urgenturia, a logical improvement to better characterize a key irritative symptom].

    PubMed

    Grise, Philippe; Caremel, Romain; Cherif, Mohamed; Sibert, Louis

    2007-09-01

    Multiple medical terms are used in the French medical literature to characterize urgency. However, it is a cornerstone symptom of bladder overactivity, distinct from the normal physiological sensation. Specific tools have been designed to measure urgency, but there is an essential need for a specific and clear medical word, consistent with other medical terms for urinary signs and symptoms. This leads us to propose 'urgenturia' as the specific medical term for urgency.

  8. An effective XML based name mapping mechanism within StoRM

    NASA Astrophysics Data System (ADS)

    Corso, E.; Forti, A.; Ghiselli, A.; Magnoni, L.; Zappi, R.

    2008-07-01

    In a Grid environment the naming capability allows users to refer to specific data resources in a physical storage system using a high-level logical identifier. This logical identifier is typically organized in a file-system-like structure, a hierarchical tree of names. Storage Resource Manager (SRM) services map the logical identifier to the physical location of data by evaluating a set of parameters such as the desired quality of service and the VOMS attributes specified in the request. StoRM is an SRM service developed by INFN and ICTP-EGRID to manage files and space on standard POSIX and high-performing parallel and cluster file systems. An upcoming requirement in the Grid data scenario is the orthogonality of the logical name and the physical location of data, in order to refer, with the same identifier, to different copies of data archived in various storage areas with different qualities of service. The mapping mechanism proposed in StoRM is based on an XML document that represents the different storage components managed by the service, the storage areas defined by the site administrator, the quality of service they provide, and the Virtual Organizations that want to use the storage areas. An appropriate directory tree is realized in each storage component reflecting the XML schema. In this scenario StoRM is able to identify the physical location of requested data by evaluating the logical identifier and the specified attributes following the XML schema, without querying any database service. This paper presents the namespace schema defined, the different entities represented and the technical details of the StoRM implementation.
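
    The logical-to-physical mapping described above can be pictured with a small sketch. Everything below (element names, storage-area attributes, the resolve helper) is a hypothetical illustration of resolving a logical identifier against an XML namespace document by quality of service; it is not the actual StoRM schema.

```python
# Hypothetical namespace document: element and attribute names are
# illustrative only, not the real StoRM schema.
import xml.etree.ElementTree as ET

NAMESPACE_XML = """
<namespace>
  <storage-area name="atlas-tape" root="/gpfs/tape/atlas" quality="custodial"/>
  <storage-area name="atlas-disk" root="/gpfs/disk/atlas" quality="replica"/>
</namespace>
"""

def resolve(logical_name, quality):
    """Map a logical file name to a physical path by matching the
    requested quality of service against the namespace document,
    without consulting any database service."""
    root = ET.fromstring(NAMESPACE_XML)
    for area in root.findall("storage-area"):
        if area.get("quality") == quality:
            return area.get("root") + logical_name
    raise LookupError("no storage area offers quality %r" % quality)

print(resolve("/data/run42.root", "replica"))
# -> /gpfs/disk/atlas/data/run42.root
```

    The same logical name resolves to a different physical copy when a different quality of service is requested, which is the orthogonality property the abstract emphasizes.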

  9. Individual differences and specificity of prefrontal gamma frequency-tACS on fluid intelligence capabilities.

    PubMed

    Santarnecchi, E; Muller, T; Rossi, S; Sarkar, A; Polizzotto, N R; Rossi, A; Cohen Kadosh, R

    2016-02-01

    Emerging evidence suggests that transcranial alternating current stimulation (tACS) is an effective, frequency-specific modulator of endogenous brain oscillations, with the potential to alter cognitive performance. Here, we show that a reduction in response latencies to solve complex logic problems indexing fluid intelligence is obtained through 40 Hz-tACS (gamma band) applied to the prefrontal cortex. This improvement in human performance depends on individual ability, with slower performers at baseline receiving greater benefits. The effect could not be explained by regression to the mean, and showed task and frequency specificity: it was observed neither for trials not involving logical reasoning, nor with the application of low-frequency 5 Hz-tACS (theta band) or non-periodic high-frequency random noise stimulation (101-640 Hz). Moreover, performance in a spatial working memory task was not affected by brain stimulation, excluding possible effects on fluid intelligence enhancement through an increase in memory performance. We suggest that such high-level cognitive functions are dissociable by frequency-specific neuromodulatory effects, possibly related to entrainment of specific brain rhythms. We conclude that individual differences in cognitive abilities, due to acquired or developmental origins, could be reduced during frequency-specific tACS, a finding that should be taken into account for future individual cognitive rehabilitation studies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Differences in visual vs. verbal memory impairments as a result of focal temporal lobe damage in patients with traumatic brain injury.

    PubMed

    Ariza, Mar; Pueyo, Roser; Junqué, Carme; Mataró, María; Poca, María Antonia; Mena, Maria Pau; Sahuquillo, Juan

    2006-09-01

    The aim of the present study was to determine whether the type of lesion in a sample of patients with moderate and severe traumatic brain injury (TBI) was related to material-specific memory impairment. Fifty-nine patients with TBI were classified into three groups according to whether the site of the lesion was right temporal, left temporal or diffuse. Six months post-injury, visual (Warrington's Facial Recognition Memory Test and Rey's Complex Figure Test) and verbal (Rey's Auditory Verbal Learning Test) memory were assessed. Visual memory deficits assessed by facial memory were associated with right temporal lobe lesions, whereas verbal memory performance assessed with a list of words was related to left temporal lobe lesions. The group with diffuse injury showed both verbal and visual memory impairment. These results suggest a material-specific memory impairment in moderate and severe TBI after focal temporal lesions and a non-specific memory impairment after diffuse damage.

  11. Temporal efficiency evaluation and small-worldness characterization in temporal networks

    PubMed Central

    Dai, Zhongxiang; Chen, Yu; Li, Junhua; Fam, Johnson; Bezerianos, Anastasios; Sun, Yu

    2016-01-01

    Numerous real-world systems can be modeled as networks. To date, most network studies have been conducted assuming stationary network characteristics. Many systems, however, undergo topological changes over time. Temporal networks, which incorporate time into conventional network models, are therefore more accurate representations of such dynamic systems. Here, we introduce a novel generalized analytical framework for temporal networks, which enables 1) robust evaluation of the efficiency of temporal information exchange using two new network metrics and 2) quantitative inspection of temporal small-worldness. Specifically, we define new robust temporal network efficiency measures by incorporating the time dependency of temporal distance. We propose a temporal regular network model and, based on this model together with the redefined temporal efficiency metrics and widely used temporal random network models, we introduce a quantitative approach for identifying temporal small-world architectures (featuring high temporal network efficiency both globally and locally). In addition, within this framework, we can uncover network-specific dynamic structures. Applications to brain networks, international trade networks, and social networks reveal prominent temporal small-world properties with distinct dynamic network structures. We believe that the framework can provide further insight into dynamic changes in the network topology of various real-world systems and significantly promote research on temporal networks. PMID:27682314
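
    The notion of temporal distance underlying such efficiency measures can be illustrated with a minimal earliest-arrival computation over time-stamped contacts. This is a generic sketch, not the authors' redefined metrics; the contact-list format and the earliest_arrival helper are assumptions made for illustration.

```python
# Generic sketch of temporal distance as earliest-arrival time; this is
# illustrative, not the paper's actual efficiency metrics.
def earliest_arrival(contacts, source, start=0):
    """contacts: list of (t, u, v) meaning nodes u and v are in contact
    at time t (undirected, instantaneous). Returns the earliest time
    each reachable node can be reached from `source` after `start`."""
    arrival = {source: start}
    for t, u, v in sorted(contacts):          # process contacts in time order
        for a, b in ((u, v), (v, u)):
            # information can pass a -> b if it reached a no later than t
            if arrival.get(a, float("inf")) <= t and t < arrival.get(b, float("inf")):
                arrival[b] = t
    return arrival

contacts = [(1, "A", "B"), (2, "B", "C"), (3, "A", "C")]
print(earliest_arrival(contacts, "A"))   # {'A': 0, 'B': 1, 'C': 2}
```

    A global temporal efficiency in the spirit of the abstract would then average reciprocals of these temporal distances over node pairs, making the measure robust to unreachable pairs.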

  12. Temporal efficiency evaluation and small-worldness characterization in temporal networks

    NASA Astrophysics Data System (ADS)

    Dai, Zhongxiang; Chen, Yu; Li, Junhua; Fam, Johnson; Bezerianos, Anastasios; Sun, Yu

    2016-09-01

    Numerous real-world systems can be modeled as networks. To date, most network studies have been conducted assuming stationary network characteristics. Many systems, however, undergo topological changes over time. Temporal networks, which incorporate time into conventional network models, are therefore more accurate representations of such dynamic systems. Here, we introduce a novel generalized analytical framework for temporal networks, which enables 1) robust evaluation of the efficiency of temporal information exchange using two new network metrics and 2) quantitative inspection of temporal small-worldness. Specifically, we define new robust temporal network efficiency measures by incorporating the time dependency of temporal distance. We propose a temporal regular network model and, based on this model together with the redefined temporal efficiency metrics and widely used temporal random network models, we introduce a quantitative approach for identifying temporal small-world architectures (featuring high temporal network efficiency both globally and locally). In addition, within this framework, we can uncover network-specific dynamic structures. Applications to brain networks, international trade networks, and social networks reveal prominent temporal small-world properties with distinct dynamic network structures. We believe that the framework can provide further insight into dynamic changes in the network topology of various real-world systems and significantly promote research on temporal networks.

  13. Compiler writing system detail design specification. Volume 2: Component specification

    NASA Technical Reports Server (NTRS)

    Arthur, W. J.

    1974-01-01

    The logic modules and data structures composing the Meta-translator module are described. This module is responsible for the actual generation of the executable language compiler as a function of the input Meta-language. Machine definitions are also processed and are placed as encoded data on the compiler library data file. The transformation of intermediate language into target-language object text is described.

  14. A geospatial database model for the management of remote sensing datasets at multiple spectral, spatial, and temporal scales

    NASA Astrophysics Data System (ADS)

    Ifimov, Gabriela; Pigeau, Grace; Arroyo-Mora, J. Pablo; Soffer, Raymond; Leblanc, George

    2017-10-01

    In this study the development and implementation of a geospatial database model for the management of multiscale datasets encompassing airborne imagery and associated metadata is presented. To develop the multi-source geospatial database we used a Relational Database Management System (RDBMS) on a Structured Query Language (SQL) server, which was then integrated into ArcGIS and implemented as a geodatabase. The acquired datasets were compiled, standardized, and integrated into the RDBMS, where logical associations between different types of information (e.g. location, date, and instrument) were established. Airborne data, at different processing levels (digital numbers through geocorrected reflectance), were implemented in the geospatial database, where the datasets are linked spatially and temporally. An example dataset consisting of airborne hyperspectral imagery, collected for inter- and intra-annual vegetation characterization and detection of potential hydrocarbon seepage events over pipeline areas, is presented. Our work provides a model for the management of airborne imagery, which is a challenging aspect of data management in remote sensing, especially when large volumes of data are collected.

  15. [Spatial and temporal evolution of the ecological environment and economy coordinated development in Hebei Province, China].

    PubMed

    Kong, Wei; Ren, Liang; Wang, Shu Jia; Liu, Yu Feng

    2016-09-01

    Based on the constructed evaluation index system of ecological environment and economy coordinated development in Hebei Province, and by introducing the Coupling Degree Model, the paper estimated the ecological environment comprehensive index, the economic comprehensive index and the coupling degree of ecological environment and economy coordinated development of Hebei Province from 2000 to 2014, and of 11 cities in 4 years (2000, 2006, 2010, 2014). The results showed that during the study period, the level of the coordinated development of the ecological environment and economy in Hebei Province had been increasing, from the brink of recession to well-coordinated development, passing through 3 evident stages. The coordinating degree of ecological environment and economy of the 11 cities increased year by year, and presented significant differences in spatial distribution. Analysis of the spatial and temporal evolution mechanism of the ecological environment and economy coordinated development in Hebei Province showed that policy, economy, industry and location were the key contributing factors; accordingly, suggestions on the further coordinated development of ecological environment and economy in Hebei Province were proposed.

  16. Complex Event Recognition Architecture

    NASA Technical Reports Server (NTRS)

    Fitzgerald, William A.; Firby, R. James

    2009-01-01

    Complex Event Recognition Architecture (CERA) is the name of a computational architecture, and software that implements the architecture, for recognizing complex event patterns that may be spread across multiple streams of input data. One of the main components of CERA is an intuitive event pattern language that simplifies what would otherwise be the complex, difficult tasks of creating logical descriptions of combinations of temporal events and defining rules for combining information from different sources over time. In this language, recognition patterns are defined in simple, declarative statements that combine point events from given input streams with those from other streams, using conjunction, disjunction, and negation. Patterns can be built on one another recursively to describe very rich, temporally extended combinations of events. Thereafter, a run-time matching algorithm in CERA efficiently matches these patterns against input data and signals when patterns are recognized. CERA can be used to monitor complex systems and to signal operators or initiate corrective actions when anomalous conditions are recognized. CERA can be run as a stand-alone monitoring system, or it can be integrated into a larger system to automatically trigger responses to changing environments or problematic situations.
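
    The combination operators described above (conjunction, disjunction, and negation over events from input streams) can be sketched with tiny combinators. The function names and the set-of-observed-events representation below are illustrative assumptions; CERA's actual pattern language and run-time matching algorithm are far richer (point events, recursive patterns, temporal extent).

```python
# Minimal sketch of declarative event-pattern combinators in the spirit
# of CERA's pattern language; this is NOT the actual CERA syntax.
def event(name):
    """Pattern that matches when the named event has been observed."""
    return lambda seen: name in seen

def conj(*patterns):           # conjunction: all sub-patterns match
    return lambda seen: all(p(seen) for p in patterns)

def disj(*patterns):           # disjunction: any sub-pattern matches
    return lambda seen: any(p(seen) for p in patterns)

def neg(pattern):              # negation: sub-pattern must not match
    return lambda seen: not pattern(seen)

# "valve open AND (pressure high OR temp high) AND NOT shutdown"
anomaly = conj(event("valve_open"),
               disj(event("pressure_high"), event("temp_high")),
               neg(event("shutdown")))

print(anomaly({"valve_open", "temp_high"}))   # True
```

    Because patterns are just values, they compose recursively, mirroring the abstract's point that rich, temporally extended combinations are built from simple declarative statements.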

  17. Constraint-Based Abstract Semantics for Temporal Logic: A Direct Approach to Design and Implementation

    NASA Astrophysics Data System (ADS)

    Banda, Gourinath; Gallagher, John P.

    Abstract interpretation provides a practical approach to verifying properties of infinite-state systems. We apply the framework of abstract interpretation to derive an abstract semantic function for the modal μ-calculus, which is the basis for abstract model checking. The abstract semantic function is constructed directly from the standard concrete semantics together with a Galois connection between the concrete state-space and an abstract domain. There is no need for mixed or modal transition systems to abstract arbitrary temporal properties, as in previous work in the area of abstract model checking. Using the modal μ-calculus to implement CTL, the abstract semantics gives an over-approximation of the set of states in which an arbitrary CTL formula holds. Then we show that this leads directly to an effective implementation of an abstract model checking algorithm for CTL using abstract domains based on linear constraints. The implementation of the abstract semantic function makes use of an SMT solver. We describe an implemented system for proving properties of linear hybrid automata and give some experimental results.
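
    The fixpoint computation at the heart of such model checking can be illustrated on a toy finite abstract system: computing the set of abstract states satisfying the CTL property EF goal ("goal is reachable") as a least fixpoint. This sketch assumes an explicit finite state set and transition map, unlike the paper's constraint-based abstract domains and SMT-backed implementation.

```python
# Toy least-fixpoint computation of the CTL property EF(goal) over an
# explicit finite abstract transition system (illustrative only; the
# paper works over linear-constraint abstract domains with an SMT solver).
def ef(states, trans, goal):
    """Return the set of states satisfying EF goal: states from which
    some path reaches a goal state."""
    sat = set(goal)
    changed = True
    while changed:                      # iterate to the least fixpoint
        changed = False
        for s in states:
            if s not in sat and any(t in sat for t in trans.get(s, ())):
                sat.add(s)
                changed = True
    return sat

states = {"a", "b", "c", "d"}
trans = {"a": ["b"], "b": ["c"], "d": ["d"]}
print(sorted(ef(states, trans, {"c"})))   # ['a', 'b', 'c']
```

    On an over-approximating abstraction, the set returned this way contains every concrete state satisfying the formula, which is what makes the abstract check sound.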

  18. Impact of cloud timing on surface temperature and related hydroclimatic dynamics

    NASA Astrophysics Data System (ADS)

    Porporato, A. M.; Yin, J.

    2015-12-01

    Cloud feedbacks have long been identified as one of the largest sources of uncertainty in climate change predictions. Differences in the spatial distribution of clouds and the related impact on surface temperature and climate dynamics have been recently emphasized in quasi-equilibrium General Circulation Models (GCM). However, much less attention has been paid to the temporal variation of cloud presence and thickness. Clouds in fact shade solar radiation during the daytime, but also act as a greenhouse gas, reducing the emission of longwave radiation to outer space at any time of day. Thus it is logical to expect that even small differences in the timing and thickness of clouds could result in very different predictions in GCMs. In this study, these two effects of cloud dynamics are analyzed by tracking cloud impacts on longwave and shortwave radiation in a minimalist transient thermal balance model of the land surface. The marked changes in surface temperature due to alterations in the timing of cloud onset demonstrate that capturing the temporal variation of clouds at the sub-daily scale should be a priority in cloud parameterization schemes in GCMs.

  19. [Memory peculiarities in patients with schizophrenia and their first-degree relatives].

    PubMed

    Savina, T D; Orlova, V A; Shcherbakova, N P; Korsakova, N K; Malova, Iu A; Efanova, N N; Ganisheva, T K; Nikolaev, R A

    2008-01-01

    Eighty-four families with schizophrenia were studied: 84 patients (probands) and 73 of their unaffected first-degree relatives, as well as 37 normal controls and their relatives, using pathopsychological (pictogram) and Luria's neuropsychological tests. The most prominent abnormalities in both patients and relatives concerned global characteristics of auditory-speech memory, predominantly related to left subcortical and left temporal regions. Abnormalities of immediate recall of a short logic story (SLS) were connected with dysfunction of the same brain regions. Less prominent delayed-recall abnormalities of the SLS were revealed only in patients and were connected with left subcortical, left subcortical-frontal and left subcortical-temporal zones. This abnormality was absent in relatives and age-matched controls. The span of mediated retention was decreased in patients and, to a lesser degree, in relatives. A quantitative psychological analysis demonstrated a disintegration ("schizys") between semantic conception and image memory structure in patients and, to a lesser degree, in relatives. The data obtained show primary memory abnormalities in families with schizophrenia related to impairment of the information-decoding process in subcortical structures, with left-side dysfunction of brain structures being predominantly typical.

  20. Minimum energy dissipation required for a logically irreversible operation

    NASA Astrophysics Data System (ADS)

    Takeuchi, Naoki; Yoshikawa, Nobuyuki

    2018-01-01

    According to Landauer's principle, the minimum heat emission required for computing is linked to logical entropy, or logical reversibility. The validity of Landauer's principle has been investigated for several decades and was finally demonstrated in recent experiments by showing that the minimum heat emission is associated with the reduction in logical entropy during a logically irreversible operation. Although the relationship between minimum heat emission and logical reversibility is being revealed, it is not clear how much free energy must be dissipated for a logically irreversible operation. In the present study, in order to reveal the connection between logical reversibility and free energy dissipation, we numerically demonstrated logically irreversible protocols using adiabatic superconductor logic. Calculations of the work performed during the protocol showed that, while the minimum heat emission conforms to Landauer's principle, the free energy dissipation can be reduced arbitrarily by performing the protocol quasistatically. These results show that logical reversibility is not associated with thermodynamic reversibility, and that heat is not only emitted from logic devices but can also be absorbed by them. We also formulated the heat emission from adiabatic superconductor logic during a logically irreversible operation at a finite operation speed.
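
    For scale, the Landauer bound referred to above, i.e. the minimum heat emitted when one bit of logical entropy is erased, is k_B · T · ln 2, which a quick computation puts near 3 zJ at room temperature:

```python
# Numerical illustration of Landauer's principle: minimum heat emission
# per erased bit is k_B * T * ln 2.
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_limit(temperature_kelvin):
    """Minimum heat (in joules) emitted when erasing one bit at the
    given temperature."""
    return K_B * temperature_kelvin * math.log(2)

print(landauer_limit(300))   # ~2.87e-21 J per erased bit
```

    The abstract's point is that this bound constrains heat emission, not free energy dissipation, which can be made arbitrarily small by operating quasistatically.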

  1. Using fuzzy logic analysis for siting decisions of infiltration trenches for highway runoff control.

    PubMed

    Ki, Seo Jin; Ray, Chittaranjan

    2014-09-15

    Determining optimal locations for best management practices (BMPs), including their field considerations and limitations, plays an important role in effective stormwater management. However, these issues have often been overlooked in modeling studies that focused on downstream water quality benefits. This study illustrates a methodology for locating infiltration trenches at suitable sites using spatial overlay analyses, which combine multiple layers that address different aspects of field application into a composite map. Using seven thematic layers for each analysis, fuzzy logic was employed to develop a site suitability map for infiltration trenches, whereas the DRASTIC method was used to produce a groundwater vulnerability map for the island of Oahu, Hawaii, USA. In addition, the analytic hierarchy process (AHP), one of the most popular overlay analyses, was used for comparison with fuzzy logic. The results showed that the AHP and fuzzy logic methods produced significantly different index maps in terms of best locations and suitability scores. Specifically, the AHP method provided a maximum level of site suitability due to its inherent approach of aggregating all input layers in a linear equation. The most eligible areas for locating infiltration trenches were determined from the superposition of the site suitability and groundwater vulnerability maps using the fuzzy AND operator. The resulting map successfully balanced qualification criteria for a low risk of groundwater contamination and the best BMP site selection. The results of the sensitivity analysis showed that the suitability scores were strongly affected by the algorithms embedded in fuzzy logic; therefore, caution is recommended in their use in overlay analysis. Accordingly, this study demonstrates that fuzzy logic analysis can not only be used to improve spatial decision quality alongside other overlay approaches, but can also be combined with general water quality models for initial and refined searches for the best BMP locations at the sub-basin level. Copyright © 2014. Published by Elsevier B.V.
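
    The contrast drawn above between the fuzzy AND operator and AHP-style linear aggregation can be sketched on toy per-cell membership scores. The layer names, values, and weights below are invented for illustration; they are not the study's seven thematic layers.

```python
# Toy comparison of overlay operators on per-cell suitability scores in
# [0, 1]; layer names, values, and weights are illustrative only.
def fuzzy_and(layers):
    """Fuzzy AND: each cell gets the minimum score across all layers,
    so one poor criterion disqualifies a site."""
    return [min(vals) for vals in zip(*layers)]

def ahp_sum(layers, weights):
    """AHP-style aggregation: weighted linear sum per cell, so strong
    criteria can compensate for weak ones."""
    return [sum(w * v for w, v in zip(weights, vals)) for vals in zip(*layers)]

slope    = [0.9, 0.2, 0.7]   # membership score per cell
soil     = [0.8, 0.9, 0.1]
land_use = [0.6, 0.7, 0.9]

print(fuzzy_and([slope, soil, land_use]))               # [0.6, 0.2, 0.1]
print(ahp_sum([slope, soil, land_use], [0.5, 0.3, 0.2]))
```

    The third cell shows the difference clearly: its very low soil score drives the fuzzy AND result to 0.1, while the weighted sum still rates it moderately suitable, matching the abstract's observation that linear aggregation yields systematically higher suitability.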

  2. Engineering Problem-Solving Knowledge: The Impact of Context

    ERIC Educational Resources Information Center

    Wolff, Karin

    2017-01-01

    Employer complaints of engineering graduate inability to "apply knowledge" suggests a need to interrogate the complex theory-practice relationship in twenty-first century real world contexts. Focussing specifically on the application of mathematics, physics and logic-based disciplinary knowledge, the research examines engineering…

  3. QUARTERLY TECHNICAL PROGRESS REPORT, JULY, AUGUST, SEPTEMBER 1966.

    DTIC Science & Technology

    Contents: Circuit research program; Hardware systems research; Software systems research program; Numerical methods, computer arithmetic and...artificial languages; Library automation; Illiac II service, use, and program development; IBM service, use, and program development; Problem specifications; Switching theory and logical design; General laboratory information.

  4. Risk methodology overview. [for carbon fiber release

    NASA Technical Reports Server (NTRS)

    Credeur, K. R.

    1979-01-01

    Some considerations of risk estimation, how risk is measured, and how risk analysis decisions are made are discussed. Specific problems of carbon fiber release are discussed by reviewing the objective, describing the main elements, and giving an example of the risk logic and outputs.

  5. Engineering evaluations and studies. Report for Ku-band studies, exhibit A

    NASA Technical Reports Server (NTRS)

    Dodds, J. G.; Huth, G. K.; Maronde, R. G.; Roberts, D.

    1981-01-01

    System performance aspects of the Ku band radar communication hardware and investigations into the Ku band/payload interfaces are discussed. The communications track problem caused by the excessive signal dynamic range at the servo input was investigated. The management/handover logic is discussed and a simplified description of the transmitter enable logic function is presented. Output noise produced by a voltage-controlled oscillator chip used in the SPA return-link channel 3 mid-bit detector is discussed. The deployed assembly (DA) and EA-2 critical design review data are evaluated. Cross coupling effects on antenna servo stability were examined. A series of meetings on the acceptance test specification for the deployed assembly is summarized.

  6. The design of radiation-hardened ICs for space - A compendium of approaches

    NASA Technical Reports Server (NTRS)

    Kerns, Sherra E.; Shafer, B. D.; Rockett, L. R., Jr.; Pridmore, J. S.; Berndt, D. F.

    1988-01-01

    Several technologies, including bulk and epi CMOS, CMOS/SOI-SOS (silicon-on-insulator-silicon-on-sapphire), CML (current-mode logic), ECL (emitter-coupled logic), analog bipolar (JI, single-poly DI, and SOI) and GaAs E/D (enhancement/depletion) heterojunction MESFET, are discussed. The discussion includes the direct effects of space radiation on microelectronic materials and devices, how these effects are evidenced in circuit and device design parameter variations, the particular effects of most significance to each functional class of circuit, specific techniques for hardening high-speed circuits, design examples for integrated systems, including operational amplifiers and A/D (analog/digital) converters, and the computer simulation of radiation effects on microelectronic ICs.

  7. Application of fuzzy logic in multicomponent analysis by optodes.

    PubMed

    Wollenweber, M; Polster, J; Becker, T; Schmidt, H L

    1997-01-01

    Fuzzy logic can be a useful tool for the determination of substrate concentrations applying optode arrays in combination with flow injection analysis, UV-VIS spectroscopy and kinetics. The transient diffuse reflectance spectra in the visible wavelength region from four optodes were evaluated to carry out the simultaneous determination of artificial mixtures of ampicillin and penicillin. The discrimination of the samples was achieved by changing the composition of the receptor gel and working pH. Different algorithms of pre-processing were applied to the data to reduce the spectral information to a few analyte-specific variables. These variables were used to develop the fuzzy model. After calibration the model was validated by an independent test data set.

  8. Resistor-logic demultiplexers for nanoelectronics based on constant-weight codes.

    PubMed

    Kuekes, Philip J; Robinett, Warren; Roth, Ron M; Seroussi, Gadiel; Snider, Gregory S; Stanley Williams, R

    2006-02-28

    The voltage margin of a resistor-logic demultiplexer can be improved significantly by basing its connection pattern on a constant-weight code. Each distinct code determines a unique demultiplexer, and therefore a large family of circuits is defined. We consider using these demultiplexers for building nanoscale crossbar memories, and determine the voltage margin of the memory system based on a particular code. We determine a purely code-theoretic criterion for selecting codes that will yield memories with large voltage margins, which is to minimize the ratio of the maximum to the minimum Hamming distance between distinct codewords. For the specific example of a 64 × 64 crossbar, we discuss what codes provide optimal performance for a memory.
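
    The code-selection criterion described in the abstract (minimize the ratio of the maximum to the minimum Hamming distance between distinct codewords) is easy to compute for small constant-weight codes. The sketch below is illustrative only; the function names and the use of the full constant-weight code are assumptions, not details taken from the paper.

```python
# Illustrative sketch (not from the paper): evaluate the code-selection
# criterion for a constant-weight code -- the ratio of the maximum to the
# minimum Hamming distance between distinct codewords (smaller is better).
from itertools import combinations

def constant_weight_code(n, w):
    """All binary words of length n with exactly w ones (the full constant-weight code)."""
    words = []
    for ones in combinations(range(n), w):
        word = [0] * n
        for i in ones:
            word[i] = 1
        words.append(tuple(word))
    return words

def hamming(a, b):
    """Hamming distance: number of positions in which two words differ."""
    return sum(x != y for x, y in zip(a, b))

def distance_ratio(code):
    """max/min Hamming distance over all distinct codeword pairs."""
    dists = [hamming(a, b) for a, b in combinations(code, 2)]
    return max(dists) / min(dists)
```

    For example, the full weight-2 code of length 4 has six codewords; any two either share one 1 (distance 2) or are disjoint (distance 4), giving a ratio of 2.0. Selecting a subcode that avoids the distance-4 pairs, or the distance-2 pairs, would lower the ratio and, by the paper's criterion, improve the voltage margin.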

  9. Self-Paced and Temporally Constrained Throwing Performance by Team-Handball Experts and Novices without Foreknowledge of Target Position

    PubMed Central

    Rousanoglou, Elissavet N.; Noutsos, Konstantinos S.; Bayios, Ioannis A.; Boudolos, Konstantinos D.

    2015-01-01

    The fixed duration of a team-handball game and its continuously changing situations incorporate an inherent temporal pressure. Also, the target’s position is not foreknown but online determined by the player’s interceptive processing of visual information. These ecological limitations do not favour throwing performance, particularly in novice players, and are not reflected in previous experimental settings of self-paced throws with foreknowledge of target position. The study investigated the self-paced and temporally constrained throwing performance without foreknowledge of target position, in team-handball experts and novices in three shot types (Standing Shot, 3Step Shot, Jump Shot). The target position was randomly illuminated on a tabloid surface before (self-paced condition) and after (temporally constrained condition) shot initiation. Response time, throwing velocity and throwing accuracy were measured. A mixed 2 (experience) X 2 (temporal constraint condition) ANOVA was applied. The novices performed with significantly lower throwing velocity and worse throwing accuracy in all shot types (p = 0.000) and, longer response time only in the 3Step Shot (p = 0.013). The temporal constraint (significantly shorter response times in all shot types at p = 0.000) had a shot specific effect with lower throwing velocity only in the 3Step Shot (p = 0.001) and an unexpected greater throwing accuracy only in the Standing Shot (p = 0.002). The significant interaction between experience and temporal constraint condition in throwing accuracy (p = 0.003) revealed a significant temporal constraint effect in the novices (p = 0.002) but not in the experts (p = 0.798). The main findings of the study are the shot specificity of the temporal constraint effect, as well as that, depending on the shot, the novices’ throwing accuracy may benefit rather than worsen under temporal pressure.
    Key points: The temporal constraint induced a shot specific significant difference in throwing velocity in both the experts and the novices. The temporal constraint induced a shot specific significant difference in throwing accuracy only in the novices. Depending on the shot demands, the throwing accuracy of the novices may benefit under temporally constrained situations. PMID:25729288

  10. Method and Apparatus for Simultaneous Processing of Multiple Functions

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian (Inventor); Andrei, Radu (Inventor)

    2017-01-01

    Electronic logic gates that operate using N logic state levels, where N is greater than 2, and methods of operating such gates. The electronic logic gates operate according to truth tables. At least two input signals each having a logic state that can range over more than two logic states are provided to the logic gates. The logic gates each provide an output signal that can have one of N logic states. Examples of gates described include NAND/NAND gates having two inputs A and B and NAND/NAND gates having three inputs A, B, and C, where A, B and C can take any of four logic states. Systems using such gates are described, and their operation illustrated. Optical logic gates that operate using N logic state levels are also described.
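
    A multi-valued gate of the kind described above can be illustrated with one common algebraic generalization of NAND to N levels, output = (N-1) - min(a, b). This formula is an assumption for illustration only; the patent defines its gates by explicit truth tables, which need not match it.

```python
# Sketch (assumption): a common generalization of NAND to N = 4 logic levels,
# NAND(a, b) = (N - 1) - min(a, b). The patent's gates are defined by explicit
# truth tables and may differ from this formula.
N = 4  # logic states: 0, 1, 2, 3

def nand4(a, b):
    """Quaternary NAND: complement (to N-1) of the minimum (a generalized AND)."""
    if not all(0 <= x < N for x in (a, b)):
        raise ValueError("inputs must be logic states 0..3")
    return (N - 1) - min(a, b)

# Full truth table, as a dict {(a, b): output} -- 16 entries for two inputs
truth_table = {(a, b): nand4(a, b) for a in range(N) for b in range(N)}
```

    Note that restricting the inputs to the extreme states 0 and 3 recovers the ordinary binary NAND, which is one reason this generalization is a convenient stand-in.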

  11. Method and Apparatus for Simultaneous Processing of Multiple Functions

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian (Inventor); Andrei, Radu (Inventor); Zhu, David (Inventor); Mojarradi, Mohammad Mehdi (Inventor); Vo, Tuan A. (Inventor)

    2015-01-01

    Electronic logic gates that operate using N logic state levels, where N is greater than 2, and methods of operating such gates. The electronic logic gates operate according to truth tables. At least two input signals each having a logic state that can range over more than two logic states are provided to the logic gates. The logic gates each provide an output signal that can have one of N logic states. Examples of gates described include NAND/NAND gates having two inputs A and B and NAND/NAND gates having three inputs A, B, and C, where A, B and C can take any of four logic states. Systems using such gates are described, and their operation illustrated. Optical logic gates that operate using N logic state levels are also described.

  12. Impact of hippocampal subfield histopathology in episodic memory impairment in mesial temporal lobe epilepsy and hippocampal sclerosis.

    PubMed

    Comper, Sandra Mara; Jardim, Anaclara Prada; Corso, Jeana Torres; Gaça, Larissa Botelho; Noffs, Maria Helena Silva; Lancellotti, Carmen Lúcia Penteado; Cavalheiro, Esper Abrão; Centeno, Ricardo Silva; Yacubian, Elza Márcia Targas

    2017-10-01

    The objective of the study was to analyze preoperative visual and verbal episodic memories in a homogeneous series of patients with mesial temporal lobe epilepsy (MTLE) and unilateral hippocampal sclerosis (HS) submitted to corticoamygdalohippocampectomy and its association with neuronal cell density of each hippocampal subfield. The hippocampi of 72 right-handed patients were collected and prepared for histopathological examination. Hippocampal sclerosis patterns were determined, and neuronal cell density was calculated. Preoperatively, two verbal and two visual memory tests (immediate and delayed recalls) were applied, and patients were divided into two groups, left and right MTLE (36/36). There were no statistical differences between groups regarding demographic and clinical data. Cornu Ammonis 4 (CA4) neuronal density was significantly lower in the right hippocampus compared with the left (p=0.048). The groups with HS presented different memory performance - the right HS were worse in visual memory test [Complex Rey Figure, immediate (p=0.001) and delayed (p=0.009)], but better in one verbal task [RAVLT delayed (p=0.005)]. Multiple regression analysis suggested that the verbal memory performance of the group with left HS was explained by CA1 neuronal density since both tasks were significantly influenced by CA1 [Logical Memory immediate recall (p=0.050) and Logical Memory and RAVLT delayed recalls (p=0.004 and p=0.001, respectively)]. For patients with right HS, both CA1 subfield integrity (p=0.006) and epilepsy duration (p=0.012) explained Complex Rey Figure immediate recall performance. Ultimately, epilepsy duration also explained the performance in the Complex Rey Figure delayed recall (p<0.001). Cornu Ammonis 1 (CA1) hippocampal subfield was related to immediate and delayed recalls of verbal memory tests in left HS, while CA1 and epilepsy duration were associated with visual memory performance in patients with right HS.

  13. Using Abductive Research Logic: "The Logic of Discovery", to Construct a Rigorous Explanation of Amorphous Evaluation Findings

    ERIC Educational Resources Information Center

    Levin-Rozalis, Miri

    2010-01-01

    Background: Two kinds of research logic prevail in scientific research: deductive research logic and inductive research logic. However, both fail in the field of evaluation, especially evaluation conducted in unfamiliar environments. Purpose: In this article I wish to suggest the application of a research logic--"abduction"--"the logic of…

  14. Algorithms and architecture for multiprocessor based circuit simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deutsch, J.T.

    Accurate electrical simulation is critical to the design of high performance integrated circuits. Logic simulators can verify function and give first-order timing information. Switch level simulators are more effective at dealing with charge sharing than standard logic simulators, but cannot provide accurate timing information or discover DC problems. Delay estimation techniques and cell level simulation can be used in constrained design methods, but must be tuned for each application, and circuit simulation must still be used to generate the cell models. None of these methods has the guaranteed accuracy that many circuit designers desire, and none can provide detailed waveform information. Detailed electrical-level simulation can predict circuit performance if devices and parasitics are modeled accurately. However, the computational requirements of conventional circuit simulators make it impractical to simulate current large circuits. In this dissertation, the implementation of Iterated Timing Analysis (ITA), a relaxation-based technique for accurate circuit simulation, on a special-purpose multiprocessor is presented. The ITA method is an SOR-Newton, relaxation-based method which uses event-driven analysis and selective trace to exploit the temporal sparsity of the electrical network. Because event-driven selective trace techniques are employed, this algorithm lends itself to implementation on a data-driven computer.
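
    The event-driven, selective-trace idea behind ITA can be illustrated with a toy relaxation solver. This is a sketch under simplifying assumptions (a linear resistive ladder with equal resistors, and a plain Gauss-Seidel averaging update in place of the per-node SOR-Newton device solve), not the dissertation's algorithm.

```python
# Toy event-driven Gauss-Seidel relaxation illustrating selective trace:
# only nodes whose neighbours changed are re-solved, exploiting the
# temporal sparsity (latency) of the network. The averaging update below
# assumes equal resistors; real ITA performs a per-node SOR-Newton solve
# of the device equations instead.
from collections import deque

def event_driven_relax(neighbours, v, fixed, tol=1e-9):
    """neighbours: {node: [adjacent nodes]}; v: initial voltages; fixed: source nodes."""
    queue = deque(n for n in v if n not in fixed)
    scheduled = set(queue)
    while queue:
        n = queue.popleft()
        scheduled.discard(n)
        new = sum(v[m] for m in neighbours[n]) / len(neighbours[n])
        if abs(new - v[n]) > tol:          # event: this node's voltage changed
            v[n] = new
            for m in neighbours[n]:        # selective trace: wake fanout only
                if m not in fixed and m not in scheduled:
                    queue.append(m)
                    scheduled.add(m)
    return v
```

    On a four-node ladder a-b-c-d with a held at 1 V and d at 0 V, the loop settles to b = 2/3 V and c = 1/3 V; once a node's update falls below the tolerance, none of its neighbours are rescheduled, which is the latency exploitation the abstract refers to.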

  15. Cracking the barcode of fullerene-like cortical microcolumns.

    PubMed

    Tozzi, Arturo; Peters, James F; Ori, Ottorino

    2017-03-22

    Artificial neural systems and nervous graph theoretical analysis rely upon the stance that the neural code is embodied in logic circuits, e.g., spatio-temporal sequences of ON/OFF spiking neurons. Nevertheless, this assumption does not fully explain complex brain functions. Here we show how nervous activity, other than logic circuits, could instead depend on topological transformations and symmetry constraints occurring at the micro-level of the cortical microcolumn, i.e., the embryological, anatomical and functional basic unit of the brain. Tubular microcolumns can be flattened in fullerene-like two-dimensional lattices, equipped with about 80 nodes standing for pyramidal neurons where neural computations take place. We show how the countless possible combinations of activated neurons embedded in the lattice resemble a barcode. Despite the fact that further experimental verification is required in order to validate our claim, different assemblies of firing neurons might have the appearance of diverse codes, each one responsible for a single mental activity. A two-dimensional fullerene-like lattice, grounded on simple topological changes standing for pyramidal neurons' activation, not just displays analogies with the real microcolumn's microcircuitry and the neural connectome, but also the potential for the manufacture of plastic, robust and fast artificial networks in robotic forms of full-fledged neural systems.

  16. Application of linear logic to simulation

    NASA Astrophysics Data System (ADS)

    Clarke, Thomas L.

    1998-08-01

    Linear logic, since its introduction by Girard in 1987, has proven expressive and powerful. Linear logic has provided natural encodings of Turing machines, Petri nets and other computational models. Linear logic is also capable of naturally modeling resource dependent aspects of reasoning. The distinguishing characteristic of linear logic is that it accounts for resources; two instances of the same variable are considered differently from a single instance. Linear logic thus must obey a form of the linear superposition principle. A proposition can be reasoned with only once, unless a special operator is applied. Informally, linear logic distinguishes two kinds of conjunction, two kinds of disjunction, and also introduces a modal storage operator that explicitly indicates propositions that can be reused. This paper discusses the application of linear logic to simulation. A wide variety of logics have been developed; in addition to classical logic, there are fuzzy logics, affine logics, quantum logics, etc. All of these have found application in simulations of one sort or another. The special characteristics of linear logic and its benefits for simulation will be discussed. Of particular interest is a connection that can be made between linear logic and simulated dynamics by using the concept of Lie algebras and Lie groups. Lie groups provide the connection between the exponential modal storage operators of linear logic and the eigenfunctions of dynamic differential operators. Particularly suggestive are possible relations between complexity results for linear logic and non-computability results for dynamical systems.

  17. Neuropsychological outcomes after Gamma Knife radiosurgery for mesial temporal lobe epilepsy: a prospective multicenter study.

    PubMed

    Quigg, Mark; Broshek, Donna K; Barbaro, Nicholas M; Ward, Mariann M; Laxer, Kenneth D; Yan, Guofen; Lamborn, Kathleen

    2011-05-01

    To assess outcomes of language, verbal memory, cognitive efficiency and mental flexibility, mood, and quality of life (QOL) in a prospective, multicenter pilot study of Gamma Knife radiosurgery (RS) for mesial temporal lobe epilepsy (MTLE). RS, randomized to 20 Gy or 24 Gy comprising 5.5-7.5 ml at the 50% isodose volume, was performed on mesial temporal structures of patients with unilateral MTLE. Neuropsychological evaluations were performed at preoperative baseline, and mean change scores were described at 12 and 24 months postoperatively. QOL data were also available at 36 months. Thirty patients were treated and 26 were available for the final 24-month neuropsychological evaluation. Language (Boston Naming Test), verbal memory (California Verbal Learning Test and Logical Memory subtest of the Wechsler Memory Scale-Revised), cognitive efficiency and mental flexibility (Trail Making Test), and mood (Beck Depression Inventory) did not differ from baseline. QOL scores improved at 24 and 36 months, with those patients attaining seizure remission by month 24 accounting for the majority of the improvement. The serial changes in cognitive outcomes, mood, and QOL are unremarkable following RS for MTLE. RS may provide an alternative to open surgery, especially in those patients at risk of cognitive impairment or who desire a noninvasive alternative to open surgery.

  18. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope is described which is an Ada-verification environment. The Penelope user inputs mathematical definitions, Larch-style specifications and Ada code and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.

  19. Correlation of vocals and lyrics with left temporal musicogenic epilepsy.

    PubMed

    Tseng, Wei-En J; Lim, Siew-Na; Chen, Lu-An; Jou, Shuo-Bin; Hsieh, Hsiang-Yao; Cheng, Mei-Yun; Chang, Chun-Wei; Li, Han-Tao; Chiang, Hsing-I; Wu, Tony

    2018-03-15

    Whether the cognitive processing of music and speech relies on shared or distinct neuronal mechanisms remains unclear. Music and language processing in the brain are right and left temporal functions, respectively. We studied patients with musicogenic epilepsy (ME) that was specifically triggered by popular songs to analyze brain hyperexcitability triggered by specific stimuli. The study included two men and one woman (all right-handed, aged 35-55 years). The patients had sound-triggered left temporal ME in response to popular songs with vocals, but not to instrumental, classical, or nonvocal piano solo versions of the same song. Sentimental lyrics, high-pitched singing, specificity/familiarity, and singing in the native language were the most significant triggering factors. We found that recognition of the human voice and analysis of lyrics are important causal factors in left temporal ME and provide observational evidence that sounds with speech structure are predominantly processed in the left temporal lobe. A literature review indicated that language-associated stimuli triggered ME in the left temporal epileptogenic zone at a nearly twofold higher rate compared with the right temporal region. Further research on ME may enhance understanding of the cognitive neuroscience of music.

  20. Logic Models for Program Design, Implementation, and Evaluation: Workshop Toolkit. REL 2015-057

    ERIC Educational Resources Information Center

    Shakman, Karen; Rodriguez, Sheila M.

    2015-01-01

    The Logic Model Workshop Toolkit is designed to help practitioners learn the purpose of logic models, the different elements of a logic model, and the appropriate steps for developing and using a logic model for program evaluation. Topics covered in the sessions include an overview of logic models, the elements of a logic model, an introduction to…
