Practical Formal Verification of Diagnosability of Large Models via Symbolic Model Checking
NASA Technical Reports Server (NTRS)
Cavada, Roberto; Pecheur, Charles
2003-01-01
This document reports on the activities carried out during a four-week visit by Roberto Cavada to the NASA Ames Research Center. The main goal was to test the practical applicability of the proposed framework, in which a diagnosability problem is reduced to a Symbolic Model Checking problem. Section 2 contains a brief explanation of the major techniques currently used in Symbolic Model Checking, and of how these techniques can be tuned to obtain good performance from Model Checking tools. Diagnosability is performed on large, structured models of real plants. Section 3 describes how these plants are modeled, and how models can be simplified to improve the performance of Symbolic Model Checkers. Section 4 reports scalability results: three test cases are briefly presented, and several parameters and techniques were applied to them to produce comparison tables; a comparison between several Model Checkers is also reported. Section 5 summarizes the application of diagnosability verification to a real application. Several properties have been tested, and the results are highlighted. Finally, Section 6 draws some conclusions and outlines future lines of research.
Generalized Symbolic Execution for Model Checking and Testing
NASA Technical Reports Server (NTRS)
Khurshid, Sarfraz; Pasareanu, Corina; Visser, Willem; Kofmeyer, David (Technical Monitor)
2003-01-01
Modern software systems, which are often concurrent and manipulate complex data structures, must be extremely reliable. We present a novel framework, based on symbolic execution, for the automated checking of such systems. We provide a two-fold generalization of traditional symbolic-execution-based approaches: first, we define a program instrumentation that enables standard model checkers to perform symbolic execution; second, we give a novel symbolic execution algorithm that handles dynamically allocated structures (e.g., lists and trees), method preconditions (e.g., acyclicity of lists), data (e.g., integers and strings), and concurrency. The program instrumentation enables a model checker to automatically explore program heap configurations (using a systematic treatment of aliasing) and to manipulate logical formulae on program data values (using a decision procedure). We illustrate two applications of our framework: checking the correctness of multi-threaded programs that take inputs from unbounded domains with complex structure, and generating non-isomorphic test inputs that satisfy a testing criterion. Our implementation for Java uses the Java PathFinder model checker.
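The core idea behind the symbolic execution described above can be sketched in a few lines: instead of running the program on one concrete input, every branch is explored both ways, and a path condition is accumulated for each path. The branch conditions below are hypothetical placeholders, not the instrumentation actually used with Java PathFinder.

```python
# Minimal sketch of symbolic path exploration: explore both outcomes of every
# branch, recording the path condition of each execution path.

def explore(branches, path_condition=()):
    """Enumerate all paths through a list of symbolic branch conditions."""
    if not branches:
        return [path_condition]
    cond, rest = branches[0], branches[1:]
    taken = explore(rest, path_condition + (cond,))
    not_taken = explore(rest, path_condition + ("!(%s)" % cond,))
    return taken + not_taken

# A program with two branches on a symbolic input x yields 2^2 = 4 paths.
paths = explore(["x > 0", "x < 10"])
for pc in paths:
    print(" && ".join(pc))
```

Each resulting path condition would then be handed to a decision procedure to check feasibility or to produce a concrete witness input.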
Symbolic LTL Compilation for Model Checking: Extended Abstract
NASA Technical Reports Server (NTRS)
Rozier, Kristin Y.; Vardi, Moshe Y.
2007-01-01
In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering, and examine their effects on performance metrics including processing time and scalability. Safety-critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety-critical systems: it involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
Symbolic Analysis of Concurrent Programs with Polymorphism
NASA Technical Reports Server (NTRS)
Rungta, Neha Shyam
2010-01-01
The current trend of multi-core and multi-processor computing is causing a paradigm shift from inherently sequential to highly concurrent and parallel applications. Certain thread interleavings, data input values, or combinations of both often cause errors in the system. Systematic verification techniques such as explicit state model checking and symbolic execution are extensively used to detect errors in such systems [7, 9]. Explicit state model checking enumerates possible thread schedules and input data values of a program in order to check for errors [3, 9]. To partially mitigate the state space explosion from data input values, symbolic execution techniques substitute data input values with symbolic values [5, 7, 6]. Explicit state model checking and symbolic execution techniques used in conjunction with exhaustive search techniques such as depth-first search are unable to detect errors in medium to large-sized concurrent programs because the number of behaviors caused by data and thread non-determinism is extremely large. We present an overview of abstraction-guided symbolic execution for concurrent programs that detects errors manifested by a combination of thread schedules and data values [8]. The technique generates a set of key program locations relevant to testing the reachability of the target locations. The symbolic execution is then guided along these locations in an attempt to generate a feasible execution path to the error state. This allows the execution to focus on parts of the behavior space more likely to contain an error.
The Priority Inversion Problem and Real-Time Symbolic Model Checking
1993-04-23
real time systems unpredictable in subtle ways. This makes it more difficult to implement and debug such systems. Our work discusses this problem and presents one possible solution. The solution is formalized and verified using temporal logic model checking techniques. In order to perform the verification, the BDD-based symbolic model checking algorithm given in previous works was extended to handle real-time properties using the bounded until operator. We believe that this algorithm, which is based on discrete time, is able to handle many real-time properties.
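The bounded until operator mentioned above can be computed as a k-step backward fixpoint over the transition relation. The sketch below does this on an explicit state graph rather than with BDDs, and the five-state system is a made-up example.

```python
# Explicit-state sketch of bounded until: states satisfying E[phi U<=k psi],
# i.e., psi is reachable within k steps along states satisfying phi.

def bounded_until(states, succ, phi, psi, k):
    """k-step backward fixpoint for the bounded until operator."""
    sat = {s for s in states if psi(s)}
    for _ in range(k):
        sat |= {s for s in states
                if phi(s) and any(t in sat for t in succ[s])}
    return sat

states = {0, 1, 2, 3, 4}
succ = {0: [1], 1: [2], 2: [3], 3: [4], 4: []}
phi = lambda s: s < 4          # "still waiting"
psi = lambda s: s == 4         # "deadline met"
print(sorted(bounded_until(states, succ, phi, psi, 2)))
```

A BDD-based implementation performs the same iteration, but with the state sets and the transition relation represented symbolically.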
Symbolic discrete event system specification
NASA Technical Reports Server (NTRS)
Zeigler, Bernard P.; Chi, Sungdo
1992-01-01
Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.
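The consistency check over linear polynomial constraints described above can be illustrated in a much-simplified single-variable form: constraints of the shape a*t + b >= 0 reduce to intersecting half-lines of the time axis. This stands in for the linear-programming-based feasibility check in the abstract, which handles the general multi-variable case.

```python
# Simplified feasibility check for constraints a*t + b >= 0 on one symbolic
# time variable t: intersect the half-lines each constraint permits.

def feasible(constraints):
    """constraints: list of (a, b) meaning a*t + b >= 0. Return a t-interval or None."""
    lo, hi = float("-inf"), float("inf")
    for a, b in constraints:
        if a > 0:
            lo = max(lo, -b / a)       # t >= -b/a
        elif a < 0:
            hi = min(hi, -b / a)       # t <= -b/a
        elif b < 0:
            return None                # 0*t + b >= 0 with b < 0: contradiction
    return (lo, hi) if lo <= hi else None

print(feasible([(1, -2), (-1, 5)]))   # consistent: 2 <= t <= 5
print(feasible([(1, 0), (-1, -1)]))   # t >= 0 and t <= -1: inconsistent
```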
2014-06-19
urgent and compelling. Recent efforts in this area automate program analysis techniques using model checking and symbolic execution [2, 5–7]. These...bounded model checking tool for x86 binary programs developed at the Air Force Institute of Technology (AFIT). Jiseki creates a bit-vector logic model based...assume there are n different paths through the function foo . The program could potentially call the function foo a bounded number of times, resulting in n
Towards Symbolic Model Checking for Multi-Agent Systems via OBDDs
NASA Technical Reports Server (NTRS)
Raimondi, Franco; Lomuscio, Alessio
2004-01-01
We present an algorithm for model checking temporal-epistemic properties of multi-agent systems, expressed in the formalism of interpreted systems. We first introduce a technique for the translation of interpreted systems into boolean formulae, and then present a model-checking algorithm based on this translation. The algorithm is based on OBDDs, which offer a compact and efficient representation for boolean formulae.
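The epistemic part of the properties above rests on a simple semantic idea: in an interpreted system, an agent knows phi at a global state g exactly when phi holds at every global state the agent cannot distinguish from g. The explicit-state sketch below computes this K operator directly; the OBDD-based algorithm computes the same set symbolically. The states and observation map are made up.

```python
# Explicit-state sketch of the epistemic operator K in interpreted systems.

def K(states, obs, phi):
    """States where the agent knows phi: phi holds at all observationally
    equivalent states."""
    return {g for g in states
            if all(phi(h) for h in states if obs(h) == obs(g))}

# A global state pairs the agent's local state with a hidden environment state.
states = {("s", 0), ("s", 1), ("t", 1)}
obs = lambda g: g[0]                  # the agent observes only its local state
phi = lambda g: g[1] == 1             # "the environment is in state 1"
print(sorted(K(states, obs, phi)))    # knowledge holds only where obs pins phi down
```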
Reduced circuit implementation of encoder and syndrome generator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trager, Barry M; Winograd, Shmuel
An error correction method and system includes an Encoder and Syndrome-generator that operate in parallel to reduce the amount of circuitry used to compute check symbols and syndromes for error-correcting codes. The system and method compute the contributions to the syndromes and check symbols 1 bit at a time instead of 1 symbol at a time. As a result, the even syndromes can be computed as powers of the odd syndromes. Further, the system assigns symbol addresses so that, for an example GF(2^8) code with 72 symbols, there are three (3) blocks of addresses which differ by a cube root of unity, allowing the data symbols to be combined to reduce the size and complexity of the odd-syndrome circuits. Further, the implementation circuit for generating check symbols is derived from the syndrome circuit using the inverse of the part of the syndrome matrix for the check locations.
Verifying Multi-Agent Systems via Unbounded Model Checking
NASA Technical Reports Server (NTRS)
Kacprzak, M.; Lomuscio, A.; Lasica, T.; Penczek, W.; Szreter, M.
2004-01-01
We present an approach to the problem of verification of epistemic properties in multi-agent systems by means of symbolic model checking. In particular, it is shown how to extend the technique of unbounded model checking from a purely temporal setting to a temporal-epistemic one. In order to achieve this, we base our discussion on interpreted systems semantics, a popular semantics used in the multi-agent systems literature. We give details of the technique and show how it can be applied to the well-known train-gate-controller problem. Keywords: model checking, unbounded model checking, multi-agent systems
Symbolic PathFinder: Symbolic Execution of Java Bytecode
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Rungta, Neha
2010-01-01
Symbolic PathFinder (SPF) combines symbolic execution with model checking and constraint solving for automated test case generation and error detection in Java programs with unspecified inputs. In this tool, programs are executed on symbolic inputs representing multiple concrete inputs. Values of variables are represented as constraints generated from the analysis of Java bytecode. The constraints are solved using off-the-shelf solvers to generate test inputs guaranteed to achieve complex coverage criteria. SPF has been used successfully at NASA, in academia, and in industry.
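The last step above, turning a path condition into a concrete test input, is what SPF delegates to off-the-shelf constraint solvers. The sketch below substitutes a brute-force search over a small integer domain for the solver, just to show how a satisfying assignment becomes a test input; the path condition is a made-up example.

```python
# Brute-force stand-in for a constraint solver: find one concrete input
# satisfying every constraint collected along a symbolic-execution path.

def solve(path_condition, domain=range(-100, 101)):
    """Return one satisfying concrete value, or None if the path is infeasible
    over the searched domain."""
    for x in domain:
        if all(constraint(x) for constraint in path_condition):
            return x
    return None

# Hypothetical path condition for one branch of a method under test
pc = [lambda x: x > 10, lambda x: x % 2 == 0]
test_input = solve(pc)
print(test_input)  # the smallest even integer greater than 10 in the domain
```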
Proceedings of the First NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
Model-Checking with Edge-Valued Decision Diagrams
NASA Technical Reports Server (NTRS)
Roux, Pierre; Siminiceanu, Radu I.
2010-01-01
We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions and its implementation in a model checking library along with state-of-the-art algorithms for building the transition relation and the state space of discrete state systems. We provide efficient algorithms for manipulating EVMDDs and give upper bounds on the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention to represent in one formalism the best techniques available at the moment across a spectrum of existing tools: EVMDDs for encoding arithmetic expressions, identity-reduced MDDs for representing the transition relation, and the saturation algorithm for reachability analysis. We compare our new symbolic model checking EVMDD library with the widely used CUDD package and show that, in many cases, our tool is several orders of magnitude faster than CUDD.
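The edge-valued encoding above attaches additive weights to edges, so a function's value is the sum of the edge values along the path selected by the variable assignment, and structurally similar cofactors can share subgraphs. The hand-built sketch below encodes f(x0, x1) = 2*x0 + x1 over binary variables; real EVMDD libraries add the reduction and canonicity rules that this toy omits.

```python
# Toy edge-valued decision diagram: a node maps each value of its variable
# to (edge_value, child); None marks the terminal node.

leaf = None
node_x1 = {0: (0, leaf), 1: (1, leaf)}          # contributes x1
node_x0 = {0: (0, node_x1), 1: (2, node_x1)}    # contributes 2*x0, shares node_x1

def evaluate(node, assignment, level=0):
    """Sum edge values along the path chosen by the assignment."""
    if node is None:
        return 0
    edge_value, child = node[assignment[level]]
    return edge_value + evaluate(child, assignment, level + 1)

for x0 in (0, 1):
    for x1 in (0, 1):
        print((x0, x1), evaluate(node_x0, [x0, x1]))
```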
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Munoz, Cesar A.; Siminiceanu, Radu I.
2007-01-01
This paper describes a translator from a new planning language named the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL) model checker. This translator has been developed in support of the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project sponsored by the Exploration Technology Development Program, which is seeking to mature autonomy technology for the vehicles and operations centers of Project Constellation.
Exact and Approximate Probabilistic Symbolic Execution
NASA Technical Reports Server (NTRS)
Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem
2014-01-01
Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning techniques through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.
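The scheduler-synthesis objective above has a simple recursive shape: at a nondeterministic (scheduler) choice take the maximum over the children, and at a probabilistic branch take the expectation. The tiny execution tree below is hypothetical; the paper computes these quantities over path probabilities obtained from symbolic execution.

```python
# Maximum probability of reaching the target event, resolving nondeterminism
# in the scheduler's favor.

def max_reach(node):
    kind = node[0]
    if kind == "leaf":
        return 1.0 if node[1] else 0.0        # did this path reach the target?
    if kind == "choice":                      # nondeterministic: pick the best child
        return max(max_reach(child) for child in node[1])
    if kind == "prob":                        # probabilistic: weighted average
        return sum(p * max_reach(child) for p, child in node[1])
    raise ValueError(kind)

tree = ("choice", [
    ("prob", [(0.5, ("leaf", True)), (0.5, ("leaf", False))]),   # schedule A
    ("prob", [(0.9, ("leaf", True)), (0.1, ("leaf", False))]),   # schedule B
])
print(max_reach(tree))  # the synthesized scheduler picks schedule B
```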
Verification of Java Programs using Symbolic Execution and Invariant Generation
NASA Technical Reports Server (NTRS)
Pasareanu, Corina; Visser, Willem
2004-01-01
Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g., boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.
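The iterative invariant discovery above can be illustrated with a related, much-simplified scheme (in the style of Houdini-like candidate filtering, which is not the paper's exact algorithm): start from a pool of candidate predicates and repeatedly drop any candidate that fails at an observed loop-head state. The loop and candidates below are made up.

```python
# Candidate-filtering sketch of invariant inference: keep only predicates that
# hold at every observed loop-head state.

def filter_invariants(candidates, loop_states):
    kept = dict(candidates)
    for state in loop_states:
        kept = {name: pred for name, pred in kept.items() if pred(state)}
    return set(kept)

# Loop-head states (i, total) from: total = 0; for i in range(5): total += i
loop_states = []
total = 0
for i in range(5):
    loop_states.append((i, total))
    total += i

candidates = {
    "i >= 0":     lambda s: s[0] >= 0,
    "total >= 0": lambda s: s[1] >= 0,
    "total == i": lambda s: s[1] == s[0],   # falsified at the second iteration
}
print(sorted(filter_invariants(candidates, loop_states)))
```

The surviving predicates are candidate loop invariants; a verifier would still have to prove them inductive rather than merely consistent with the observed states.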
Symbolic Heuristic Search for Factored Markov Decision Processes
NASA Technical Reports Server (NTRS)
Morris, Robert (Technical Monitor); Feng, Zheng-Zhu; Hansen, Eric A.
2003-01-01
We describe a planning algorithm that integrates two approaches to solving Markov decision processes with large state spaces. State abstraction is used to avoid evaluating states individually. Forward search from a start state, guided by an admissible heuristic, is used to avoid evaluating all states. We combine these two approaches in a novel way that exploits symbolic model-checking techniques and demonstrates their usefulness for decision-theoretic planning.
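The second ingredient above, forward search from the start state under an admissible heuristic, is what lets the planner avoid evaluating all states. The A*-style sketch below illustrates only that idea on a small grid; it is not the paper's symbolic, decision-theoretic algorithm.

```python
import heapq

def heuristic_search(start, goal, width, height):
    """Best-first search with an admissible Manhattan-distance heuristic.
    Returns (optimal cost, number of states touched)."""
    h = lambda s: abs(s[0] - goal[0]) + abs(s[1] - goal[1])
    frontier = [(h(start), 0, start)]
    best = {start: 0}                          # best known cost per state
    while frontier:
        _, g, (x, y) = heapq.heappop(frontier)
        if (x, y) == goal:
            return g, len(best)
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < width and 0 <= ny < height and g + 1 < best.get((nx, ny), 1e9):
                best[(nx, ny)] = g + 1
                heapq.heappush(frontier, (g + 1 + h((nx, ny)), g + 1, (nx, ny)))
    return None

cost, touched = heuristic_search((0, 0), (4, 4), 10, 10)
print(cost, touched)  # optimal cost 8; far fewer than all 100 grid states touched
```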
NASA Technical Reports Server (NTRS)
Yang, Guowei; Pasareanu, Corina S.; Khurshid, Sarfraz
2012-01-01
This paper introduces memoized symbolic execution (Memoise), a novel approach for more efficient application of forward symbolic execution, which is a well-studied technique for systematic exploration of program behaviors based on bounded execution paths. Our key insight is that application of symbolic execution often requires several successive runs of the technique on largely similar underlying problems, e.g., running it once to check a program to find a bug, fixing the bug, and running it again to check the modified program. Memoise introduces a trie-based data structure that stores the key elements of a run of symbolic execution. Maintenance of the trie during successive runs allows re-use of previously computed results of symbolic execution without the need for re-computing them as is traditionally done. Experiments using our prototype embodiment of Memoise show the benefits it holds in various standard scenarios of using symbolic execution, e.g., with iterative deepening of exploration depth, to perform regression analysis, or to enhance coverage.
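The trie described above is keyed by the sequence of branch decisions taken during a run, so a later run over an unchanged prefix can reuse the stored result instead of recomputing it. The sketch below caches only one result per path; the actual Memoise trie records considerably more per node.

```python
# Minimal trie-based memoization of a per-path analysis result.

class Trie:
    def __init__(self):
        self.children = {}
        self.result = None

    def lookup_or_compute(self, path, compute):
        """Follow the branch outcomes in `path`; reuse a cached result if present."""
        node = self
        for decision in path:
            node = node.children.setdefault(decision, Trie())
        if node.result is None:
            node.result = compute(path)     # executed only on the first run
        return node.result

calls = []
def expensive_analysis(path):
    calls.append(path)
    return "analysed:" + "/".join(path)

trie = Trie()
trie.lookup_or_compute(("b1-true", "b2-false"), expensive_analysis)
trie.lookup_or_compute(("b1-true", "b2-false"), expensive_analysis)  # cache hit
print(len(calls))  # the analysis ran once despite two queries
```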
Symbolically Modeling Concurrent MCAPI Executions
NASA Technical Reports Server (NTRS)
Fischer, Topher; Mercer, Eric; Rungta, Neha
2011-01-01
Improper use of Inter-Process Communication (IPC) within concurrent systems often creates data races which can lead to bugs that are challenging to discover. Techniques that use Satisfiability Modulo Theories (SMT) problems to symbolically model possible executions of concurrent software have recently been proposed for use in the formal verification of software. In this work we describe a new technique for modeling executions of concurrent software that use a message passing API called MCAPI. Our technique uses an execution trace to create an SMT problem that symbolically models all possible concurrent executions and follows the same sequence of conditional branch outcomes as the provided execution trace. We check if there exists a satisfying assignment to the SMT problem with respect to specific safety properties. If such an assignment exists, it provides the conditions that lead to the violation of the property. We show how our method models behaviors of MCAPI applications that are ignored in previously published techniques.
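The check described above asks whether any execution consistent with the observed trace violates a safety property. The sketch below replaces the SMT encoding with brute-force enumeration of possible send/receive matchings, which is feasible only for toy cases; the two-sender, one-receiver scenario is hypothetical, not MCAPI's actual API.

```python
from itertools import permutations

sends = [("threadA", 1), ("threadB", 2)]      # (sender, payload)

def find_violation(property_holds):
    """Try every matching of the sends to the two receives in order; return
    the received values of a violating execution, if any."""
    for order in permutations(sends):
        received = [payload for _, payload in order]
        if not property_holds(received):
            return received                    # conditions leading to the violation
    return None

# Safety property the developer assumed: the first receive always sees payload 1
bad = find_violation(lambda received: received[0] == 1)
print(bad)  # a matching where threadB's message arrives first
```

An SMT-based encoding explores the same space symbolically, so it scales to traces where explicit enumeration is hopeless.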
USDA-ARS?s Scientific Manuscript database
A variety of nutrition symbols and rating systems are in use on the front of food packages. They are intended to help consumers make healthier food choices. One system, the American Heart Association Heart (AHA) Heart-Check Program, has evolved over time to incorporate current science-based recommen...
Query Language for Location-Based Services: A Model Checking Approach
NASA Astrophysics Data System (ADS)
Hoareau, Christian; Satoh, Ichiro
We present a model checking approach to the rationale, implementation, and applications of a query language for location-based services. Such query mechanisms are necessary so that users, objects, and/or services can effectively benefit from the location-awareness of their surrounding environment. The underlying data model is founded on a symbolic model of space organized in a tree structure. Once extended to a semantic model for modal logic, we regard location query processing as a model checking problem, and thus define location queries as hybrid logic-based formulas. Our approach is unique among existing research because it explores the connection between location models and query processing in ubiquitous computing systems, relies on a sound theoretical basis, and provides modal logic-based query mechanisms for expressive searches over a decentralized data structure. A prototype implementation is also presented and discussed.
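In the tree-structured symbolic space model described above, a containment query amounts to collecting the entities in a subtree. The sketch below evaluates a "located somewhere within" query by recursion; the building layout is made up, and the paper's hybrid-logic machinery is far richer than this.

```python
# Symbolic space tree: (name, children); leaves are entities.
space = ("building", [
    ("floor1", [
        ("room101", [("alice", []), ("printer", [])]),
        ("room102", [("bob", [])]),
    ]),
    ("floor2", [("room201", [])]),
])

def within(node, target):
    """All leaf entities transitively contained in the location `target`."""
    name, children = node
    if name == target:
        return collect(node)
    return [e for child in children for e in within(child, target)]

def collect(node):
    name, children = node
    if not children:
        return [name]
    return [e for child in children for e in collect(child)]

print(within(space, "floor1"))  # every entity located somewhere on floor1
```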
Safety Verification of a Fault Tolerant Reconfigurable Autonomous Goal-Based Robotic Control System
NASA Technical Reports Server (NTRS)
Braman, Julia M. B.; Murray, Richard M.; Wagner, David A.
2007-01-01
Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbolic model checkers. An example task is simulated in MDS and successfully verified using HyTech, a symbolic model checking software for linear hybrid systems.
Regression Verification Using Impact Summaries
NASA Technical Reports Server (NTRS)
Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana
2013-01-01
Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version.
Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound. In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries.
The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction and decomposition based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are:
- A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure.
- A proof that our approach is sound and complete with respect to the depth bound of symbolic execution.
- An implementation of our technique using the LLVM compiler infrastructure, the KLEE Symbolic Virtual Machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6].
- An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
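The partitioning step at the heart of the approach above can be sketched very simply: a path is impacted exactly when it passes through a changed location. The paths and change set below are hypothetical stand-ins for the summaries that symbolic execution with dependence information would produce.

```python
# Partition symbolic-execution paths into impacted and unimpacted sets;
# only the impacted paths need to be sent to the equivalence checker.

def partition(paths, changed_locations):
    """Split paths by whether they intersect the set of changed locations."""
    impacted = [p for p in paths if set(p) & changed_locations]
    unimpacted = [p for p in paths if not set(p) & changed_locations]
    return impacted, unimpacted

paths = [
    ["entry", "L3", "L7", "ret"],
    ["entry", "L4", "ret"],
    ["entry", "L3", "L9", "ret"],
]
changed = {"L7"}          # the only location edited between the two versions
impacted, unimpacted = partition(paths, changed)
print(len(impacted), len(unimpacted))  # one impacted path, two unimpacted
```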
Model Checker for Java Programs
NASA Technical Reports Server (NTRS)
Visser, Willem
2007-01-01
Java Pathfinder (JPF) is a verification and testing environment for Java that integrates model checking, program analysis, and testing. JPF consists of a custom-made Java Virtual Machine (JVM) that interprets bytecode, combined with a search interface to allow the complete behavior of a Java program to be analyzed, including interleavings of concurrent programs. JPF is implemented in Java, and its architecture is highly modular to support rapid prototyping of new features. JPF is an explicit-state model checker, because it enumerates all visited states and, therefore, suffers from the state-explosion problem inherent in analyzing large programs. It is suited to analyzing programs of less than 10 kLOC, but has been successfully applied to finding errors in concurrent programs of up to 100 kLOC. When an error is found, a trace from the initial state to the error is produced to guide the debugging. JPF works at the bytecode level, meaning that all of Java can be model-checked. By default, the software checks for all runtime errors (uncaught exceptions), assertion violations (it supports Java's assert), and deadlocks. JPF uses garbage collection and symmetry reductions of the heap during model checking to reduce state explosion, as well as dynamic partial order reductions to lower the number of interleavings analyzed. JPF is capable of symbolic execution of Java programs, including symbolic execution of complex data such as linked lists and trees. JPF is extensible as it allows for the creation of listeners that can subscribe to events during searches. The creation of dedicated code to be executed in place of regular classes is supported and allows users to easily handle native calls and to improve the efficiency of the analysis.
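The explicit-state search loop described above has a compact skeleton: enumerate reachable states, prune states already visited, and return a trace when an error state is hit. The sketch below checks a toy counter system; JPF performs the same search over full JVM states.

```python
from collections import deque

def model_check(initial, successors, is_error):
    """Breadth-first explicit-state search; return a counterexample trace or None."""
    visited = {initial}
    queue = deque([(initial, [initial])])
    while queue:
        state, trace = queue.popleft()
        if is_error(state):
            return trace                       # path from the initial state
        for nxt in successors(state):
            if nxt not in visited:             # state matching prunes revisits
                visited.add(nxt)
                queue.append((nxt, trace + [nxt]))
    return None

# Toy system: a counter that may step by 1 or 2 (capped at 9); error at exactly 5.
successors = lambda s: [min(s + 1, 9), min(s + 2, 9)]
print(model_check(0, successors, lambda s: s == 5))
```

Breadth-first order makes the returned trace a shortest path to the error, which is what makes such counterexamples useful for debugging.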
Automatic Testcase Generation for Flight Software
NASA Technical Reports Server (NTRS)
Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.
2008-01-01
The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) A blackbox approach that views the system as a black box, and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system. 2) A whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both these approaches are model checking and symbolic execution, as implemented in NASA Ames' Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing, which check only some of the system executions and therefore may miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammar. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive.
ICS's in-house coverage tools will be run to measure code coverage. Because the scripts exercise all parts of the grammar, we expect them to provide high code coverage. This blackbox approach is suitable for systems for which we do not have access to the source code. We are applying whitebox test generation to the Spacecraft Health INference Engine (SHINE) that is part of the ISHM system. In TacSat3, SHINE will execute an on-board knowledge base for fault detection and diagnosis. SHINE converts its knowledge base into optimized C code which runs onboard TacSat3. SHINE can translate its rules into an intermediate representation (Java) suitable for analysis with JPF. JPF will analyze SHINE's Java output using symbolic execution, producing testcases that can provide either complete or directed coverage of the code. Automatically generated test suites can provide full code coverage and be quickly regenerated when code changes. Because our tools analyze executable code, they fully cover the delivered code, not just models of the code. This approach also provides a way to generate tests that exercise specific sections of code under specific preconditions. This capability gives us more focused testing of specific sections of code.
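The blackbox approach above, enumerating every input script a grammar can produce up to a bound, can be sketched with a plain breadth of expansions. The tiny command grammar below is a made-up stand-in for the SCL grammar, and the depth bound plays the role of JPF's prespecified size limit.

```python
# Enumerate every sentence a context-free grammar can produce within a
# bounded number of expansion steps.

grammar = {
    "<script>": [["<cmd>"], ["<cmd>", ";", "<script>"]],
    "<cmd>":    [["set", "<val>"], ["reset"]],
    "<val>":    [["0"], ["1"]],
}

def generate(symbols, depth):
    """All fully expanded sentences reachable within `depth` expansions."""
    if depth < 0:
        return []
    for i, sym in enumerate(symbols):
        if sym in grammar:                      # expand the first nonterminal
            out = []
            for production in grammar[sym]:
                out += generate(symbols[:i] + production + symbols[i + 1:], depth - 1)
            return out
    return [" ".join(symbols)]                  # no nonterminals left: a sentence

scripts = generate(["<script>"], 6)
for s in sorted(set(scripts)):
    print(s)
```

Each generated script would then be fed to the interpreter under test, with coverage tools measuring how thoroughly the grammar-derived inputs exercise the code.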
Model Checking with Edge-Valued Decision Diagrams
NASA Technical Reports Server (NTRS)
Roux, Pierre; Siminiceanu, Radu I.
2010-01-01
We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions and its implementation in a model checking library. We provide efficient algorithms for manipulating EVMDDs and review the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention to represent in one formalism the best techniques available at the moment across a spectrum of existing tools. Compared to the CUDD package, our tool is several orders of magnitude faster.
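The edge-valued encoding idea can be illustrated with a minimal additive EVBDD over Boolean variables (a special case of EVMDDs): the value of f(x) is the sum of the weights on the edges selected by the assignment x, and structurally identical subgraphs are shared. This is a hand-rolled sketch of the encoding, not the library's data structure:

```python
# Minimal additive edge-valued decision diagram (EVBDD-style sketch).
# A node is (var_index, (else_weight, else_child), (then_weight, then_child));
# the single terminal is None; an edge is (weight, node).

TERMINAL = None  # one shared terminal node

def make_linear(coeffs):
    """Build a chain EVBDD for f(x0..xn-1) = sum(coeffs[i]*x[i]), x[i] in {0,1}."""
    node = TERMINAL
    for i in reversed(range(len(coeffs))):
        # else-edge carries weight 0, then-edge carries the coefficient;
        # both edges point to the SAME child object, so the subgraph is shared
        node = (i, (0, node), (coeffs[i], node))
    return (0, node)  # dangling incoming edge with weight 0

def evaluate(edge, assignment):
    """Value of the encoded function: sum of edge weights along the path."""
    weight, node = edge
    total = weight
    while node is not TERMINAL:
        var, lo, hi = node
        w, node = hi if assignment[var] else lo
        total += w
    return total

f = make_linear([2, 1])      # f(x0, x1) = 2*x0 + x1
```

The sharing is the point: a function like 2*x0 + x1 needs one node per variable here, whereas a multi-terminal diagram would need a distinct terminal per function value.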
NASA Astrophysics Data System (ADS)
Halkos, George E.; Tsilika, Kyriaki D.
2011-09-01
In this paper we examine the property of asymptotic stability in several dynamic economic systems, modeled as ordinary differential equations in the time parameter t. Asymptotic stability ensures intertemporal equilibrium for the economic quantity the solution stands for, regardless of what the initial conditions happen to be. Existence of economic equilibrium in continuous-time models is checked via a symbolic language, the Xcas program editor. Using stability theorems of differential equations as background, a brief overview of the symbolic capabilities of the free software Xcas is given. We present computational experience with a programming style for stability results of ordinary linear and nonlinear differential equations. Numerical experiments on traditional applications of economic dynamics exhibit the simplicity, clarity, and brevity of the input and output of our computer codes.
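For a linear system x' = Ax with a 2x2 matrix A, the asymptotic-stability check reduces to the Routh-Hurwitz conditions trace(A) < 0 and det(A) > 0, which force both eigenvalues to have negative real part. The sketch below is a numeric stand-in for what Xcas would do symbolically; the example matrix (a damped harmonic oscillator) is illustrative, not taken from the paper:

```python
# Routh-Hurwitz stability test for a 2x2 linear system x' = A x.

def is_asymptotically_stable_2x2(A):
    """True iff every eigenvalue of A has negative real part."""
    (a, b), (c, d) = A
    trace = a + d
    det = a * d - b * c
    return trace < 0 and det > 0

# damped harmonic oscillator x'' + x' + x = 0 written as a first-order system
A = [[0.0, 1.0],
     [-1.0, -1.0]]
print(is_asymptotically_stable_2x2(A))  # -> True
```

For higher dimensions the same question is answered by checking the real parts of all eigenvalues, which is where a symbolic system earns its keep.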
Thermal Model Development for Ares I-X
NASA Technical Reports Server (NTRS)
Amundsen, Ruth M.; DelCorso, Joe
2008-01-01
Thermal analysis for the Ares I-X vehicle has involved extensive thermal model integration, since thermal models of vehicle elements came from several different NASA and industry organizations. Many valuable lessons were learned in terms of model integration and validation. Modeling practices such as submodel, analysis group and symbol naming were standardized to facilitate the later model integration. Upfront coordination of coordinate systems, timelines, units, symbols and case scenarios was very helpful in minimizing integration rework. A process for model integration was developed that included pre-integration runs and basic checks of both models, and a step-by-step process to efficiently integrate one model into another. Extensive use of model logic was used to create scenarios and timelines for avionics and air flow activation. Efficient methods of model restart between case scenarios were developed. Standardization of software version and even compiler version between organizations was found to be essential. An automated method for applying aeroheating to the full integrated vehicle model, including submodels developed by other organizations, was developed.
A Multi-Encoding Approach for LTL Symbolic Satisfiability Checking
NASA Technical Reports Server (NTRS)
Rozier, Kristin Y.; Vardi, Moshe Y.
2011-01-01
Formal behavioral specifications written early in the system-design process and communicated across all design phases have been shown to increase the efficiency, consistency, and quality of the system under development. To prevent introducing design or verification errors, it is crucial to test specifications for satisfiability. Our focus here is on specifications expressed in linear temporal logic (LTL). We introduce a novel encoding of symbolic transition-based Buchi automata and a novel, "sloppy," transition encoding, both of which result in improved scalability. We also define novel BDD variable orders based on tree decomposition of formula parse trees. We describe and extensively test a new multi-encoding approach utilizing these novel encoding techniques to create 30 encoding variations. We show that our novel encodings translate to significant, sometimes exponential, improvement over the current standard encoding for symbolic LTL satisfiability checking.
Online Tester for a Symbol Generator
NASA Technical Reports Server (NTRS)
Juday, D.; Mcconaugy, K.
1985-01-01
About 95 percent of faults detected. Programmable instrument periodically checks for failures in system that generates alphanumerical and other symbol voltages for cathode-ray-tube displays. Symbol-generator tester compares gated test-point voltages with predetermined voltage limits while circuit under test performs commanded operation. A go/no-go indication given, depending on whether test voltage is or is not within its specification. Tester in plug-in modular form, temporarily wired to generator test points, or permanently wired to these points.
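The comparison logic of such a tester is simple enough to state directly: each gated test-point voltage is checked against its predetermined limits and a single go/no-go verdict is produced. Test-point names and limits below are invented for illustration:

```python
# Go/no-go limit check in the spirit of the symbol-generator tester.

LIMITS = {                      # test point -> (lower, upper) in volts (hypothetical)
    "stroke_x": (1.0, 2.0),
    "stroke_y": (1.0, 2.0),
    "intensity": (4.5, 5.5),
}

def go_no_go(readings):
    """Return 'GO' if every gated reading is within its limits, else 'NO-GO'."""
    for point, (lo, hi) in LIMITS.items():
        if not (lo <= readings[point] <= hi):
            return "NO-GO"
    return "GO"
```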
ERIC Educational Resources Information Center
Schilling, Tim
Thirty years ago a cashless society was predicted for the near future; paper currency and checks would be an antiquated symbol of the past. Consumers would embrace a new alternative for making payments: electronic money. But currency is still used for 87% of payments, mainly for "nickel and dime" purchases. And checks are the payment…
SimCheck: An Expressive Type System for Simulink
NASA Technical Reports Server (NTRS)
Roy, Pritam; Shankar, Natarajan
2010-01-01
MATLAB Simulink is a member of a class of visual languages that are used for modeling and simulating physical and cyber-physical systems. A Simulink model consists of blocks with input and output ports connected using links that carry signals. We extend the type system of Simulink with annotations and dimensions/units associated with ports and links. These types can capture invariants on signals as well as relations between signals. We define a type-checker that checks the well-formedness of Simulink blocks with respect to these type annotations. The type checker generates proof obligations that are solved by SRI's Yices solver for satisfiability modulo theories (SMT). This translation can be used to detect type errors, demonstrate counterexamples, generate test cases, or prove the absence of type errors. Our work is an initial step toward the symbolic analysis of MATLAB Simulink models.
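The proof-obligation idea can be sketched concretely: a link carries a range annotation, and a block is well-formed only if its computed output range implies the annotated one. Real SimCheck discharges such obligations with the Yices SMT solver; plain interval arithmetic stands in for it here, and the block and annotations are invented for the example:

```python
# Toy range-annotation check in the flavor of SimCheck's proof obligations.

def gain_output_range(k, in_range):
    """Output range of a Gain block y = k*x for x in [lo, hi]."""
    lo, hi = in_range
    a, b = k * lo, k * hi
    return (min(a, b), max(a, b))

def check_annotation(computed, annotated):
    """Obligation: the computed range must lie inside the annotated range."""
    return annotated[0] <= computed[0] and computed[1] <= annotated[1]

computed = gain_output_range(-2.0, (0.0, 1.0))   # -> (-2.0, 0.0)
ok = check_annotation(computed, (-3.0, 0.0))     # obligation holds
```

An SMT solver plays the same role for arbitrary block expressions, where interval arithmetic would be too coarse: it either proves the implication or returns a counterexample signal value.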
Verus: A Tool for Quantitative Analysis of Finite-State Real-Time Systems.
1996-08-12
Symbolic model checking is a technique for verifying finite-state concurrent systems that has been extended to handle real-time systems. Models with up to 10^30 states can often be verified in minutes. In this paper, we present a new tool to analyze real-time systems, based on this technique. We have designed a language, called Verus, for the description of real-time systems. Such a description is compiled into a state-transition graph and
NASA Technical Reports Server (NTRS)
Rocha, Camilo; Meseguer, Jose; Munoz, Cesar A.
2013-01-01
Combining symbolic techniques such as (i) SMT solving, (ii) rewriting modulo theories, and (iii) model checking can enable the analysis of infinite-state systems outside the scope of each such technique. This paper proposes rewriting modulo SMT as a new technique combining the powers of (i)-(iii) and ideally suited to model and analyze infinite-state open systems; that is, systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism due to the system, and external non-determinism due to the environment. They are not amenable to finite-state model checking analysis because they typically are infinite-state. By being reducible to standard rewriting using reflective techniques, rewriting modulo SMT can both naturally model and analyze open systems without requiring any changes to rewriting-based reachability analysis techniques for closed systems. This is illustrated by the analysis of a real-time system beyond the scope of timed automata methods.
A Comparative Study : Microprogrammed Vs Risc Architectures For Symbolic Processing
NASA Astrophysics Data System (ADS)
Heudin, J. C.; Metivier, C.; Demigny, D.; Maurin, T.; Zavidovique, B.; Devos, F.
1987-05-01
It is often claimed that conventional computers are not well suited for human-like tasks: Vision (Image Processing), Intelligence (Symbolic Processing)... In the particular case of Artificial Intelligence, dynamic type-checking is one example of a basic task that must be improved. The solution implemented in most Lisp workstations consists of a microprogrammed architecture with a tagged memory. Another way to gain efficiency is to design an instruction set well suited for symbolic processing, which reduces the semantic gap between the high-level language and the machine code. In this framework, the RISC concept provides a convenient approach to study new architectures for symbolic processing. This paper compares both approaches and describes our project of designing a compact symbolic processor for Artificial Intelligence applications.
Truke, a web tool to check for and handle excel misidentified gene symbols.
Mallona, Izaskun; Peinado, Miguel A
2017-03-21
Genomic datasets accompanying scientific publications show a surprisingly high rate of gene name corruption. This error is generated when files and tables are imported into Microsoft Excel and certain gene symbols are automatically converted into dates. We have developed Truke, a flexible Web tool to detect, tag and fix, if possible, such misconversions. In addition, Truke is language- and regional-locale-aware, providing file format customization (decimal symbol, field separator, etc.) following the user's preferences. Truke is a data format conversion tool with a unique corrupted gene symbol detection utility. Truke is freely available without registration at http://maplab.cat/truke.
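The misconversion Truke targets is easy to demonstrate: on import, gene symbols such as SEPT2 or MARCH1 silently become date-shaped cells like "2-Sep" or "1-Mar". A minimal detector-and-fixer is sketched below; the date pattern and the partial reverse mapping are illustrative and far short of Truke's locale-aware coverage:

```python
# Minimal detector for Excel date-misconverted gene symbols.

import re

# date-shaped cells like "2-Sep" (Excel's default English rendering; an assumption)
DATE_PATTERN = re.compile(r"^\d{1,2}-(Jan|Feb|Mar|Apr|Jun|Jul|Sep|Oct|Nov|Dec)$")
MONTH_TO_GENE_PREFIX = {"Sep": "SEPT", "Mar": "MARCH", "Dec": "DEC"}  # partial map

def fix_cell(cell):
    """Flag a date-shaped cell and, when the mapping is known, undo it."""
    if not DATE_PATTERN.match(cell):
        return cell, False            # looks like an ordinary symbol
    day, month = cell.split("-")
    prefix = MONTH_TO_GENE_PREFIX.get(month)
    return (prefix + day if prefix else cell), True

print(fix_cell("2-Sep"))   # -> ('SEPT2', True)
```

A real tool must also handle regional date formats and decimal/field separators, which is exactly the locale-awareness the abstract describes.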
Proceedings of the Second NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Munoz, Cesar (Editor)
2010-01-01
This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.
Spin wave Feynman diagram vertex computation package
NASA Astrophysics Data System (ADS)
Price, Alexander; Javernick, Philip; Datta, Trinanjan
Spin wave theory is a well-established theoretical technique that can correctly predict the physical behavior of ordered magnetic states. However, computing the effects of an interacting spin wave theory incorporating magnons involves a laborious by-hand derivation of Feynman diagram vertices. The process is tedious and time consuming. Hence, to improve productivity and have another means to check the analytical calculations, we have devised a Feynman Diagram Vertex Computation package. In this talk, we will describe our research group's effort to implement a Mathematica-based symbolic Feynman diagram vertex computation package that computes spin wave vertices. Utilizing the non-commutative algebra package NCAlgebra as an add-on to Mathematica, symbolic expressions for the Feynman diagram vertices of a Heisenberg quantum antiferromagnet are obtained. Our existing code reproduces the well-known expressions of a nearest-neighbor square lattice Heisenberg model. We also discuss the case of a triangular lattice Heisenberg model where non-collinear terms contribute to the vertex interactions.
Model Based Analysis and Test Generation for Flight Software
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep
2009-01-01
We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.
Experimental Evaluation of a Planning Language Suitable for Formal Verification
NASA Technical Reports Server (NTRS)
Butler, Rick W.; Munoz, Cesar A.; Siminiceanu, Radu I.
2008-01-01
The marriage of model checking and planning faces two seemingly diverging alternatives: the need for a planning language expressive enough to capture the complexity of real-life applications, as opposed to a language simple, yet robust enough to be amenable to exhaustive verification and validation techniques. In an attempt to reconcile these differences, we have designed an abstract plan description language, ANMLite, inspired from the Action Notation Modeling Language (ANML) [17]. We present the basic concepts of the ANMLite language as well as an automatic translator from ANMLite to the model checker SAL (Symbolic Analysis Laboratory) [7]. We discuss various aspects of specifying a plan in terms of constraints and explore the implications of choosing a robust logic behind the specification of constraints, rather than simply propose a new planning language. Additionally, we provide an initial assessment of the efficiency of model checking to search for solutions of planning problems. To this end, we design a basic test benchmark and study the scalability of the generated SAL models in terms of plan complexity.
A complex symbol signal-to-noise ratio estimator and its performance
NASA Technical Reports Server (NTRS)
Feria, Y.
1994-01-01
This article presents an algorithm for estimating the signal-to-noise ratio (SNR) of signals that contain data on a downconverted suppressed carrier or the first harmonic of a square-wave subcarrier. This algorithm can be used to determine the performance of the full-spectrum combiner for the Galileo S-band (2.2- to 2.3-GHz) mission by measuring the input and output symbol SNR. A performance analysis of the algorithm shows that the estimator can estimate the complex symbol SNR using 10,000 symbols at a true symbol SNR of -5 dB with a mean of -4.9985 dB and a standard deviation of 0.2454 dB, and these analytical results are checked by simulations of 100 runs with a mean of -5.06 dB and a standard deviation of 0.2506 dB.
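The moment-based flavor of such an estimator can be sketched for the data-aided case with BPSK-style +/-1 symbols in complex noise: wipe off the modulation with the known data, then split the result into a coherent mean (signal) and residual power (noise). This is an illustration of the estimation idea, not the split-symbol estimator analyzed in the article:

```python
# Data-aided moment-based symbol-SNR estimator (sketch).

import math
import random

def estimate_snr_db(received, data):
    """Estimate SNR (dB) from received symbols and the known +/-1 data."""
    n = len(received)
    wiped = [r * d for r, d in zip(received, data)]     # remove modulation
    mean = sum(wiped) / n                                # coherent signal estimate
    noise_power = sum(abs(w - mean) ** 2 for w in wiped) / n
    return 10 * math.log10(abs(mean) ** 2 / noise_power)

# simulate 10,000 symbols at a true symbol SNR of -5 dB (matching the article's
# operating point; amplitude and noise model are this sketch's assumptions)
random.seed(1)
amp = 1.0
sigma = amp / math.sqrt(10 ** (-5.0 / 10))               # total complex-noise std
data = [random.choice((-1, 1)) for _ in range(10000)]
received = [amp * d + complex(random.gauss(0, sigma / math.sqrt(2)),
                              random.gauss(0, sigma / math.sqrt(2)))
            for d in data]
est = estimate_snr_db(received, data)
```

With 10,000 symbols the estimate lands within a fraction of a dB of the -5 dB truth, consistent with the standard deviations reported above.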
Timing analysis by model checking
NASA Technical Reports Server (NTRS)
Naydich, Dimitri; Guaspari, David
2000-01-01
The safety of modern avionics relies on high integrity software that can be verified to meet hard real-time requirements. The limits of verification technology therefore determine acceptable engineering practice. To simplify verification problems, safety-critical systems are commonly implemented under the severe constraints of a cyclic executive, which make design an expensive trial-and-error process highly intolerant of change. Important advances in analysis techniques, such as rate monotonic analysis (RMA), have provided a theoretical and practical basis for easing these onerous restrictions. But RMA and its kindred have two limitations: they apply only to verifying the requirement of schedulability (that tasks meet their deadlines) and they cannot be applied to many common programming paradigms. We address both these limitations by applying model checking, a technique with successful industrial applications in hardware design. Model checking algorithms analyze finite state machines, either by explicit state enumeration or by symbolic manipulation. Since quantitative timing properties involve a potentially unbounded state variable (a clock), our first problem is to construct a finite approximation that is conservative for the properties being analyzed: if the approximation satisfies the properties of interest, so does the infinite model. To reduce the potential for state space explosion we must further optimize this finite model. Experiments with some simple optimizations have yielded a hundred-fold efficiency improvement over published techniques.
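Of the two traversal styles mentioned above, explicit state enumeration is the simpler to show: breadth-first search over the reachable states, reporting any state that violates a safety property. The two-counter system below is invented purely to exercise the checker:

```python
# Minimal explicit-state model checker for a safety property
# ("no bad state is reachable"), via BFS over the state graph.

from collections import deque

def check_safety(initial, successors, is_bad):
    """Return a reachable bad state (a counterexample witness), or None."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if is_bad(state):
            return state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None

# toy system: two counters that may each tick up to 3
def successors(state):
    a, b = state
    out = []
    if a < 3: out.append((a + 1, b))
    if b < 3: out.append((a, b + 1))
    return out

counterexample = check_safety((0, 0), successors, lambda s: s == (3, 3))
```

The `seen` set is exactly the reached state space; symbolic checkers replace it (and the frontier) with decision-diagram representations so that far larger spaces fit in memory.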
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2007-01-01
This report presents the mechanical verification of a simplified model of a rapid Byzantine-fault-tolerant self-stabilizing protocol for distributed clock synchronization systems. This protocol does not rely on any assumptions about the initial state of the system. This protocol tolerates bursts of transient failures, and deterministically converges within a time bound that is a linear function of the self-stabilization period. A simplified model of the protocol is verified using the Symbolic Model Verifier (SMV) [SMV]. The system under study consists of 4 nodes, where at most one of the nodes is assumed to be Byzantine faulty. The model checking effort is focused on verifying correctness of the simplified model of the protocol in the presence of a permanent Byzantine fault as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period. Although model checking results of the simplified model of the protocol confirm the theoretical predictions, these results do not necessarily confirm that the protocol solves the general case of this problem. Modeling challenges of the protocol and the system are addressed. A number of abstractions are utilized in order to reduce the state space. Also, additional innovative state space reduction techniques are introduced that can be used in future verification efforts applied to this and other protocols.
NASA Technical Reports Server (NTRS)
Bensalem, Saddek; Ganesh, Vijay; Lakhnech, Yassine; Munoz, Cesar; Owre, Sam; Ruess, Harald; Rushby, John; Rusu, Vlad; Saiedi, Hassen; Shankar, N.
2000-01-01
To become practical for assurance, automated formal methods must be made more scalable, automatic, and cost-effective. Such an increase in scope, scale, automation, and utility can be derived from an emphasis on a systematic separation of concerns during verification. SAL (Symbolic Analysis Laboratory) attempts to address these issues. It is a framework for combining different tools to calculate properties of concurrent systems. The heart of SAL is a language, developed in collaboration with Stanford, Berkeley, and Verimag, for specifying concurrent systems in a compositional way. Our instantiation of the SAL framework augments PVS with tools for abstraction, invariant generation, program analysis (such as slicing), theorem proving, and model checking to separate concerns as well as calculate properties (i.e., perform symbolic analysis) of concurrent systems. We describe the motivation, the language, the tools, their integration in SAL/PAS, and some preliminary experience of their use.
Pilotless Frame Synchronization Using LDPC Code Constraints
NASA Technical Reports Server (NTRS)
Jones, Christopher; Vissasenor, John
2009-01-01
A method of pilotless frame synchronization has been devised for low-density parity-check (LDPC) codes. In pilotless frame synchronization, there are no pilot symbols; instead, the offset is estimated by exploiting selected aspects of the structure of the code. The advantage of pilotless frame synchronization is that the bandwidth of the signal is reduced by an amount associated with elimination of the pilot symbols. The disadvantage is an increase in the amount of receiver data processing needed for frame synchronization.
Throughput Optimization Via Adaptive MIMO Communications
2006-05-30
End-to-end MATLAB packet simulation platform. Low-density parity-check code (LDPCC). Field trials with Silvus DSP MIMO testbed. High mobility... incorporate advanced LDPC (low-density parity-check) codes. Realizing that the power of LDPC codes comes at the price of decoder complexity, we also... Channel Coding: Binary Convolutional Code or LDPC; Packet Length: 0 to 2^16-1 bytes; Coding Rate: 1/2, 2/3, 3/4, 5/6; MIMO Channel Training Length: 0 to 4 symbols
31 CFR 240.17 - Powers of attorney.
Code of Federal Regulations, 2012 CFR
2012-07-01
... symbol numbers, date of issue, amount, and name of payee). (b) General powers of attorney. Checks may be... attorney forms are listed in the appendix to this part. These forms are available on the FMS website at...
Qin, Heng; Zuo, Yong; Zhang, Dong; Li, Yinghui; Wu, Jian
2017-03-06
Through slight modification of typical photomultiplier tube (PMT) receiver output statistics, a generalized received response model considering both scattered propagation and random detection is presented to investigate the impact of inter-symbol interference (ISI) on the link data rate of short-range non-line-of-sight (NLOS) ultraviolet communication. Numerical simulations show good agreement with experimental results. Based on the received response characteristics, a heuristic check matrix construction algorithm for low-density parity-check (LDPC) codes is further proposed to approach the data rate bound derived in a delayed sampling (DS) binary pulse position modulation (PPM) system. Compared to conventional LDPC coding methods, a better bit error ratio (BER) below 1E-05 is achieved for short-range NLOS UVC systems operating at a data rate of 2 Mbps.
Formal Verification Toolkit for Requirements and Early Design Stages
NASA Technical Reports Server (NTRS)
Badger, Julia M.; Miller, Sheena Judson
2011-01-01
Efficient flight software development from natural language requirements needs an effective way to test designs earlier in the software design cycle. A method to automatically derive logical safety constraints and the design state space from natural language requirements is described. The constraints can then be checked using a logical consistency checker and also be used in a symbolic model checker to verify the early design of the system. This method was used to verify a hybrid control design for the suit ports on NASA Johnson Space Center's Space Exploration Vehicle against safety requirements.
Property Differencing for Incremental Checking
NASA Technical Reports Server (NTRS)
Yang, Guowei; Khurshid, Sarfraz; Person, Suzette; Rungta, Neha
2014-01-01
This paper introduces iProperty, a novel approach that facilitates incremental checking of programs based on a property differencing technique. Specifically, iProperty aims to reduce the cost of checking properties as they are initially developed and as they co-evolve with the program. The key novelty of iProperty is to compute the differences between the new and old versions of expected properties to reduce the number and size of the properties that need to be checked during the initial development of the properties. Furthermore, property differencing is used in synergy with program behavior differencing techniques to optimize common regression scenarios, such as detecting regression errors or checking feature additions for conformance to new expected properties. Experimental results in the context of symbolic execution of Java programs annotated with properties written as assertions show the effectiveness of iProperty in utilizing change information to enable more efficient checking.
Performance of Low-Density Parity-Check Coded Modulation
NASA Astrophysics Data System (ADS)
Hamkins, J.
2011-02-01
This article presents the simulated performance of a family of nine AR4JA low-density parity-check (LDPC) codes when used with each of five modulations. In each case, the decoder inputs are code-bit log-likelihood ratios computed from the received (noisy) modulation symbols using a general formula which applies to arbitrary modulations. Suboptimal soft-decision and hard-decision demodulators are also explored. Bit-interleaving and various mappings of bits to modulation symbols are considered. A number of subtle decoder algorithm details are shown to affect performance, especially in the error floor region. Among these are quantization dynamic range and step size, clipping degree-one variable nodes, "Jones clipping" of variable nodes, approximations of the min* function, and partial hard-limiting of messages from check nodes. Using these decoder optimizations, all coded modulations simulated here are free of error floors down to codeword error rates below 10^{-6}. The purpose of generating this performance data is to aid system engineers in determining an appropriate code and modulation to use under specific power and bandwidth constraints, and to provide information needed to design a variable/adaptive coded modulation (VCM/ACM) system using the AR4JA codes. (IPNPR Volume 42-185)
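One decoder detail mentioned above, approximating the min* function used in check-node processing, is compact enough to show. Under the common definition min*(a, b) = -ln(e^-a + e^-b), the function rewrites exactly as a min plus a bounded correction term, and hardware decoders often substitute a fixed offset for that correction. The 0.375 offset and 2.0 threshold below are typical illustrative choices, not values taken from the article:

```python
# Exact min* and a cheap offset-min approximation for LDPC check nodes.

import math

def min_star_exact(a, b):
    """min*(a,b) = -ln(e^-a + e^-b) = min(a,b) - ln(1 + e^-|a-b|)."""
    return min(a, b) - math.log1p(math.exp(-abs(a - b)))

def min_star_approx(a, b, offset=0.375, threshold=2.0):
    """Offset-min approximation: apply a fixed correction only when the
    inputs are close, where the true correction term is largest."""
    return min(a, b) - (offset if abs(a - b) < threshold else 0.0)
```

The correction term ln(1 + e^-|a-b|) lies in (0, ln 2], so the approximation error is bounded by a fraction of a dB per node; whether that matters shows up precisely in the error-floor region the article studies.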
NASA Astrophysics Data System (ADS)
Franco, Patrick; Ogier, Jean-Marc; Loonis, Pierre; Mullot, Rémy
Recently we have developed a model for shape description and matching. Based on minimum spanning tree construction and specific stages like the mixture, it seems to have many desirable properties. Recognition invariance for shifted, rotated, and noisy shapes was checked through medium-scale tests on the GREC symbol reference database. Even if extracting the topology of a shape by mapping the shortest path connecting all the pixels is powerful, the construction of the graph incurs an expensive algorithmic cost. In this article we discuss ways to reduce computing time. An alternative solution based on image compression concepts is provided and evaluated. The model no longer operates in the image space but in a compact space, namely the Discrete Cosine space. The use of the block discrete cosine transform is discussed and justified. The experimental results obtained on the GREC2003 database show that the proposed method is characterized by a good discrimination power and a real robustness to noise, with an acceptable computing time.
Verification of a Byzantine-Fault-Tolerant Self-stabilizing Protocol for Clock Synchronization
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2008-01-01
This paper presents the mechanical verification of a simplified model of a rapid Byzantine-fault-tolerant self-stabilizing protocol for distributed clock synchronization systems. This protocol does not rely on any assumptions about the initial state of the system except for the presence of sufficient good nodes, thus making the weakest possible assumptions and producing the strongest results. This protocol tolerates bursts of transient failures, and deterministically converges within a time bound that is a linear function of the self-stabilization period. A simplified model of the protocol is verified using the Symbolic Model Verifier (SMV). The system under study consists of 4 nodes, where at most one of the nodes is assumed to be Byzantine faulty. The model checking effort is focused on verifying correctness of the simplified model of the protocol in the presence of a permanent Byzantine fault as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period. Although model checking results of the simplified model of the protocol confirm the theoretical predictions, these results do not necessarily confirm that the protocol solves the general case of this problem. Modeling challenges of the protocol and the system are addressed. A number of abstractions are utilized in order to reduce the state space.
NASA Technical Reports Server (NTRS)
Stoutemyer, D. R.
1977-01-01
The computer algebra language MACSYMA enables the programmer to include symbolic physical units in computer calculations, and features automatic detection of dimensionally-inhomogeneous formulas and conversion of inconsistent units in a dimensionally homogeneous formula. Some examples illustrate these features.
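The two MACSYMA features described, automatic detection of dimensionally inhomogeneous formulas and unit propagation through arithmetic, can be mimicked with a tiny quantity type. This is a sketch of the idea, not MACSYMA's actual design:

```python
# Minimal symbolic-units arithmetic: multiplication propagates unit
# exponents; adding dimensionally inhomogeneous terms is rejected.

class Quantity:
    def __init__(self, value, units):
        self.value = value
        self.units = units            # e.g. {"m": 1, "s": -2}

    def __add__(self, other):
        if self.units != other.units:
            raise ValueError("dimensionally inhomogeneous sum")
        return Quantity(self.value + other.value, self.units)

    def __mul__(self, other):
        units = dict(self.units)
        for dim, exp in other.units.items():
            units[dim] = units.get(dim, 0) + exp
            if units[dim] == 0:       # drop cancelled dimensions
                del units[dim]
        return Quantity(self.value * other.value, units)

g = Quantity(9.8, {"m": 1, "s": -2})  # an acceleration
t = Quantity(2.0, {"s": 1})           # a time
dist = g * t * t                      # units reduce to meters
```

A full system would also convert between consistent units (e.g. feet to meters) inside a homogeneous formula, which is the second MACSYMA feature mentioned above.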
NASA Astrophysics Data System (ADS)
Wang, Liming; Qiao, Yaojun; Yu, Qian; Zhang, Wenbo
2016-04-01
We introduce a watermark non-binary low-density parity-check (NB-LDPC) code scheme, which can estimate the time-varying noise variance by using prior information of watermark symbols, to improve the performance of NB-LDPC codes. Compared with the prior-art counterpart, the watermark scheme brings about a 0.25 dB improvement in net coding gain (NCG) at a bit error rate (BER) of 1e-6 and a 36.8-81% reduction in the number of iterations. Thus, the proposed scheme shows great potential in terms of error correction performance and decoding efficiency.
What Information is Stored in DNA: Does it Contain Digital Error Correcting Codes?
NASA Astrophysics Data System (ADS)
Liebovitch, Larry
1998-03-01
The longest term correlations in living systems are the information stored in DNA which reflects the evolutionary history of an organism. The 4 bases (A,T,G,C) encode sequences of amino acids as well as locations of binding sites for proteins that regulate DNA. The fidelity of this important information is maintained by ANALOG error check mechanisms. When a single strand of DNA is replicated the complementary base is inserted in the new strand. Sometimes the wrong base is inserted that sticks out disrupting the phosphate backbone. The new base is not yet methylated, so repair enzymes, that slide along the DNA, can tear out the wrong base and replace it with the right one. The bases in DNA form a sequence of 4 different symbols and so the information is encoded in a DIGITAL form. All the digital codes in our society (ISBN book numbers, UPC product codes, bank account numbers, airline ticket numbers) use error checking codes, where some digits are functions of other digits to maintain the fidelity of transmitted information. Does DNA also utilize a DIGITAL error checking code to maintain the fidelity of its information and increase the accuracy of replication? That is, are some bases in DNA functions of other bases upstream or downstream? This raises an interesting mathematical problem: How does one determine whether some symbols in a sequence of symbols are a function of other symbols? It also bears on the issue of determining algorithmic complexity: What is the function that generates the shortest algorithm for reproducing the symbol sequence? The error checking codes most used in our technology are linear block codes. We developed an efficient method to test for the presence of such codes in DNA. We coded the 4 bases as (0,1,2,3) and used Gaussian elimination, modified for modulus 4, to test if some bases are linear combinations of other bases. We used this method to analyze the base sequence in the genes from the lac operon and cytochrome C.
We did not find evidence for such error correcting codes in these genes. However, we analyzed only a small amount of DNA, and if digital error correcting schemes are present in DNA, they may be more subtle than such simple linear block codes. The basic issue we raise here is how information is stored in DNA, and an appreciation that digital symbol sequences, such as DNA, admit of interesting schemes to store and protect the fidelity of their information content. Liebovitch, Tao, Todorov, Levine. 1996. Biophys. J. 71:1539-1544. Supported by NIH grant EY6234.
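The test described above can be sketched in Python. This is a hedged illustration: the mapping A,T,G,C → 0,1,2,3 follows the abstract, but the brute-force coefficient search below stands in for the authors' modulus-4 Gaussian elimination, and the toy sequences are invented.

```python
from itertools import product

# Map bases to Z4 symbols, as in the abstract: A,T,G,C -> 0,1,2,3
CODE = {"A": 0, "T": 1, "G": 2, "C": 3}

def is_linear_combination(sequences, target_index):
    """Check whether the base at `target_index` of every sequence is the same
    linear combination (mod 4) of the other positions.  Brute-force search
    over coefficient vectors; a stand-in for the modified Gaussian
    elimination the authors describe, practical only for short windows."""
    n = len(sequences[0])
    others = [i for i in range(n) if i != target_index]
    for coeffs in product(range(4), repeat=len(others)):
        if all(sum(c * CODE[s[i]] for c, i in zip(coeffs, others)) % 4
               == CODE[s[target_index]] for s in sequences):
            return coeffs
    return None

# Toy data in which position 2 equals (pos0 + pos1) mod 4 in every sequence.
seqs = ["ATT", "TTG", "GGA", "TGC"]
print(is_linear_combination(seqs, 2))  # -> (1, 1)
```

A `None` result, as the authors found for the lac operon and cytochrome C genes, means no single linear rule fits all the sequences.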
My Corporis Fabrica: an ontology-based tool for reasoning and querying on complex anatomical models
2014-01-01
Background Multiple models of anatomy have been developed independently and for different purposes. In particular, 3D graphical models are especially useful for visualizing the different organs composing the human body, while ontologies such as FMA (Foundational Model of Anatomy) are symbolic models that provide a unified formal description of anatomy. Despite its comprehensive coverage of anatomical structures, the lack of formal descriptions of anatomical functions in FMA limits its usage in many applications. In addition, the absence of connections between 3D models and anatomical ontologies makes it difficult and time-consuming to set up and access the anatomical content of complex 3D objects. Results First, we provide a new ontology of anatomy called My Corporis Fabrica (MyCF), which conforms to FMA but extends it by making explicit how anatomical structures are composed, how they contribute to functions, and also how they can be related to complex 3D objects. Second, we have equipped MyCF with automatic reasoning capabilities that enable model checking and the answering of complex queries. We illustrate the added value of such a declarative approach for interactive simulation and visualization as well as for teaching applications. Conclusions The novel vision of ontologies that we have developed in this paper enables a declarative assembly of different models to obtain composed models guaranteed to be anatomically valid while capturing the complexity of human anatomy. The main interest of this approach is its declarativity, which makes it possible for domain experts to enrich the knowledge base at any moment through simple editors without having to change the algorithmic machinery.
This gives the MyCF software environment the flexibility to process and add semantics for various applications that incorporate not only 3D geometric models representing anatomical entities but also symbolic information such as anatomical functions. PMID:24936286
Model Checking A Self-Stabilizing Synchronization Protocol for Arbitrary Digraphs
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2012-01-01
This report presents the mechanical verification of a self-stabilizing distributed clock synchronization protocol for arbitrary digraphs in the absence of faults. This protocol does not rely on assumptions about the initial state of the system, other than the presence of at least one node, and no central clock or a centrally generated signal, pulse, or message is used. The system under study is an arbitrary, non-partitioned digraph ranging from fully connected to 1-connected networks of nodes while allowing for differences in the network elements. Nodes are anonymous, i.e., they do not have unique identities. There is no theoretical limit on the maximum number of participating nodes. The only constraint on the behavior of the node is that the interactions with other nodes are restricted to defined links and interfaces. This protocol deterministically converges within a time bound that is a linear function of the self-stabilization period. A bounded model of the protocol is verified using the Symbolic Model Verifier (SMV) for a subset of digraphs. Modeling challenges of the protocol and the system are addressed. The model checking effort is focused on verifying correctness of the bounded model of the protocol as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period.
Real-Time System Verification by Kappa-Induction
NASA Technical Reports Server (NTRS)
Pike, Lee S.
2005-01-01
We report the first formal verification of a reintegration protocol for a safety-critical, fault-tolerant, real-time distributed embedded system. A reintegration protocol increases system survivability by allowing a node that has suffered a fault to regain state consistent with the operational nodes. The protocol is verified in the Symbolic Analysis Laboratory (SAL), where bounded model checking and decision procedures are used to verify infinite-state systems by k-induction. The protocol and its environment are modeled as synchronizing timeout automata. Because k-induction is exponential with respect to k, we optimize the formal model to reduce the size of k. Also, the reintegrator's event-triggered behavior is conservatively modeled as time-triggered behavior to further reduce the size of k and to make it invariant to the number of nodes modeled. A corollary is that a clique avoidance property is satisfied.
Modeling Regular Replacement for String Constraint Solving
NASA Technical Reports Server (NTRS)
Fu, Xiang; Li, Chung-Chih
2010-01-01
Bugs in user input sanitization of software systems often lead to vulnerabilities. Many of them are caused by improper use of regular replacement. This paper presents a precise modeling of the various semantics of regular substitution, such as the declarative, finite, greedy, and reluctant semantics, using finite state transducers (FST). By projecting an FST to its input/output tapes, we are able to solve atomic string constraints, which can be applied to both forward and backward image computation in model checking and symbolic execution of text-processing programs. We report several interesting discoveries, e.g., certain fragments of the general problem can be handled using less expressive deterministic FSTs. A compact representation of FSTs is implemented in SUSHI, a string constraint solver. It is applied to detecting vulnerabilities in web applications.
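Python's built-in `re` module can illustrate the greedy versus reluctant semantics that the paper models with FSTs (this is only an analogy; the paper's SUSHI solver works on transducers, not on Python regexes):

```python
import re

s = "aaab"
# Greedy: 'a+' consumes the longest run of a's before each replacement.
print(re.sub(r"a+", "X", s))   # -> "Xb"
# Reluctant (lazy): 'a+?' matches the shortest possible run, one 'a' at a time.
print(re.sub(r"a+?", "X", s))  # -> "XXXb"
```

The two semantics yield different output languages for the same pattern, which is why a string constraint solver must model each replacement semantics separately.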
Throughput and latency programmable optical transceiver by using DSP and FEC control.
Tanimura, Takahito; Hoshida, Takeshi; Kato, Tomoyuki; Watanabe, Shigeki; Suzuki, Makoto; Morikawa, Hiroyuki
2017-05-15
We propose and experimentally demonstrate a proof-of-concept programmable optical transceiver that enables simultaneous optimization of multiple programmable parameters (modulation format, symbol rate, power allocation, and FEC) to satisfy throughput, signal quality, and latency requirements. The proposed optical transceiver also accommodates multiple sub-channels that can transport different optical signals with different requirements. The many degrees of freedom among the parameters often make it difficult to find the optimum combination, due to an explosion in the number of combinations. The proposed optical transceiver reduces the number of combinations and finds feasible sets of programmable parameters by using constraints on the parameters combined with a precise analytical model. For precise BER prediction with a specified set of parameters, we model the sub-channel BER as a function of OSNR, modulation format, symbol rate, and the power difference between sub-channels. Next, we formulate simple constraints on the parameters and combine them with the analytical model to seek feasible sets of programmable parameters. Finally, we experimentally demonstrate the end-to-end operation of the proposed optical transceiver in an offline manner, including low-density parity-check (LDPC) FEC encoding and decoding, under a specific use case with a latency-sensitive application and 40-km transmission.
Sub-pixel analysis to support graphic security after scanning at low resolution
NASA Astrophysics Data System (ADS)
Haas, Bertrand; Cordery, Robert; Gou, Hongmei; Decker, Steve
2006-02-01
Whether in the domain of audio, video or finance, our world tends to become increasingly digital. However, for diverse reasons, the transition from analog to digital is often much extended in time, and proceeds in long steps (and sometimes never completes). One such step is the conversion of information on analog media to digital information. We focus in this paper on the conversion (scanning) of printed documents to digital images. Analog media have the advantage over digital channels that they can harbor much imperceptible information that can be used for fraud detection and forensic purposes. But this secondary information usually fails to be retrieved during the conversion step. This is particularly relevant since the Check-21 act (Check Clearing for the 21st Century act) became effective in 2004 and allows images of checks to be handled by banks like usual paper checks. We use this situation of check scanning as our primary benchmark for graphic security features after scanning. We first present a quick review of the most common graphic security features currently found on checks, with their specific purposes, qualities and disadvantages, and we demonstrate their poor survivability after scanning under the average scanning conditions expected from the Check-21 Act. We then present a novel method for measuring distances between, and rotations of, line elements in a scanned image: based on an appropriate print model, we refine direct measurements to an accuracy beyond the size of a scanning pixel, so we can then determine expected distances, periodicity, sharpness and print quality of known characters, symbols and other graphic elements in a document image. Finally we apply our method to fraud detection of documents after gray-scale scanning at 300 dpi resolution.
We show in particular that alterations on legitimate checks or copies of checks can be successfully detected by measuring with sub-pixel accuracy the irregularities inherently introduced by the illegitimate process.
Time-space modal logic for verification of bit-slice circuits
NASA Astrophysics Data System (ADS)
Hiraishi, Hiromi
1996-03-01
The major goal of this paper is to propose a new modal logic aimed at formal verification of bit-slice circuits. The new logic is called time-space modal logic, and its major feature is that it can handle two transition relations: one for time transitions and the other for space transitions. As a verification algorithm, a symbolic model checking algorithm for the new logic is presented. This could be applicable to the verification of bit-slice microprocessors of infinite bit width and 1D systolic arrays of infinite length. A simple benchmark result shows the effectiveness of the proposed approach.
Error correcting code with chip kill capability and power saving enhancement
Gara, Alan G [Mount Kisco, NY]; Chen, Dong [Croton-on-Hudson, NY]; Coteus, Paul W [Yorktown Heights, NY]; Flynn, William T [Rochester, MN]; Marcella, James A [Rochester, MN]; Takken, Todd [Brewster, NY]; Trager, Barry M [Yorktown Heights, NY]; Winograd, Shmuel [Scarsdale, NY]
2011-08-30
A method and system are disclosed for detecting memory chip failure in a computer memory system. The method comprises the steps of accessing user data from a set of user data chips, and testing the user data for errors using data from a set of system data chips. This testing is done by generating a sequence of check symbols from the user data, grouping the user data into a sequence of data symbols, and computing a specified sequence of syndromes. If all the syndromes are zero, the user data has no errors. If one of the syndromes is non-zero, then a set of discriminator expressions are computed, and used to determine whether a single or double symbol error has occurred. In the preferred embodiment, less than two full system data chips are used for testing and correcting the user data.
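The zero-syndrome test described above can be sketched with a toy binary Hamming code; the patented scheme works on multi-bit symbols over a larger field, with discriminator expressions to distinguish single from double symbol errors, so this shows only the basic mechanism:

```python
# (7,4) Hamming code: H is the parity-check matrix; a received word r has
# syndrome s = H . r (mod 2).  s == 0 means no detectable error; otherwise
# s equals the column of H at the error position (single-bit case).
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(r):
    return [sum(h * b for h, b in zip(row, r)) % 2 for row in H]

codeword = [1, 0, 1, 1, 0, 1, 0]      # a valid Hamming(7,4) codeword
assert syndrome(codeword) == [0, 0, 0]  # all syndromes zero: no errors

corrupted = codeword[:]
corrupted[4] ^= 1                     # flip the bit at position 5 (1-indexed)
print(syndrome(corrupted))            # -> [1, 0, 1], i.e. 5 read LSB-first
```

A non-zero syndrome triggers further analysis, exactly as in the disclosed method, where discriminator expressions then classify the error pattern.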
Encoders for block-circulant LDPC codes
NASA Technical Reports Server (NTRS)
Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)
2009-01-01
Methods and apparatus to encode message input symbols in accordance with an accumulate-repeat-accumulate code with repetition three or four are disclosed. Block circulant matrices are used. A first method and apparatus make use of the block-circulant structure of the parity check matrix. A second method and apparatus use block-circulant generator matrices.
Performance of Low-Density Parity-Check Coded Modulation
NASA Technical Reports Server (NTRS)
Hamkins, Jon
2010-01-01
This paper reports the simulated performance of each of the nine accumulate-repeat-4-jagged-accumulate (AR4JA) low-density parity-check (LDPC) codes [3] when used in conjunction with binary phase-shift-keying (BPSK), quadrature PSK (QPSK), 8-PSK, 16-ary amplitude PSK (16-APSK), and 32-APSK. We also report the performance under various mappings of bits to modulation symbols, 16-APSK and 32-APSK ring scalings, log-likelihood ratio (LLR) approximations, and decoder variations. One of the simple and well-performing LLR approximations can be expressed in a general equation that applies to all of the modulation types.
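As a sketch of the kind of LLR approximation discussed (the paper's general equation is not reproduced in the abstract, so the max-log form below is a standard stand-in, not necessarily the authors' formula):

```python
def llr_bpsk(y, sigma2):
    """Exact bit LLR for BPSK over AWGN (+1 maps to bit 0, -1 to bit 1)."""
    return 2.0 * y / sigma2

def llr_maxlog(y, points0, points1, sigma2):
    """Max-log LLR approximation for an arbitrary 1-D constellation:
    the log-sum-exp over symbols labeled 0 vs 1 is replaced by a max,
    i.e. only the nearest symbol in each label set contributes."""
    d0 = min((y - p) ** 2 for p in points0)
    d1 = min((y - p) ** 2 for p in points1)
    return (d1 - d0) / (2.0 * sigma2)

# For BPSK the max-log form is exact, since each label set has one symbol.
y, sigma2 = 0.8, 0.5
assert abs(llr_bpsk(y, sigma2) - llr_maxlog(y, [1.0], [-1.0], sigma2)) < 1e-12
print(llr_bpsk(y, sigma2))  # -> 3.2
```

For 16-APSK or 32-APSK the label sets simply contain more points, which is what makes a single general equation across modulation types attractive.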
NASA Astrophysics Data System (ADS)
Fehenberger, Tobias
2018-02-01
This paper studies probabilistic shaping in a multi-span wavelength-division multiplexing optical fiber system with 64-ary quadrature amplitude modulation (QAM) input. In split-step fiber simulations and via an enhanced Gaussian noise model, three figures of merit are investigated, which are signal-to-noise ratio (SNR), achievable information rate (AIR) for capacity-achieving forward error correction (FEC) with bit-metric decoding, and the information rate achieved with low-density parity-check (LDPC) FEC. For the considered system parameters and different shaped input distributions, shaping is found to decrease the SNR by 0.3 dB yet simultaneously increases the AIR by up to 0.4 bit per 4D-symbol. The information rates of LDPC-coded modulation with shaped 64QAM input are improved by up to 0.74 bit per 4D-symbol, which is larger than the shaping gain when considering AIRs. This increase is attributed to the reduced coding gap of the higher-rate code that is used for decoding the nonuniform QAM input.
Krajcsi, Attila; Lengyel, Gábor; Kojouharova, Petia
2018-01-01
HIGHLIGHTS We test whether symbolic number comparison is handled by an analog noisy system. The analog system model shows systematic biases in describing symbolic number comparison. This suggests that symbolic and non-symbolic numbers are processed by different systems. Dominant numerical cognition models suppose that both symbolic and non-symbolic numbers are processed by the Analog Number System (ANS) working according to Weber's law. It was proposed that in a number comparison task the numerical distance and size effects reflect a ratio-based performance, which is the sign of ANS activation. However, an increasing number of findings and alternative models propose that symbolic and non-symbolic numbers might be processed by different representations. Importantly, alternative explanations may offer predictions similar to the ANS prediction; therefore, former evidence usually utilizing only the goodness of fit of the ANS prediction is not sufficient to support the ANS account. To test the ANS model more rigorously, a more extensive test is offered here. Several properties of the ANS predictions for the error rates, reaction times, and diffusion model drift rates were systematically analyzed in both non-symbolic dot comparison and symbolic Indo-Arabic comparison tasks. It was consistently found that while the ANS model's prediction is relatively good for the non-symbolic dot comparison, its prediction is poorer and systematically biased for the symbolic Indo-Arabic comparison. We conclude that only non-symbolic comparison is supported by the ANS, and symbolic number comparisons are processed by another representation. PMID:29491845
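The ratio-based ANS prediction can be sketched as follows; the functional form (Gaussian representations with SD proportional to magnitude, scaled by a Weber fraction) is a common ANS formulation, not necessarily the exact model fitted in the paper:

```python
import math

def ans_error_rate(n1, n2, w=0.2):
    """Predicted error rate for comparing two numerosities under a linear
    ANS model: each number is represented as a Gaussian with SD w*n, so
    discriminability, hence performance, depends only on the ratio n1/n2.
    (w is an illustrative Weber fraction, not an estimate from the paper.)"""
    d = abs(n1 - n2) / (w * math.sqrt(n1 ** 2 + n2 ** 2))
    return 0.5 * math.erfc(d / math.sqrt(2))

# Ratio dependence: 8 vs 4 and 16 vs 8 share a 2:1 ratio, so the model
# predicts identical error rates for both pairs.
assert abs(ans_error_rate(8, 4) - ans_error_rate(16, 8)) < 1e-12
print(round(ans_error_rate(8, 4), 4))
```

The paper's test amounts to checking whether observed error rates, RTs, and drift rates follow this ratio signature; systematic deviations in the symbolic task argue against an ANS account there.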
Using LDPC Code Constraints to Aid Recovery of Symbol Timing
NASA Technical Reports Server (NTRS)
Jones, Christopher; Villasenor, John; Lee, Dong-U; Vales, Esteban
2008-01-01
A method of utilizing information available in the constraints imposed by a low-density parity-check (LDPC) code has been proposed as a means of aiding the recovery of symbol timing in the reception of a binary-phase-shift-keying (BPSK) signal representing such a code in the presence of noise, timing error, and/or Doppler shift between the transmitter and the receiver. This method and the receiver architecture in which it would be implemented belong to a class of timing-recovery methods and corresponding receiver architectures characterized as pilotless in that they do not require transmission and reception of pilot signals. Acquisition and tracking of a signal of the type described above have traditionally been performed upstream of, and independently of, decoding and have typically involved utilization of a phase-locked loop (PLL). However, the LDPC decoding process, which is iterative, provides information that can be fed back to the timing-recovery receiver circuits to improve performance significantly over that attainable in the absence of such feedback. Prior methods of coupling LDPC decoding with timing recovery had focused on the use of output code words produced as the iterations progress. In contrast, in the present method, one exploits the information available from the metrics computed for the constraint nodes of an LDPC code during the decoding process. In addition, the method involves the use of a waveform model that captures, better than do the waveform models of the prior methods, distortions introduced by receiver timing errors and transmitter/receiver motions. An LDPC code is commonly represented by use of a bipartite graph containing two sets of nodes. In the graph corresponding to an (n,k) code, the n variable nodes correspond to the code word symbols and the n-k constraint nodes represent the constraints that the code places on the variable nodes in order for them to form a valid code word.
The decoding procedure involves iterative computation of values associated with these nodes. A constraint node represents a parity-check equation using a set of variable nodes as inputs. A valid decoded code word is obtained if all parity-check equations are satisfied. After each iteration, the metrics associated with each constraint node can be evaluated to determine the status of the associated parity check. Heretofore, these metrics would normally be utilized only within the LDPC decoding process to assess whether or not variable nodes had converged to a codeword. In the present method, it is recognized that these metrics can be used to determine the accuracy of the timing estimates used in acquiring the sampled data that constitute the input to the LDPC decoder. In fact, the number of constraints that are satisfied exhibits a peak near the optimal timing estimate. Coarse timing estimation (or first-stage estimation as described below) is found via a parametric search for this peak. The present method calls for a two-stage receiver architecture illustrated in the figure. The first stage would correct large time delays and frequency offsets; the second stage would track random walks and correct residual time and frequency offsets. In the first stage, constraint-node feedback from the LDPC decoder would be employed in a search algorithm in which the searches would be performed in successively narrower windows to find the correct time delay and/or frequency offset. The second stage would include a conventional first-order PLL with a decision-aided timing-error detector that would utilize, as its decision aid, decoded symbols from the LDPC decoder. The method has been tested by means of computational simulations in cases involving various timing and frequency errors. The results of the simulations were close to those obtained in the ideal case of perfect timing in the receiver.
Theorem Proving in Intel Hardware Design
NASA Technical Reports Server (NTRS)
O'Leary, John
2009-01-01
For the past decade, a framework combining model checking (symbolic trajectory evaluation) and higher-order logic theorem proving has been in production use at Intel. Our tools and methodology have been used to formally verify execution cluster functionality (including floating-point operations) for a number of Intel products, including the Pentium® 4 and Core™ i7 processors. Hardware verification in 2009 is much more challenging than it was in 1999 - today's CPU chip designs contain many processor cores and significant firmware content. This talk will attempt to distill the lessons learned over the past ten years, discuss how they apply to today's problems, and outline some future directions.
Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers
NASA Technical Reports Server (NTRS)
Bjorner, Nikolaj
2010-01-01
The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic simulation (Pex, SAGE, Vigilante), program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of these applications. There are several new promising avenues, and the talk will touch on some of these and the challenges related to SMT solvers.
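What an SMT solver decides can be illustrated with a brute-force stand-in over a bounded integer domain. Z3 itself decides such formulas symbolically and over unbounded domains, so this is only a conceptual sketch of the satisfiability question, not of Z3's algorithms:

```python
from itertools import product

def satisfiable(formula, var_range):
    """Brute-force satisfiability over a bounded integer domain: a toy
    stand-in for an SMT solver, which decides such formulas without
    enumeration and over unbounded domains."""
    for x, y in product(var_range, repeat=2):
        if formula(x, y):
            return (x, y)
    return None

# "x + y == 10 and x > y" over linear integer arithmetic: satisfiable.
print(satisfiable(lambda x, y: x + y == 10 and x > y, range(0, 11)))  # -> (6, 4)
# "x > y and y > x": unsatisfiable under any assignment.
print(satisfiable(lambda x, y: x > y and y > x, range(0, 11)))        # -> None
```

Software analysis tools pose exactly such queries: a reachable bad state corresponds to a satisfiable formula, and the returned assignment is a concrete witness (e.g. a failing test input).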
ERIC Educational Resources Information Center
Nakajima, Taira
2012-01-01
The author demonstrates a new system useful for reflective learning. Our new system offers an environment in which one can use handwriting tablet devices to bookmark symbolic and descriptive feedback into simultaneously recorded videos. If one uses video recording and feedback check sheets in reflective learning sessions, one can…
A Methodology for Evaluating Artifacts Produced by a Formal Verification Process
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette
2011-01-01
The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.
Deconstructing spatiotemporal chaos using local symbolic dynamics.
Pethel, Shawn D; Corron, Ned J; Bollt, Erik
2007-11-23
We find that the global symbolic dynamics of a diffusively coupled map lattice is well approximated by a very small local model for weak to moderate coupling strengths. A local symbolic model is a truncation of the full symbolic model to one that considers only a single element and a few neighbors. Using interval analysis, we give rigorous results for a range of coupling strengths and different local model widths. Examples are presented of extracting a local symbolic model from data and of controlling spatiotemporal chaos.
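A minimal version of the setup, a diffusively coupled logistic map lattice symbolized by a threshold partition, can be sketched as follows (lattice size, coupling, and map parameter are assumed, not taken from the paper):

```python
# Diffusively coupled logistic map lattice; symbolize each site by whether
# its state falls left or right of 1/2.  Illustrative parameters only.
N, eps, r, steps = 8, 0.1, 4.0, 50

def step(x):
    f = [r * v * (1 - v) for v in x]
    # Diffusive coupling with periodic boundary conditions.
    return [(1 - eps) * f[i] + (eps / 2) * (f[i - 1] + f[(i + 1) % N])
            for i in range(N)]

x = [0.1 + 0.05 * i for i in range(N)]
symbols = []
for _ in range(steps):
    x = step(x)
    symbols.append("".join("1" if v >= 0.5 else "0" for v in x))

# A "local model" in the paper's sense looks only at one site and a few
# neighbors; e.g. the width-3 local words observed around site 0:
local_words = {s[-1] + s[0] + s[1] for s in symbols}
print(sorted(local_words))
```

The paper's result is that, for weak to moderate coupling, such local words suffice to approximate the full lattice's symbolic dynamics, with rigor supplied by interval analysis rather than simulation.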
The Mathematics of High School Physics
NASA Astrophysics Data System (ADS)
Kanderakis, Nikos
2016-10-01
In the seventeenth and eighteenth centuries, mathematicians and physical philosophers managed to study, via mathematics, various physical systems of the sublunar world through idealized and simplified models of these systems, constructed with the help of geometry. By analyzing these models, they were able to formulate new concepts, laws and theories of physics and then through models again, to apply these concepts and theories to new physical phenomena and check the results by means of experiment. Students' difficulties with the mathematics of high school physics are well known. Science education research attributes them to inadequately deep understanding of mathematics and mainly to inadequate understanding of the meaning of symbolic mathematical expressions. There seem to be, however, more causes of these difficulties. One of them, not independent from the previous ones, is the complex meaning of the algebraic concepts used in school physics (e.g. variables, parameters, functions), as well as the complexities added by physics itself (e.g. that equations' symbols represent magnitudes with empirical meaning and units instead of pure numbers). Another source of difficulties is that the theories and laws of physics are often applied, via mathematics, to simplified, and idealized physical models of the world and not to the world itself. This concerns not only the applications of basic theories but also all authentic end-of-the-chapter problems. Hence, students have to understand and participate in a complex interplay between physics concepts and theories, physical and mathematical models, and the real world, often without being aware that they are working with models and not directly with the real world.
Emrich, Teri E; Cohen, Joanna E; Lou, Wendy Y; L'Abbé, Mary R
2013-09-13
Concern has been raised that the coexistence of multiple front-of-pack (FOP) nutrition rating systems in a marketplace may mislead consumers into believing that a specific food with a FOP symbol is 'healthier' than foods without the symbol. Eleven summary indicator FOP systems are in use in Canada, including one non-profit-developed system, the Heart and Stroke Foundation's Health Check™, and ten manufacturer-developed systems, like Kraft's Sensible Solutions™. This study evaluated FOP's potential to mislead consumers by comparing the number of products qualifying to carry a given FOP symbol to the number of products that actually carry the symbol. The nutritional criteria for the Health Check™ and the Sensible Solutions™ systems were applied to a 2010-2011 Canadian national database of packaged food products. The proportion of foods qualifying for a given FOP system was compared to the proportion carrying the symbol using McNemar's test. Criteria were available to categorize 7503 and 3009 of the 10,487 foods in the database under Health Check™ and Sensible Solutions™, respectively. Overall, 45% of the foods belonging to a Health Check™ category qualified for the Health Check™ symbol, while only 7.5% of the foods carried the symbol. Up to 79.1% of the foods belonging to a Sensible Solutions™ category qualified for the Sensible Solutions™ symbol, while only 4.1% of the foods carried the symbol. The level of agreement between products qualifying for and carrying FOP systems was poor to moderate in the majority of food categories for both systems. More than 75% of the products in 24 of the 85 Health Check™ subcategories and 9 of 11 Sensible Solutions™ categories/subcategories qualified for their respective symbols based on their nutritional composition.
FOP systems as they are currently applied are not, in most instances, a useful guide to identifying healthier food products in the supermarket as many more products qualify for these systems than the number of products actually displaying these symbols on FOP, and the level of agreement between qualifying and carrying products is poor to moderate. The adoption of a single, standardized FOP system would assure consumers that all products meeting certain nutritional standards are designated by the symbol.
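The McNemar comparison used in the study can be sketched as follows; the cell counts are invented for illustration, since the abstract reports proportions rather than the paired table:

```python
def mcnemar_statistic(b, c):
    """McNemar's chi-squared statistic (without continuity correction) for
    paired binary outcomes: b = products that qualify for a symbol but do
    not carry it, c = products that carry it but do not qualify."""
    return (b - c) ** 2 / (b + c)

# Illustrative counts only: suppose 300 products qualify but don't carry
# the symbol, and 20 carry it without qualifying.  The large statistic
# (compared against chi-squared with 1 df) signals a real mismatch
# between qualifying and carrying, as the study reports.
stat = mcnemar_statistic(300, 20)
print(round(stat, 1))  # -> 245.0
```

McNemar's test is the right tool here because qualifying and carrying are paired observations on the same products, so only the discordant cells carry information.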
Model Checking of a Diabetes-Cancer Model
NASA Astrophysics Data System (ADS)
Gong, Haijun; Zuliani, Paolo; Clarke, Edmund M.
2011-06-01
Accumulating evidence suggests that cancer incidence might be associated with diabetes mellitus, especially Type II diabetes which is characterized by hyperinsulinaemia, hyperglycaemia, obesity, and overexpression of multiple WNT pathway components. These diabetes risk factors can activate a number of signaling pathways that are important in the development of different cancers. To systematically understand the signaling components that link diabetes and cancer risk, we have constructed a single-cell, Boolean network model by integrating the signaling pathways that are influenced by these risk factors to study insulin resistance, cancer cell proliferation and apoptosis. Then, we introduce and apply the Symbolic Model Verifier (SMV), a formal verification tool, to qualitatively study some temporal logic properties of our diabetes-cancer model. The verification results show that the diabetes risk factors might not increase cancer risk in normal cells, but they will promote cell proliferation if the cell is in a precancerous or cancerous stage characterized by losses of the tumor-suppressor proteins ARF and INK4a.
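A toy Boolean network with exhaustive state-space exploration illustrates the kind of model and query involved; the node names and update rules here are invented, and the actual diabetes-cancer network is far larger and was checked with SMV:

```python
# A minimal Boolean network in the spirit of the paper's single-cell model.
# Node names and update rules are hypothetical, for illustration only.
def update(state):
    insulin, wnt, proliferation, apoptosis = state
    return (
        insulin,                # treated as a constant risk-factor input
        wnt,                    # likewise an input
        insulin or wnt,         # proliferation driven by either signal
        not (insulin or wnt),   # apoptosis suppressed by both signals
    )

def reachable_states(initial):
    """Exhaustively explore the state space, as a symbolic model checker
    does implicitly, to answer simple temporal-logic-style queries."""
    seen, frontier = set(), [initial]
    while frontier:
        s = frontier.pop()
        if s not in seen:
            seen.add(s)
            frontier.append(update(s))
    return seen

# With both risk inputs on, every state reachable after the initial one
# has proliferation on: a trivial "AG/AF"-style property.
states = reachable_states((True, True, False, True))
print(all(s[2] for s in states - {(True, True, False, True)}))  # -> True
```

SMV answers the same kind of question on the full network symbolically, without enumerating states one by one, which is what makes checking large signaling models feasible.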
CrossCheck: an open-source web tool for high-throughput screen data analysis.
Najafov, Jamil; Najafov, Ayaz
2017-07-19
Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
NASA Technical Reports Server (NTRS)
Lee, Jeh Won
1990-01-01
The objective is the theoretical analysis and experimental verification of the dynamics and control of a two-link flexible manipulator with a flexible parallel link mechanism. Nonlinear equations of motion of the lightweight manipulator are derived by the Lagrangian method in symbolic form to better understand the structure of the dynamic model. The resulting equations of motion have a structure which is useful for reducing the number of terms calculated, checking correctness, and extending the model to higher order. A manipulator with a flexible parallel link mechanism is a constrained dynamic system whose equations are sensitive to numerical integration error. This constrained system is solved using singular value decomposition of the constraint Jacobian matrix. Elastic motion is expressed by the assumed mode method. Mode shape functions of each link are chosen using load-interfaced component mode synthesis. The discrepancies between the analytical model and the experiment are explained using a simplified and a detailed finite element model.
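The constraint-handling idea above can be illustrated in its simplest form: the feasible joint velocities of a closed-loop mechanism lie in the null space of the constraint Jacobian. The paper uses SVD of the full Jacobian; for the single 1x2 constraint row used in this toy example (not the manipulator's actual constraint), the null-space direction has a closed form.

```python
import math

# Toy version of the constraint-handling step above: one scalar constraint
# J . qdot = 0 on two joint rates. For a single 1x2 row the null-space
# direction can be written in closed form (the paper uses SVD in general).

def null_direction(J_row):
    """Unit vector spanning the null space of a 1x2 constraint row."""
    a, b = J_row
    n = math.hypot(a, b)
    return (-b / n, a / n)

J = (1.0, -1.0)                  # toy constraint: qdot1 - qdot2 = 0
d = null_direction(J)
qdot = (2.0 * d[0], 2.0 * d[1])  # a feasible joint velocity

residual = J[0] * qdot[0] + J[1] * qdot[1]
print(qdot, residual)
```

Any joint velocity built from the null-space basis satisfies the constraint exactly, which is why the SVD-based formulation resists numerical integration drift.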
[Symbolical violence in the access of disabled persons to basic health units].
de França, Inacia Sátiro Xavier; Pagliuca, Lorita Marlena Freitag; Baptista, Rosilene Santos; de França, Eurípedes Gil; Coura, Alexsandro Silva; de Souza, Jeová Alves
2010-01-01
A descriptive study which aimed to characterize the access conditions of people with disabilities (PD) to Basic Health Units (UBS). Data were collected in January 2009 in 20 UBSF, using a digital camera and a checklist based on ABNT standard NBR 9050. The results showed: access in town: no traffic lights (100%), no pedestrian lanes (100%), bumpy sidewalks (90%); access in the UBS: non-standard doors (30%), staircases without banisters (20%), floors outside the standard (75%), furniture in disagreement with the standard (20%), drinking fountains at odds with the standard (55%) making it difficult for people with disabilities to use a filter (30%), or no drinking fountains or filters (15%); telephones installed inadequately (55%); inaccessible restrooms (96%). Access of PD to the UBS is permeated by symbolic violence.
Computing Environments for Data Analysis. Part 3. Programming Environments.
1986-05-21
to understand how the existing system works and how to modify them to get the desired effect. This depends on the programming ...editor that performs automatic syntax checking for all the programming languages. 3.3 How S fits in To make efficient use of the machine (maximize the... programming , manuscript from Symbolics, Inc., 5 Cambridge Center, Cambridge, Mass. 02142. [101 DEITEL H.M., (1983) An Introduction to Operating
Cartographic symbol library considering symbol relations based on anti-aliasing graphic library
NASA Astrophysics Data System (ADS)
Mei, Yang; Li, Lin
2007-06-01
Cartographic visualization represents geographic information in map form, which enables us to retrieve useful geospatial information. In a digital environment, the cartographic symbol library is the basis of cartographic visualization and an essential component of a Geographic Information System as well. Existing cartographic symbol libraries have two flaws: one is display quality, and the other is relation adjusting. Statistical data presented in this paper indicate that aliasing is a major factor in symbol display quality on graphic display devices. Therefore, effective graphic anti-aliasing methods based on a new anti-aliasing algorithm are presented and encapsulated in an anti-aliasing graphic library in the form of a Component Object Model. Furthermore, cartographic visualization should represent feature relations by correctly adjusting symbol relations, besides displaying individual features, but current cartographic symbol libraries do not have this capability. This paper creates a cartographic symbol design model to implement symbol relation adjusting. Consequently, a cartographic symbol library based on this design model can provide cartographic visualization with relation-adjusting capability. The anti-aliasing graphic library and the cartographic symbol library are sampled, and the results show that both libraries have better efficiency and effect.
Towards a Certified Lightweight Array Bound Checker for Java Bytecode
NASA Technical Reports Server (NTRS)
Pichardie, David
2009-01-01
Dynamic array bound checks are crucial elements for the security of a Java Virtual Machine. These dynamic checks are, however, expensive, and several static analysis techniques have been proposed to eliminate explicit bounds checks. Such analyses require advanced numerical and symbolic manipulations that 1) penalize bytecode loading or dynamic compilation, and 2) complicate the trusted computing base. Following the Foundational Proof Carrying Code methodology, our goal is to provide a lightweight bytecode verifier for eliminating array bound checks that is both efficient and trustable. In this work, we define a generic relational program analysis for an imperative, stack-oriented bytecode language with procedures, arrays and global variables, and instantiate it with a relational abstract domain of polyhedra. The analysis automatically infers loop invariants and method pre-/post-conditions, and its results can be checked efficiently by a simple checker. Invariants, which can be large, can be specialized for proving a safety policy using an automatic pruning technique which reduces their size. The result of the analysis can be checked efficiently by annotating the program with parts of the invariant together with certificates of polyhedral inclusions. The resulting checker is sufficiently simple to be entirely certified within the Coq proof assistant for a simple fragment of the Java bytecode language. During the talk, we will also report on our ongoing effort to scale this approach to the full sequential JVM.
Emergent latent symbol systems in recurrent neural networks
NASA Astrophysics Data System (ADS)
Monner, Derek; Reggia, James A.
2012-12-01
Fodor and Pylyshyn [(1988). Connectionism and cognitive architecture: A critical analysis. Cognition, 28(1-2), 3-71] famously argued that neural networks cannot behave systematically short of implementing a combinatorial symbol system. A recent response from Frank et al. [(2009). Connectionist semantic systematicity. Cognition, 110(3), 358-379] claimed to have trained a neural network to behave systematically without implementing a symbol system and without any in-built predisposition towards combinatorial representations. We believe systems like theirs may in fact implement a symbol system on a deeper and more interesting level: one where the symbols are latent - not visible at the level of network structure. In order to illustrate this possibility, we demonstrate our own recurrent neural network that learns to understand sentence-level language in terms of a scene. We demonstrate our model's learned understanding by testing it on novel sentences and scenes. By paring down our model into an architecturally minimal version, we demonstrate how it supports combinatorial computation over distributed representations by using the associative memory operations of Vector Symbolic Architectures. Knowledge of the model's memory scheme gives us tools to explain its errors and construct superior future models. We show how the model designs and manipulates a latent symbol system in which the combinatorial symbols are patterns of activation distributed across the layers of a neural network, instantiating a hybrid of classical symbolic and connectionist representations that combines advantages of both.
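The Vector Symbolic Architecture operations mentioned above can be illustrated with the simplest binary variant, where binding is element-wise XOR (XOR is its own inverse, so unbinding a role vector recovers the filler). This is an illustrative toy in the VSA family, not the paper's specific memory scheme.

```python
import random

# Toy Vector Symbolic Architecture: binary "spatter code" vectors with XOR
# as the binding operation. Binding a role to a filler yields a vector
# dissimilar to both; XOR-ing with the role again recovers the filler.

random.seed(0)
DIM = 64

def rand_vec():
    return [random.randint(0, 1) for _ in range(DIM)]

def bind(a, b):
    return [x ^ y for x, y in zip(a, b)]

role, filler = rand_vec(), rand_vec()
pair = bind(role, filler)

# Unbinding with the role vector recovers the filler exactly.
print(bind(role, pair) == filler)
```

Distributed representations built this way support the combinatorial role/filler structure that the authors argue emerges latently in their network's activation patterns.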
Augmentation of machine structure to improve its diagnosability
NASA Technical Reports Server (NTRS)
Hsieh, L.
1973-01-01
Two methods of augmenting the structure of a sequential machine so that it is diagnosable are presented: checking sequences and repeated-symbol distinguishing sequences (RDS). It was found that as few as twice the number of outputs of the given machine is sufficient for constructing a state-output augmentation with an RDS. Techniques are developed for minimizing the number of states when resolving convergences and when resolving equivalent and nonreduced cycles.
2013-05-10
Is America succeeding in her goal to break free of this wheel of fortune, or is she falling and failing like every other civilization in history..."Such a wheel of life symbolized the wandering of the soul from a higher existence to a lower, from a lower to a higher, many times repeated, as on a
Learning and Transition of Symbols: Towards a Dynamical Model of a Symbolic Individual
NASA Astrophysics Data System (ADS)
Hashimoto, Takashi; Masumi, Akira
The remarkable feature of linguistic communications is the use of symbols for transmitting information and mutual understanding. Deacon (1997) pointed out that humans are symbolic species, namely, we show symbolic cognitive activities such as learning, formation, and manipulation of symbols. In research into the origin and the evolution of language, we should elucidate the emerging process of such symbolic cognitive activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
March-Leuba, S.; Jansen, J.F.; Kress, R.L.
1992-08-01
A new program package, Symbolic Manipulator Laboratory (SML), for the automatic generation of both kinematic and static manipulator models in symbolic form is presented. Critical design parameters may be identified and optimized using symbolic models, as shown in the sample application presented for the Future Armor Rearm System (FARS) arm. The computer-aided development of the symbolic models yields equations with reduced numerical complexity. Important consideration has been given to simplification of the closed-form solutions and to user-friendly operation. The main emphasis of this research is the development of a methodology, implemented in a computer program, capable of generating symbolic kinematic and static force models of manipulators. The fact that the models are obtained trigonometrically reduced is among the most significant results of this work and the most difficult to implement. Mathematica, a commercial program that allows symbolic manipulation, is used to implement the program package. SML is written such that the user can change any of the subroutines or create new ones easily. To assist the user, an on-line help facility has been written to make SML a user-friendly package. Some sample applications are presented. The design and optimization of the 5-degrees-of-freedom (DOF) FARS manipulator using SML is discussed. Finally, the kinematic and static models of two different 7-DOF manipulators are calculated symbolically.
ERIC Educational Resources Information Center
Mix, Kelly S.; Smith, Linda B.; Stockton, Jerri DaSha; Cheng, Yi-Ling; Barterian, Justin A.
2017-01-01
Two experiments examined whether concrete models support place value learning. In Experiment 1 (N = 149), 7-year-olds were trained with either a) symbols alone or b) symbols and base-10 blocks. Children in both groups showed significant growth overall, but there were specific effects favoring one training type over another. Symbols-only training…
1987-03-01
model is one in which words or numerical descriptions are used to represent an entity or process. An example of a symbolic model is a mathematical ...are the third type of model used in modeling combat attrition. Analytical models are symbolic models which use mathematical symbols and equations to...simplicity and the ease of tracing through the mathematical computations. In this section I will discuss some of the shortcomings which have been
Nuclear Structure in China 2010
NASA Astrophysics Data System (ADS)
Bai, Hong-Bo; Meng, Jie; Zhao, En-Guang; Zhou, Shan-Gui
2011-08-01
Personal view on nuclear physics research / Jie Meng -- High-spin level structures in [symbol]Zr / X. P. Cao ... [et al.] -- Constraining the symmetry energy from the neutron skin thickness of tin isotopes / Lie-Wen Chen ... [et al.] -- Wobbling rotation in atomic nuclei / Y. S. Chen and Zao-Chun Gao -- The mixing of scalar mesons and the possible nonstrange dibaryons / L. R. Dai ... [et al.] -- Net baryon productions and gluon saturation in the SPS, RHIC and LHC energy regions / Sheng-Qin Feng -- Production of heavy isotopes with collisions between two actinide nuclides / Z. Q. Feng ... [et al.] -- The projected configuration interaction method / Zao-Chun Gao and Yong-Shou Chen -- Applications of Nilsson mean-field plus extended pairing model to rare-earth nuclei / Xin Guan ... [et al.] -- Complex scaling method and the resonant states / Jian-You Guo ... [et al.] -- Probing the equation of state by deep sub-barrier fusion reactions / Hong-Jun Hao and Jun-Long Tian -- Doublet structure study in A[symbol]105 mass region / C. Y. He ... [et al.] -- Rotational bands in transfermium nuclei / X. T. He -- Shape coexistence and shape evolution [symbol]Yb / H. Hua ... [et al.] -- Multistep shell model method in the complex energy plane / R. J. Liotta -- The evolution of protoneutron stars with kaon condensate / Ang Li -- High spin structures in the [symbol]Lu nucleus / Li Cong-Bo ... [et al.] -- Nuclear stopping and equation of state / QingFeng Li and Ying Yuan -- Covariant description of the low-lying states in neutron-deficient Kr isotopes / Z. X. Li ... [et al.] -- Isospin corrections for superallowed [symbol] transitions / HaoZhao Liang ... [et al.] -- The positive-parity band structures in [symbol]Ag / C. Liu ... [et al.] -- New band structures in odd-odd [symbol]I and [symbol]I / Liu GongYe ... [et al.] -- The sd-pair shell model and interacting boson model / Yan-An Luo ... [et al.] 
-- Cross-section distributions of fragments in the calcium isotopes projectile fragmentation at the intermediate energy / C. W. Ma ... [et al.].Systematic study of spin assignment and dynamic moment of inertia of high-j intruder band in [symbol]In / K. Y. Ma ... [et al.] -- Signals of diproton emission from the three-body breakup channel of [symbol]Al and [symbol]Mg / Ma Yu-Gang ... [et al.] -- Uncertainties of Th/Eu and Th/Hf chronometers from nucleus masses / Z. M. Niu ... [et al.] -- The chiral doublet bands with [symbol] configuration in A[symbol]100 mass region / B. Qi ... [et al.] -- [symbol] formation probabilities in nuclei and pairing collectivity / Chong Qi -- A theoretical prospective on triggered gamma emission from [symbol]Hf[symbol] isomer / ShuiFa Shen ... [et al.] -- Study of nuclear giant resonances using a Fermi-liquid method / Bao-Xi Sun -- Rotational bands in doubly odd [symbol]Sb / D. P. Sun ... [et al.] -- The study of the neutron N=90 nuclei / W. X. Teng ... [et al.] -- Dynamical modes and mechanisms in ternary reaction of [symbol]Au+[symbol]Au / Jun-Long Tian ... [et al.] -- Dynamical study of X(3872) as a D[symbol] molecular state / B. Wang ... [et al.] -- Super-heavy stability island with a semi-empirical nuclear mass formula / N. Wang ... [et al.] -- Pseudospin partner bands in [symbol]Sb / S. Y. Wang ... [et al.] -- Study of elastic resonance scattering at CIAE / Y. B. Wang ... [et al.] -- Systematic study of survival probability of excited superheavy nuclei / C. J. Xia ... [et al.] -- Angular momentum projection of the Nilsson mean-field plus nearest-orbit pairing interaction model / Ming-Xia Xie ... [et al.] -- Possible shape coexistence for [symbol]Sm in a reflection-asymmetric relativistic mean-field approach / W. Zhang ... [et al.] -- Nuclear pairing reduction due to rotation and blocking / Zhen-Hua Zhang -- Nucleon pair approximation of the shell model: a review and perspective / Y. M. Zhao ... [et al.] 
-- Band structures in doubly odd [symbol]I / Y. Zheng ... [et al.] -- Lifetimes of high spin states in [symbol]Ag / Y. Zheng ... [et al.] -- Effect of tensor interaction on the shell structure of superheavy nuclei / Xian-Rong Zhou ... [et al.].
NASA Astrophysics Data System (ADS)
Ryan, D. P.; Roth, G. S.
1982-04-01
Complete documentation of the 15 programs and 11 data files of the EPA Atomic Absorption Instrument Automation System is presented. The system incorporates the following major features: (1) multipoint calibration using first, second, or third degree regression or linear interpolation, (2) timely quality control assessments for spiked samples, duplicates, laboratory control standards, reagent blanks, and instrument check standards, (3) reagent blank subtraction, and (4) plotting of calibration curves and raw data peaks. The programs of this system are written in Data General Extended BASIC, Revision 4.3, as enhanced for multi-user, real-time data acquisition. They run in a Data General Nova 840 minicomputer under the operating system RDOS, Revision 6.2. There is a functional description, a symbol definitions table, a functional flowchart, a program listing, and a symbol cross reference table for each program. The structure of every data file is also detailed.
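The multipoint-calibration feature described above reduces, in the first-degree case, to an ordinary least-squares fit of absorbance against concentration, which is then inverted to read an unknown sample. A minimal sketch with made-up standards (not the EPA system's actual data or code, which was Data General Extended BASIC):

```python
# Sketch of first-degree multipoint calibration: fit absorbance vs.
# concentration by ordinary least squares, then invert the calibration
# curve to read a sample concentration. Data values are made up.

def linear_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

conc = [0.0, 1.0, 2.0, 4.0]        # standards (mg/L)
absorb = [0.01, 0.21, 0.41, 0.81]  # measured absorbance

m, b = linear_fit(conc, absorb)
sample_conc = (0.51 - b) / m       # invert the curve for an unknown
print(round(m, 3), round(sample_conc, 2))
```

The system's second- and third-degree options generalize this to polynomial regression; the quality-control checks then compare recovered values of spikes and standards against tolerances.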
Frame Synchronization Without Attached Sync Markers
NASA Technical Reports Server (NTRS)
Hamkins, Jon
2011-01-01
We describe a method to synchronize codeword frames without making use of attached synchronization markers (ASMs). Instead, the synchronizer identifies the code structure present in the received symbols by operating the decoder for a handful of iterations at each possible symbol offset and forming an appropriate metric. This method is computationally more complex and doesn't perform as well as frame synchronizers that utilize an ASM; nevertheless, the new synchronizer acquires frame synchronization in about two seconds when using a 600 kbps software decoder, and would take about 15 milliseconds on prototype hardware. It also eliminates the need for the ASMs, which is an attractive feature for short uplink codes whose coding gain would be diminished by the overhead of ASM bits. The lack of ASMs also would simplify clock distribution for the AR4JA low-density parity-check (LDPC) codes and adds a small amount to the coding gain as well (up to 0.2 dB).
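The offset search described above can be sketched with a trivial stand-in for the decoder metric. Here the "code structure" is a length-3 repetition code and the metric simply counts valid aligned codewords; the real system instead runs a few LDPC decoder iterations per offset. All names and data are illustrative.

```python
# Toy frame synchronizer: slide over the received symbol stream, score each
# candidate offset with a code-structure metric (agreement with a known toy
# codeword set), and pick the best offset.

CODEWORDS = {(0, 0, 0), (1, 1, 1)}   # trivial length-3 repetition code

def score(symbols, offset):
    """How many aligned 3-symbol blocks are valid codewords."""
    blocks = [tuple(symbols[i:i + 3])
              for i in range(offset, len(symbols) - 2, 3)]
    return sum(b in CODEWORDS for b in blocks)

def synchronize(symbols):
    """Return the offset whose alignment best matches the code structure."""
    return max(range(3), key=lambda off: score(symbols, off))

# Stream with 2 junk symbols before codeword boundaries begin.
rx = [1, 0] + [0, 0, 0, 1, 1, 1, 0, 0, 0]
print(synchronize(rx))
```

With noisy symbols the hard membership test would be replaced by a soft decoder metric, but the search-over-offsets structure is the same.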
Symbolic Play in the Treatment of Autism in Children.
ERIC Educational Resources Information Center
Voyat, Gilbert
1982-01-01
Explores the role of symbolic play in the cognitive and psychic development of the normal child and describes the autistic child. Reviews a model treatment program for autism developed at the City College of New York, discussing the therapeutic role of symbolic play in that model. (Author/MJL)
Selections from Kuang-Ming JIH-PAO (Source Span: 17 May - 26 June 1961), Number 8 Communist China.
1961-08-31
to have a feeling of being unaccustomed to a certain new method, much like the feeling they have towards the use of phonetic symbols, Romanization or...seems to me that there is a certain unanimity among those who advocate the checking of Chinese words through phonetic sounds. The differences are...children’s mental development. We could not possibly ask children in kindergarten to learn algebra because natural maturity is also important. We have
1998-02-24
step, the earthquake data set must be re-checked in order for the change to take effect. Once changed the new symbol stays changed until the session is...standard methods for discriminating between earthquakes and ripple fired explosions to a new geologic setting (northwest Morocco) in an effort to examine the...Tectonophysics, 217: 217-226. Shapira, A., Avni, R. & Nur, A., 1993. Note: A New Estimate For The Epicenter Of The Jericho Earthquake Of 11 July 1927. Israel
Symbol Recognition Using a Concept Lattice of Graphical Patterns
NASA Astrophysics Data System (ADS)
Rusiñol, Marçal; Bertet, Karell; Ogier, Jean-Marc; Lladós, Josep
In this paper we propose a new approach to recognizing symbols by the use of a concept lattice. We propose to build a concept lattice in terms of graphical patterns. Each model symbol is decomposed into a set of composing graphical patterns taken as primitives. Each of these primitives is described by boundary moment invariants. The obtained concept lattice relates which symbolic patterns compose a given graphical symbol. A Hasse diagram is derived from the context and is used to recognize symbols affected by noise. We present some preliminary results over a variation of the dataset of symbols from the GREC 2005 symbol recognition contest.
NASA Astrophysics Data System (ADS)
Degaudenzi, Riccardo; Vanghi, Vieri
1994-02-01
An all-digital Trellis-Coded 8PSK (TC-8PSK) demodulator well suited for VLSI implementation, including maximum likelihood estimation decision-directed (MLE-DD) carrier phase and clock timing recovery, is introduced and analyzed. By simply removing the trellis decoder, the demodulator can efficiently cope with uncoded 8PSK signals. The proposed MLE-DD synchronization algorithm requires one sample per symbol for the phase loop and two samples per symbol for the timing loop. The joint phase and timing discriminator characteristics are analytically derived, and the numerical results are checked by means of computer simulations. An approximate expression for the steady-state carrier phase and clock timing mean square error has been derived and successfully checked against simulation findings. Synchronizer deviation from the Cramer-Rao bound is also discussed. Mean acquisition time for the digital synchronizer has also been computed and checked using the Monte Carlo simulation technique. Finally, TC-8PSK digital demodulator performance in terms of bit error rate and mean time to lose lock, including digital interpolators and synchronization loops, is presented.
NASA Astrophysics Data System (ADS)
Maravall, Darío; de Lope, Javier; Domínguez, Raúl
In multi-agent systems, the study of language and communication is an active field of research. In this paper we present the application of evolutionary strategies to the self-emergence of a common lexicon in a population of agents. By modeling the vocabulary or lexicon of each agent as an association matrix or look-up table that maps meanings (i.e., the objects encountered by the agents, or the states of the environment itself) into symbols or signals, we check whether it is possible for the population to converge in an autonomous, decentralized way to a common lexicon, so that the communication efficiency of the entire population is optimal. We have conducted several experiments, from the simplest case of a 2×2 association matrix (i.e., two meanings and two symbols) to a 3×3 lexicon, and in both cases we have attained convergence to the optimal communication system by means of evolutionary strategies. To analyze the convergence of the population of agents, we define the population's consensus as the state in which all the agents (i.e., 100% of the population) share the same association matrix or lexicon. As a general conclusion, we have shown that evolutionary strategies are powerful enough optimizers to guarantee convergence to lexicon consensus in a population of autonomous agents.
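The representation used above is easy to make concrete. This sketch shows a 2×2 association matrix, a speak rule (pick the symbol most associated with a meaning), and the consensus test; the evolutionary-strategy search itself is omitted, and all values are illustrative.

```python
# Minimal sketch of the lexicon setup: each agent's lexicon is a 2x2
# association matrix mapping meanings (rows) to symbols (columns); the
# population has reached consensus when every agent shares the same matrix.

def speak(lexicon, meaning):
    """Pick the symbol with the highest association for this meaning."""
    row = lexicon[meaning]
    return row.index(max(row))

def consensus(population):
    """True when 100% of the agents share the same lexicon."""
    return all(lex == population[0] for lex in population)

optimal = [[1, 0],   # meaning 0 -> symbol 0
           [0, 1]]   # meaning 1 -> symbol 1

agents = [optimal, optimal, optimal]
print(consensus(agents), speak(agents[0], 1))
```

An evolutionary strategy would mutate and select these matrices until the consensus predicate holds and communication success is maximal.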
Double symbolic joint entropy in nonlinear dynamic complexity analysis
NASA Astrophysics Data System (ADS)
Yao, Wenpo; Wang, Jun
2017-07-01
Symbolization, the basis of symbolic dynamic analysis, can be classified into global static and local dynamic approaches, which we combine via joint entropy for nonlinear dynamic complexity analysis. Two global static methods, the symbolic transformations of Wessel N. symbolic entropy and base-scale entropy, and two local dynamic ones, the symbolizations of permutation and differential entropy, constitute four double symbolic joint entropies that detect complexity accurately in chaotic models, the logistic and Henon map series. In nonlinear dynamical analysis of different kinds of heart rate variability, heartbeats of the healthy young have higher complexity than those of the healthy elderly, and congestive heart failure (CHF) patients have the lowest joint entropy values in their heartbeats. Each individual symbolic entropy is improved by double symbolic joint entropy, among which the combination of base-scale and differential symbolizations yields the best complexity analysis. Test results prove that double symbolic joint entropy is feasible for nonlinear dynamic complexity analysis.
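The combination described above can be sketched end to end: symbolize a series with one global static scheme and one local dynamic scheme, then compute the Shannon entropy of the paired symbol sequence. Both symbolization schemes here are deliberately simplified stand-ins (mean threshold and first-difference sign), not the paper's exact transformations.

```python
import math
from collections import Counter

# Double symbolic joint entropy, simplified: two symbolizations of the same
# series are paired, and the Shannon entropy of the pair sequence is computed.

def static_symbols(xs):
    """Global static scheme: above/below the series mean."""
    m = sum(xs) / len(xs)
    return [int(x > m) for x in xs]

def dynamic_symbols(xs):
    """Local dynamic scheme: sign of the first difference."""
    return [int(b > a) for a, b in zip(xs, xs[1:])]

def joint_entropy(a, b):
    """Shannon entropy (bits) of the paired symbol sequence."""
    pairs = list(zip(a, b))
    counts = Counter(pairs)
    n = len(pairs)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

series = [1.0, 3.0, 2.0, 5.0, 4.0, 6.0]
H = joint_entropy(static_symbols(series)[1:], dynamic_symbols(series))
print(round(H, 3))
```

Because the pair sequence captures agreement between the two views of the dynamics, its entropy can separate regimes that either symbolization alone would conflate.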
Maintaining the balance: older adults with chronic health problems manage life in the community.
Jacelon, Cynthia S
2010-01-01
The purpose of this research was to identify themes in the daily lives of community-dwelling older adults with chronic health problems. Qualitative descriptive methods based on symbolic interaction were used. Data were generated through unstructured interviews, participant diaries, and researcher logs. Participants were interviewed twice and kept diaries in between. Measures to enhance trustworthiness included bracketing, multiple data sources, repeated interviews, prolonged engagement, an audit trail, participant checking, and consultation with an expert qualitative researcher. Ten older adults 75-98 years of age living in their own homes with at least one self-reported chronic health problem participated in the research. Participants' health problems varied, and they developed strategies to maintain balance in activity, attitude, autonomy, health, and relationships. This research provides a new perspective on living with chronic illness, and the model may provide a framework for rehabilitation nurses who work with older adults.
Toropova, Alla P; Toropov, Andrey A; Rallo, Robert; Leszczynska, Danuta; Leszczynski, Jerzy
2015-02-01
The Monte Carlo technique has been used to build up quantitative structure-activity relationships (QSARs) for prediction of dark cytotoxicity and photo-induced cytotoxicity of metal oxide nanoparticles to the bacteria Escherichia coli (minus logarithm of the lethal concentration for 50% of bacteria, pLC50, LC50 in mol/L). The representation of the nanoparticles includes (i) in the case of dark cytotoxicity, a simplified molecular input-line entry system (SMILES) string, and (ii) in the case of photo-induced cytotoxicity, a SMILES string plus the symbol '^'. The predictability of the approach is checked with six random distributions of the available data into visible training and calibration sets and an invisible validation set. The statistical characteristics of these models are correlation coefficients of 0.90-0.94 (training set) and 0.73-0.98 (validation set). Copyright © 2014 Elsevier Inc. All rights reserved.
Circular blurred shape model for multiclass symbol recognition.
Escalera, Sergio; Fornés, Alicia; Pujol, Oriol; Lladós, Josep; Radeva, Petia
2011-04-01
In this paper, we propose a circular blurred shape model descriptor to deal with the problem of symbol detection and classification as a particular case of object recognition. The feature extraction is performed by capturing the spatial arrangement of significant object characteristics in a correlogram structure. The shape information from objects is shared among correlogram regions, where a prior blurring degree defines the level of distortion allowed in the symbol, making the descriptor tolerant to irregular deformations. Moreover, the descriptor is rotation invariant by definition. We validate the effectiveness of the proposed descriptor in both the multiclass symbol recognition and symbol detection domains. In order to perform the symbol detection, the descriptors are learned using a cascade of classifiers. In the case of multiclass categorization, the new feature space is learned using a set of binary classifiers which are embedded in an error-correcting output code design. The results over four symbol data sets show the significant improvements of the proposed descriptor compared to the state-of-the-art descriptors. In particular, the results are even more significant in those cases where the symbols suffer from elastic deformations.
Environment Modeling Using Runtime Values for JPF-Android
NASA Technical Reports Server (NTRS)
van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem
2015-01-01
Software applications are developed to be executed in a specific environment. This environment includes external native libraries that add functionality to the application and drivers that fire the application execution. For testing and verification, the environment of an application is abstracted using simplified models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of default values. Entry points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly, which improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.
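The collect-then-replay idea above can be sketched with a simple instrumentation wrapper. All names here (`record`, `make_stub`, `get_device_id`) are hypothetical illustrations, not the JPF-Android API; the point is only that a stub replaying a logged value beats one returning a bare default.

```python
# Sketch of runtime value collection: instrument a method so its arguments
# and return value are logged, then reuse the logged values as stub return
# values during verification.

collected = {}

def record(fn):
    """Instrumentation wrapper: log (args, result) for each call."""
    def wrapper(*args):
        result = fn(*args)
        collected.setdefault(fn.__name__, []).append((args, result))
        return result
    return wrapper

@record
def get_device_id():          # stands in for a native library call
    return "emulator-5554"

get_device_id()               # one instrumented run populates the log

def make_stub(name, default=None):
    """Stub that replays a collected value instead of a bare default."""
    values = collected.get(name)
    return (lambda *a: values[-1][1]) if values else (lambda *a: default)

stub = make_stub("get_device_id")
print(stub())
```

During model checking, the replaying stub lets the application take branches that a default-returning empty stub would never reach.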
Parameters and symbols for use in nuclear magnetic resonance (IUPAC recommendations 1997).
Harris, R K; Kowalewski, J; Cabral de Menezes, S
1998-01-01
NMR is now frequently the technique of choice for the determination of chemical structure in solution. Its uses also span structure in solids and mobility at the molecular level in all phases. The research literature in the subject is vast and ever-increasing. Unfortunately, many articles do not contain sufficient information for experiments to be repeated elsewhere, and there are many variations in the usage of symbols for the same physical quantity. It is the aim of the present recommendations to provide simple check-lists that will enable such problems to be minimised in a way that is consistent with general IUPAC formulation. The area of medical NMR and imaging is not specifically addressed in these recommendations, which are principally aimed at the mainstream use of NMR by chemists (of all sub-disciplines) and by many physicists, biologists, materials scientists and geologists etc. working with NMR. The document presents recommended notation for use in journal publications involving a significant contribution of nuclear magnetic resonance (NMR) spectroscopy. The recommendations are in two parts: (1) Experimental parameters which should be listed so that the work in question can be repeated elsewhere. (2) A list of symbols (using Roman or Greek characters) to be used for quantities relevant to NMR.
Execution environment for intelligent real-time control systems
NASA Technical Reports Server (NTRS)
Sztipanovits, Janos
1987-01-01
Modern telerobot control technology requires the integration of symbolic and non-symbolic programming techniques, different models of parallel computations, and various programming paradigms. The Multigraph Architecture, which has been developed for the implementation of intelligent real-time control systems, is described. The layered architecture includes specific computational models, an integrated execution environment, and various high-level tools. A special feature of the architecture is the tight coupling between the symbolic and non-symbolic computations. It supports not only a data interface, but also the integration of the control structures in a parallel computing environment.
NASA Astrophysics Data System (ADS)
Kuvich, Gary
2004-08-01
Vision is only a part of a system that converts visual information into knowledge structures. These structures drive the vision process, resolving ambiguity and uncertainty via feedback, and provide image understanding, which is an interpretation of visual information in terms of these knowledge models. These mechanisms provide reliable recognition when an object is occluded or cannot be recognized as a whole. It is hard to split the entire system apart, and reliable solutions to target recognition problems are possible only within the solution of a more generic Image Understanding Problem. The brain reduces informational and computational complexity using implicit symbolic coding of features, hierarchical compression, and selective processing of visual information. Biologically inspired Network-Symbolic representation, in which both systematic structural/logical methods and neural/statistical methods are parts of a single mechanism, is the most feasible for such models. It converts visual information into relational Network-Symbolic structures, avoiding artificial precise computations of 3-dimensional models. Network-Symbolic Transformations derive abstract structures, which allows for invariant recognition of an object as an exemplar of a class. Active vision helps create consistent models. Attention, separation of figure from ground, and perceptual grouping are special kinds of network-symbolic transformations. Such Image/Video Understanding Systems will reliably recognize targets.
Automated reverse engineering of nonlinear dynamical systems
Bongard, Josh; Lipson, Hod
2007-01-01
Complex nonlinear dynamics arise in many fields of science and engineering, but uncovering the underlying differential equations directly from observations poses a challenging task. The ability to symbolically model complex networked systems is key to understanding them, an open problem in many disciplines. Here we introduce for the first time a method that can automatically generate symbolic equations for a nonlinear coupled dynamical system directly from time series data. This method is applicable to any system that can be described using sets of ordinary nonlinear differential equations, and assumes that the (possibly noisy) time series of all variables are observable. Previous automated symbolic modeling approaches of coupled physical systems produced linear models or required a nonlinear model to be provided manually. The advance presented here is made possible by allowing the method to model each (possibly coupled) variable separately, intelligently perturbing and destabilizing the system to extract its less observable characteristics, and automatically simplifying the equations during modeling. We demonstrate this method on four simulated and two real systems spanning mechanics, ecology, and systems biology. Unlike numerical models, symbolic models have explanatory value, suggesting that automated “reverse engineering” approaches for model-free symbolic nonlinear system identification may play an increasing role in our ability to understand progressively more complex systems in the future. PMID:17553966
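Bongard and Lipson's method uses evolutionary search with active perturbation of the system. The core idea of recovering symbolic structure from time series can be seen in a much simpler form: regress observed derivatives onto a library of candidate symbolic terms. The sketch below uses an assumed logistic toy system and plain least squares; it is not the authors' algorithm.

```python
import random

# True hidden dynamics (unknown to the "discovery" procedure):
# dx/dt = 2.0*x - 1.0*x**2   (logistic growth)
def true_deriv(x):
    return 2.0 * x - 1.0 * x ** 2

# Sampled states and (here, noise-free) derivative observations.
random.seed(0)
xs = [random.uniform(0.1, 3.0) for _ in range(50)]
dxs = [true_deriv(x) for x in xs]

# Candidate term library: the method searches for a combination of
# these terms that explains the observed derivatives.
f1 = [x for x in xs]          # candidate term: x
f2 = [x ** 2 for x in xs]     # candidate term: x^2

# Least squares over the two candidates via the 2x2 normal equations.
a11 = sum(v * v for v in f1)
a12 = sum(u * v for u, v in zip(f1, f2))
a22 = sum(v * v for v in f2)
b1 = sum(u * v for u, v in zip(f1, dxs))
b2 = sum(u * v for u, v in zip(f2, dxs))
det = a11 * a22 - a12 * a12
c1 = (a22 * b1 - a12 * b2) / det   # recovered coefficient of x
c2 = (a11 * b2 - a12 * b1) / det   # recovered coefficient of x^2

model = f"dx/dt = {c1:+.3f}*x {c2:+.3f}*x^2"
```

With exact derivative data the fit recovers the hidden equation's coefficients; noisy data and a larger library would call for sparse regression or, as in the paper, evolutionary search.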
Product code optimization for determinate state LDPC decoding in robust image transmission.
Thomos, Nikolaos; Boulgouris, Nikolaos V; Strintzis, Michael G
2006-08-01
We propose a novel scheme for error-resilient image transmission. The proposed scheme employs a product coder consisting of low-density parity check (LDPC) codes and Reed-Solomon codes in order to deal effectively with bit errors. The efficiency of the proposed scheme is based on the exploitation of determinate symbols in Tanner graph decoding of LDPC codes and a novel product code optimization technique based on error estimation. Experimental evaluation demonstrates the superiority of the proposed system in comparison to recent state-of-the-art techniques for image transmission.
Symbol interval optimization for molecular communication with drift.
Kim, Na-Rae; Eckford, Andrew W; Chae, Chan-Byoung
2014-09-01
In this paper, we propose a symbol interval optimization algorithm for molecular communication with drift. Proper symbol intervals are important in practical communication systems since information needs to be sent as fast as possible with low error rates. There is a trade-off, however, between symbol intervals and inter-symbol interference (ISI) from Brownian motion. Thus, we find proper symbol interval values that account for the ISI inside two kinds of blood vessels, and we also propose an ISI-free scheme for strong-drift models. Finally, isomer-based molecule shift keying (IMoSK) is applied to calculate achievable data transmission rates (hereafter, achievable rates). Normalized achievable rates are also obtained and compared for the one-symbol-ISI and ISI-free systems.
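The trade-off can be made concrete with the standard first-arrival model for Brownian motion with positive drift, whose hitting time follows an inverse Gaussian distribution. The sketch below is a generic illustration with made-up channel parameters, not the paper's algorithm: shorter intervals raise the symbol rate but leave more molecules arriving in the next slot (ISI).

```python
from math import erf, exp, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def arrival_cdf(t, mu, lam):
    """Inverse-Gaussian CDF: first-arrival time of Brownian motion with
    positive drift (mean mu, shape lam)."""
    if t <= 0:
        return 0.0
    s = sqrt(lam / t)
    return phi(s * (t / mu - 1.0)) + exp(2.0 * lam / mu) * phi(-s * (t / mu + 1.0))

# Illustrative (assumed) channel: distance d, drift v, diffusion D.
d, v, D = 10e-6, 1e-6, 1e-10          # metres, m/s, m^2/s
mu = d / v                            # mean arrival time
lam = d * d / (2.0 * D)               # shape parameter

# Sweep the symbol interval T: fraction arriving in its own slot vs
# fraction leaking into the following slot (inter-symbol interference).
for T in (5.0, 10.0, 20.0, 40.0):
    on_time = arrival_cdf(T, mu, lam)
    isi = arrival_cdf(2 * T, mu, lam) - on_time
    print(f"T={T:5.1f}s  arrived in slot: {on_time:.3f}  leaks to next: {isi:.3f}")
```

Choosing T then amounts to balancing the symbol rate 1/T against the leaked fraction, which is the optimization the paper carries out for realistic vessel models.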
Heuristic automation for decluttering tactical displays.
St John, Mark; Smallman, Harvey S; Manes, Daniel I; Feher, Bela A; Morrison, Jeffrey G
2005-01-01
Tactical displays can quickly become cluttered with large numbers of symbols that can compromise effective monitoring. Here, we studied how heuristic automation can aid users by intelligently "decluttering" the display. In a realistic simulated naval air defense task, 27 experienced U.S. Navy users monitored a cluttered airspace and executed defensive responses against significant threats. An algorithm continuously evaluated aircraft for their levels of threat and decluttered the less threatening ones by dimming their symbols. Users appropriately distrusted and spot-checked the automation's assessments, and decluttering had very little effect on which aircraft were judged as significantly threatening. Nonetheless, decluttering improved the timeliness of responses to threatening aircraft by 25% as compared with a baseline display with no decluttering; it was especially beneficial for threats in more peripheral locations, and 25 of 27 participants preferred decluttering. Heuristic automation, when properly designed to guide users' attention by decluttering less important objects, may prove valuable in many cluttered monitoring situations, including air traffic management, crisis team management, and tactical situation awareness in general.
NASA Astrophysics Data System (ADS)
He, Jing; Wen, Xuejie; Chen, Ming; Chen, Lin
2015-09-01
In this paper, a Golay complementary training sequence (TS)-based symbol synchronization scheme is proposed and experimentally demonstrated in a multiband orthogonal frequency division multiplexing (MB-OFDM) ultra-wideband over fiber (UWBoF) system with a variable rate low-density parity-check (LDPC) code. Meanwhile, the coding gain and spectral efficiency in the variable rate LDPC-coded MB-OFDM UWBoF system are investigated. By utilizing the non-periodic auto-correlation property of the Golay complementary pair, the start point of the LDPC-coded MB-OFDM UWB signal can be estimated accurately. After 100 km standard single-mode fiber (SSMF) transmission, at a bit error rate of 1×10⁻³, the experimental results show that the short block length 64QAM-LDPC coding provides a coding gain of 4.5 dB, 3.8 dB and 2.9 dB for a code rate of 62.5%, 75% and 87.5%, respectively.
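The autocorrelation property that makes Golay pairs attractive for synchronization is easy to demonstrate: the aperiodic autocorrelations of the two sequences cancel at every non-zero lag, leaving a single sharp peak. A minimal sketch using the standard concatenation construction (not the paper's specific sequences):

```python
def golay_pair(m):
    """Generate a Golay complementary pair of length 2**m by the
    standard concatenation recursion: (a, b) -> (a + b, a + (-b))."""
    a, b = [1], [1]
    for _ in range(m):
        a, b = a + b, a + [-x for x in b]
    return a, b

def acorr(seq, k):
    """Aperiodic autocorrelation of seq at non-negative lag k."""
    return sum(seq[i] * seq[i + k] for i in range(len(seq) - k))

a, b = golay_pair(5)                  # a pair of length-32 sequences
N = len(a)

# Defining property: the summed autocorrelation is 2N at lag 0 and
# exactly zero at every other lag, so a correlator against the pair
# yields an unambiguous timing peak for frame/symbol synchronization.
peaks = [acorr(a, k) + acorr(b, k) for k in range(N)]
```

A receiver correlating against both halves of the training sequence and summing the results therefore sees one clean peak at the true start point, even before equalization.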
There-apy: The Use of Task, Imagery, and Symbolism To Connect the Inner and Outer Worlds.
ERIC Educational Resources Information Center
Eisenstein-Naveh, A. Rosa
2001-01-01
Presents a model of therapy called there-apy, which weaves together the use of task, symbolism, and imagery into an ongoing process. Concrete tasks take on symbolic meaning, and symbolism gets actualized through achieving concrete tasks. There-apy connects the individual's outside and inside worlds and often involves the partner or family in the…
A symbolic/subsymbolic interface protocol for cognitive modeling
Simen, Patrick; Polk, Thad
2009-01-01
Researchers studying complex cognition have grown increasingly interested in mapping symbolic cognitive architectures onto subsymbolic brain models. Such a mapping seems essential for understanding cognition under all but the most extreme viewpoints (namely, that cognition consists exclusively of digitally implemented rules; or instead, involves no rules whatsoever). Making this mapping reduces to specifying an interface between symbolic and subsymbolic descriptions of brain activity. To that end, we propose parameterization techniques for building cognitive models as programmable, structured, recurrent neural networks. Feedback strength in these models determines whether their components implement classically subsymbolic neural network functions (e.g., pattern recognition), or instead, logical rules and digital memory. These techniques support the implementation of limited production systems. Though inherently sequential and symbolic, these neural production systems can exploit principles of parallel, analog processing from decision-making models in psychology and neuroscience to explain the effects of brain damage on problem solving behavior. PMID:20711520
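The paper's central claim, that feedback strength decides whether a unit behaves subsymbolically (analog) or symbolically (digital memory), can be illustrated with a single self-exciting unit. This toy sketch is an assumption-laden illustration of the principle, not the authors' network parameterization.

```python
from math import tanh

def settle(w, x0, external=0.0, steps=200):
    """Iterate a single self-exciting unit x <- tanh(w*x + external)
    until it settles; w is the feedback (self-connection) strength."""
    x = x0
    for _ in range(steps):
        x = tanh(w * x + external)
    return x

# Strong feedback (w = 5): the unit is bistable. It latches the sign of
# a transient input and holds it after the input is removed, acting as
# a one-bit digital memory -- a building block for "symbolic" rules.
latched_hi = settle(5.0, settle(5.0, 0.0, external=+2.0))
latched_lo = settle(5.0, settle(5.0, 0.0, external=-2.0))

# Weak feedback (w = 0.5): a single fixed point at 0. The unit decays
# once the input is gone, i.e. it behaves as a conventional analog,
# "subsymbolic" filter with no persistent state.
decayed = settle(0.5, settle(0.5, 0.0, external=+2.0))
```

The same units, with feedback as a tunable parameter, can thus sit at either end of the symbolic/subsymbolic interface the paper proposes.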
Tiled vector data model for the geographical features of symbolized maps.
Li, Lin; Hu, Wei; Zhu, Haihong; Li, You; Zhang, Hang
2017-01-01
Electronic maps (E-maps) provide people with convenience in real-world space. Although web map services can display maps on screens, a more important function is their ability to access geographical features. An E-map that is based on raster tiles is inferior to vector tiles in terms of interactive ability because vector maps provide a convenient and effective method to access and manipulate web map features. However, the critical issue regarding rendering tiled vector maps is that geographical features that are rendered in the form of map symbols via vector tiles may cause visual discontinuities, such as graphic conflicts and losses of data around the borders of tiles, which likely represent the main obstacles to exploring vector map tiles on the web. This paper proposes a tiled vector data model for geographical features in symbolized maps that considers the relationships among geographical features, symbol representations and map renderings. This model presents a method to tailor geographical features in terms of map symbols and 'addition' (join) operations on the following two levels: geographical features and map features. Thus, these maps can resolve the visual discontinuity problem based on the proposed model without weakening the interactivity of vector maps. The proposed model is validated by two map data sets, and the results demonstrate that the rendered (symbolized) web maps present smooth visual continuity.
Some Thoughts (and Afterthoughts) on Context, Interpretation, and Organization Theory.
1982-02-01
viewpoint can be cast as symbolic interactionism (Blumer, 1969) but it is a viewpoint with many variations (Rock, 1979). I seek my own intellectual models in... Symbolic interactionism: Perspective and method. Englewood Cliffs, New Jersey: Prentice-Hall, 1969. Burrell, G. and Morgan, G. Sociological paradigms and... making of symbolic interactionism. London: Macmillan, 1979. Salancik, G.R. and Pfeffer, J. An examination of need satisfaction models of job attitudes
Nakatsui, M; Horimoto, K; Lemaire, F; Ürgüplü, A; Sedoglavic, A; Boulier, F
2011-09-01
Recent remarkable advances in computer performance have enabled us to estimate parameter values by the huge power of numerical computation, the so-called 'Brute force', resulting in the high-speed simultaneous estimation of a large number of parameter values. However, these advancements have not been fully utilised to improve the accuracy of parameter estimation. Here the authors review a novel method for parameter estimation using symbolic computation power, 'Bruno force', named after Bruno Buchberger, who introduced Gröbner bases. In the method, objective functions combining the symbolic computation techniques are formulated. First, the authors utilise a symbolic computation technique, differential elimination, which symbolically reduces the system of differential equations in a given model to an equivalent system. Second, since the equivalent system is frequently composed of large equations, the system is further simplified by another symbolic computation. The performance of the authors' method for improving parameter accuracy is illustrated by two representative models in biology, a simple cascade model and a negative feedback model, in comparison with previous numerical methods. Finally, the limits and extensions of the authors' method are discussed in terms of the possible power of 'Bruno force' for the development of a new horizon in parameter estimation.
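The elimination step at the heart of such symbolic methods can be illustrated on a polynomial toy system (algebraic rather than differential; the system and numbers are invented for illustration). A lexicographic-order Gröbner basis eliminates x from {xy - 1, x² + y² - 4}, yielding the x-free quartic y⁴ - 4y² + 1 = 0 together with the back-substitution x = 4y - y³. The sketch below verifies that claim numerically with stdlib Python only.

```python
from math import sqrt

# Toy system:  x*y = 1,  x^2 + y^2 = 4.
# Lex-order Groebner elimination of x (e.g. with a CAS) produces
#   y^4 - 4*y^2 + 1 = 0      (x eliminated)
#   x = 4*y - y^3            (back-substitution)
# We check that every root of the eliminated quartic, combined with the
# back-substitution, solves the original system.

# y^2 solves u^2 - 4*u + 1 = 0  ->  u = 2 +/- sqrt(3)
roots_y = []
for u in (2 + sqrt(3), 2 - sqrt(3)):
    roots_y += [sqrt(u), -sqrt(u)]

residuals = []
for y in roots_y:
    x = 4 * y - y ** 3                 # back-substituted x
    residuals.append((abs(x * y - 1), abs(x * x + y * y - 4)))
```

Eliminating variables in this way is what reduces a coupled model to smaller relations against which parameters can be fitted one at a time.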
Symbolic Processing Combined with Model-Based Reasoning
NASA Technical Reports Server (NTRS)
James, Mark
2009-01-01
A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.
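The division of labor described above, numerical detection of unmodeled events feeding symbolic interpretation, can be sketched minimally. The sensor names, thresholds, and rule table below are invented for illustration and are not taken from the software or from BEAM.

```python
# Numerical side: a model-predicted sensor reading is compared with the
# measurement; a residual beyond threshold flags an *unmodeled* event.
def residual_check(predicted, measured, threshold=0.5):
    return abs(predicted - measured) > threshold

# Symbolic side: rules correlate the set of flagged sensors with a
# discrete system state (hypothetical rule table).
RULES = {
    frozenset(): "NOMINAL",
    frozenset({"pressure"}): "LEAK_SUSPECTED",
    frozenset({"pressure", "temperature"}): "PUMP_FAILURE",
}

def diagnose(predictions, measurements):
    flagged = frozenset(
        name for name in predictions
        if residual_check(predictions[name], measurements[name])
    )
    return RULES.get(flagged, "UNKNOWN_STATE")

state = diagnose(
    predictions={"pressure": 101.3, "temperature": 20.0},
    measurements={"pressure": 95.0, "temperature": 20.1},
)
```

Here the numeric layer never needs a complete high-fidelity model, and the symbolic layer qualifies the numeric result into a discrete state prediction, which is the combination the abstract describes.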
NASA Technical Reports Server (NTRS)
Remington, Roger; Williams, Douglas
1986-01-01
Three single-target visual search tasks were used to evaluate a set of cathode-ray tube (CRT) symbols for a helicopter situation display. The search tasks were representative of the information extraction required in practice, and reaction time was used to measure the efficiency with which symbols could be located and identified. Familiar numeric symbols were responded to more quickly than graphic symbols. The addition of modifier symbols, such as a nearby flashing dot or a surrounding square, had a greater disruptive effect on the graphic symbols than on the numeric characters. The results suggest that a symbol set is, in some respects, like a list that must be learned. Factors that affect the time to identify items in a memory task, such as familiarity and visual discriminability, also affect the time to identify symbols. This analogy has broad implications for the design of symbol sets. An attempt was made to model information access with this class of display.
White, Christine M; Lillico, Heather G; Vanderlee, Lana; Hammond, David
2016-12-01
Health Check (HC) was a voluntary nutrition labeling program developed by the Heart and Stroke Foundation of Canada as a guide to help consumers choose healthy foods. Items meeting nutrient criteria were identified with a HC symbol. This study examined the impact of the program on differences in consumer awareness and use of nutritional information in restaurants. Exit surveys were conducted with 1126 patrons outside four HC and four comparison restaurants in Ontario, Canada (2013). Surveys assessed participant noticing of nutrition information, influence of nutrition information on menu selection, and nutrient intake. Significantly more patrons at HC restaurants noticed nutrition information than at comparison restaurants (34.2% vs. 28.1%; OR = 1.39; p = 0.019); however, only 5% of HC restaurant patrons recalled seeing the HC symbol. HC restaurant patrons were more likely to say that their order was influenced by nutrition information (10.9% vs. 4.5%; OR = 2.96, p < 0.001); and consumed less saturated fat and carbohydrates, and more protein and fibre (p < 0.05). Approximately 15% of HC restaurant patrons ordered HC approved items; however, only 1% ordered a HC item and mentioned seeing the symbol in the restaurant in an unprompted recall task, and only 4% ordered a HC item and reported seeing the symbol on the item when asked directly. The HC program was associated with greater levels of noticing and influence of nutrition information, and more favourable nutrient intake; however, awareness of the HC program was very low and differences most likely reflect the type of restaurants that "self-selected" into the program.
The Precedence of Global Features in the Perception of Map Symbols
1988-06-01
be continually updated. The present study evaluated the feasibility of a serial model of visual processing. By comparing performance between a symbol... symbols, is based on a "filtering" procedure, consisting of a series of passive-to-active or global-to-local stages. Navon (1977, 1981a) has proposed a... packages or segments. This advances the earlier, static feature aggregation approaches to comprise a "figure." According to the global precedence model
Detail view of lamp in law library; Jennewein modeled symbols ...
Detail view of lamp in law library; Jennewein modeled symbols of the four seasons on the lamp's aluminum supports - United States Department of Justice, Constitution Avenue between Ninth & Tenth Streets, Northwest, Washington, District of Columbia, DC
Giannini, A J; Giannini, J N; Condon, M
2000-07-01
Medieval and Renaissance teaching techniques using linkage between course content and tangentially related visual symbols were applied to the teaching of the pharmacological principles of addiction. Forty medical students randomly divided into two blinded groups viewed a lecture. One lecture was supplemented by symbolic slides, and the second was not. Students who viewed symbolic slides had significantly higher scores in a written 15-question multiple-choice test 30 days after the lecture. These results were consistent with learning and semiotic models. These models hypothesize a linkage between conceptual content and perception of visual symbols that thereby increases conceptual retention. Recent neurochemical research supports the existence of a linkage between two chemically distinct memory systems. Simultaneous stimulation of both chemical systems by teaching formats similar to those employed in the study can augment neurochemical signaling in the neocortex.
Two-layer symbolic representation for stochastic models with phase-type distributed events
NASA Astrophysics Data System (ADS)
Longo, Francesco; Scarpa, Marco
2015-07-01
Among the techniques that have been proposed for the analysis of non-Markovian models, the state space expansion approach showed great flexibility in terms of modelling capacities. The principal drawback is the explosion of the state space. This paper proposes a two-layer symbolic method for efficiently storing the expanded reachability graph of a non-Markovian model in the case in which continuous phase-type distributions are associated with the firing times of system events, and different memory policies are considered. At the lower layer, the reachability graph is symbolically represented in the form of a set of Kronecker matrices, while, at the higher layer, all the information needed to correctly manage event memory is stored in a multi-terminal multi-valued decision diagram. This information is collected by applying a symbolic algorithm based on a pair of theorems. The efficiency of the proposed approach, in terms of memory occupation and execution time, is shown by applying it to a set of non-Markovian stochastic Petri nets and comparing it with a classical explicit expansion algorithm. Moreover, a comparison with a classical symbolic approach is performed whenever possible.
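The Kronecker idea at the lower layer can be shown in miniature: the generator of a model composed of independent components is represented as a sum of Kronecker products of the small component matrices, so the full state-space matrix never has to be enumerated explicitly. The two 2-state components and their rates below are invented for illustration.

```python
def kron(A, B):
    """Kronecker product of two matrices given as lists of lists."""
    return [[a * b for a in ra for b in rb] for ra in A for rb in B]

def mat_add(A, B):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

I2 = [[1, 0], [0, 1]]

# Two independent 2-state components, each with a CTMC generator
# (rows sum to zero).
Q1 = [[-0.3, 0.3], [0.5, -0.5]]
Q2 = [[-1.0, 1.0], [2.0, -2.0]]

# The 4-state composed generator is *represented* as a sum of Kronecker
# terms over the component matrices: Q = Q1 (x) I + I (x) Q2.
Q = mat_add(kron(Q1, I2), kron(I2, Q2))
```

For k components with n states each, storing the component matrices costs O(k·n²) instead of O(n^(2k)) for the flat matrix, which is the saving the symbolic reachability-graph representation exploits.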
ERIC Educational Resources Information Center
Romi, Shlomo; Teichman, Meir
1995-01-01
Discusses a training program for youth counselors aimed to improve counselors' self-efficacy and ability to cope with stressful situations. Two versions of the program were evaluated: one based on participant modeling, the other on symbolic modeling. Self-efficacy of subjects on the participant modeling increased compared to that of the subjects…
Simulations of Stagewise Development with a Symbolic Architecture
NASA Astrophysics Data System (ADS)
Gobet, Fernand
This chapter compares Piaget's theory of development with Feigenbaum & Simon's (1962; 1984) EPAM theory. An attempt is made to map the concepts of assimilation and accommodation in Piaget's theory onto the concepts of familiarisation and accommodation in EPAM. An EPAM-like model of the balance scale task is then presented, with a discussion of preliminary results showing how it accounts for children's discontinuous, stage-like development. The analysis focuses on the transition between rules, using catastrophe flags (Gilmore, 1981) as criteria. It is argued that some symbolic models may be described as dynamical systems, in the same way as some non-symbolic models.
Modification of social withdrawal through symbolic modeling
O'Connor, Robert D.
1969-01-01
The present experiment was designed to test the efficacy of symbolic modeling as a treatment to enhance social behavior in preschool isolates. Nursery school children who displayed marked social withdrawal were assigned to one of two conditions. One group observed a film depicting increasingly more active social interactions between children with positive consequences ensuing in each scene, while a narrative soundtrack emphasized the appropriate behavior of the models. A control group observed a film that contained no social interaction. Control children displayed no change in withdrawal behavior, whereas those who had the benefit of symbolic modeling increased their level of social interaction to that of non-isolate nursery school children. PMID:16795196
A Symbolic Model of the Nonconscious Acquisition of Information.
ERIC Educational Resources Information Center
Ling, Charles X.; Marinov, Marin
1994-01-01
Challenges Smolensky's theory that human intuitive/nonconscious cognitive processes can only be accurately explained in terms of subsymbolic computations in artificial neural networks. Symbolic learning models of two cognitive tasks involving nonconscious acquisition of information are presented: learning production rules and artificial finite…
The Source of the Symbolic Numerical Distance and Size Effects
Krajcsi, Attila; Lengyel, Gábor; Kojouharova, Petia
2016-01-01
Human number understanding is thought to rely on the analog number system (ANS), working according to Weber’s law. We propose an alternative account, suggesting that symbolic mathematical knowledge is based on a discrete semantic system (DSS), a representation that stores values in a semantic network, similar to the mental lexicon or to a conceptual network. Here, focusing on the phenomena of numerical distance and size effects in comparison tasks, first we discuss how a DSS model could explain these numerical effects. Second, we demonstrate that the DSS model can give quantitatively as appropriate a description of the effects as the ANS model. Finally, we show that symbolic numerical size effect is mainly influenced by the frequency of the symbols, and not by the ratios of their values. This last result suggests that numerical distance and size effects cannot be caused by the ANS, while the DSS model might be the alternative approach that can explain the frequency-based size effect. PMID:27917139
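The ANS prediction the authors test against can be stated compactly: if each number is read out with Gaussian noise on a log scale (Weber's law), then comparison error depends only on the ratio of the two numbers. A small sketch of that baseline model, with an assumed noise level:

```python
from math import erf, log, sqrt

def ans_error(n, m, sigma=0.25):
    """Predicted comparison error under a log-Gaussian ANS model: each
    number is read out with Gaussian noise (sd sigma, an assumed value)
    on a log scale, so discriminability depends only on the ratio n/m."""
    d = abs(log(n) - log(m)) / (sigma * sqrt(2.0))
    return 0.5 * (1.0 - erf(d / sqrt(2.0)))   # P(noisy order is wrong)

# Same ratio -> same predicted error: the ANS size effect is purely
# ratio-driven, which is exactly what the frequency-based DSS account
# disputes for symbolic comparison.
e_small = ans_error(4, 5)
e_large = ans_error(40, 50)

# Larger distance at similar magnitude -> lower error (distance effect).
e_near = ans_error(5, 6)
e_far = ans_error(5, 9)
```

The paper's finding that the symbolic size effect tracks symbol frequency rather than value ratio is what this ratio-only baseline cannot reproduce.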
A Prototype Symbolic Model of Canonical Functional Neuroanatomy of the Motor System
Rubin, Daniel L.; Halle, Michael; Musen, Mark; Kikinis, Ron
2008-01-01
Recent advances in bioinformatics have opened entire new avenues for organizing, integrating and retrieving neuroscientific data, in a digital, machine-processable format, which can be at the same time understood by humans, using ontological, symbolic data representations. Declarative information stored in ontological format can be perused and maintained by domain experts, interpreted by machines, and serve as basis for a multitude of decision-support, computerized simulation, data mining, and teaching applications. We have developed a prototype symbolic model of canonical neuroanatomy of the motor system. Our symbolic model is intended to support symbolic lookup, logical inference and mathematical modeling by integrating descriptive, qualitative and quantitative functional neuroanatomical knowledge. Furthermore, we show how our approach can be extended to modeling impaired brain connectivity in disease states, such as common movement disorders. In developing our ontology, we adopted a disciplined modeling approach, relying on a set of declared principles, a high-level schema, Aristotelian definitions, and a frame-based authoring system. These features, along with the use of the Unified Medical Language System (UMLS) vocabulary, enable the alignment of our functional ontology with an existing comprehensive ontology of human anatomy, and thus allow for combining the structural and functional views of neuroanatomy for clinical decision support and neuroanatomy teaching applications. Although the scope of our current prototype ontology is limited to a particular functional system in the brain, it may be possible to adapt this approach for modeling other brain functional systems as well. PMID:18164666
Socialization by Way of Symbolic Interactionism and Culture Theory: A Communication Perspective.
ERIC Educational Resources Information Center
Hartley, Karen C.
While not presuming to present a model of organizational socialization that is complete and totally accurate, this paper examines organizational socialization in a new way through the lenses of symbolic interactionism and culture theory. The first section of the paper describes the basic tenets of symbolic interactionism and how these have been…
A Self-Stabilizing Hybrid-Fault Tolerant Synchronization Protocol
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2014-01-01
In this report we present a strategy for solving the Byzantine generals problem for self-stabilizing a fully connected network from an arbitrary state and in the presence of any number of faults with various severities, including any number of arbitrary (Byzantine) faulty nodes. Our solution applies to realizable systems, while allowing for differences in the network elements, provided that the number of arbitrary faults is not more than a third of the network size. The only constraint on the behavior of a node is that the interactions with other nodes are restricted to defined links and interfaces. Our solution does not rely on assumptions about the initial state of the system, and no central clock or centrally generated signal, pulse, or message is used. Nodes are anonymous, i.e., they do not have unique identities. We also present a mechanical verification of a proposed protocol. A bounded model of the protocol is verified using the Symbolic Model Verifier (SMV). The model checking effort is focused on verifying correctness of the bounded model of the protocol as well as confirming claims of determinism and linear convergence with respect to the self-stabilization period. We believe that our proposed solution solves the general case of the clock synchronization problem.
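The fault bound f < n/3 can be illustrated with a classic approximate-agreement round: each node discards the f lowest and f highest values it receives and takes the midpoint of the rest. This is a generic toy simulation of convergent clock-value voting, not the protocol verified in the report.

```python
import random

def trimmed_midpoint(values, f):
    """Discard the f lowest and f highest received values, then take the
    midpoint of what remains (fault-tolerant midpoint)."""
    s = sorted(values)[f:len(values) - f]
    return (s[0] + s[-1]) / 2.0

def run_sync(rounds=40, f=1, seed=1):
    """n = 3f + 1 nodes, f of them Byzantine, sending arbitrary
    per-receiver random values; good nodes start with arbitrary clocks."""
    random.seed(seed)
    n = 3 * f + 1
    good = [random.uniform(0.0, 10.0) for _ in range(n - f)]
    for _ in range(rounds):
        new = []
        for _receiver in good:
            # Each good node receives all good values plus f arbitrary
            # (possibly different per receiver) Byzantine values.
            received = good + [random.uniform(-5.0, 15.0) for _ in range(f)]
            new.append(trimmed_midpoint(received, f))
        good = new
    return good

clocks = run_sync()
spread = max(clocks) - min(clocks)
```

Because trimming f extremes from n ≥ 3f + 1 values leaves a range bounded by correct nodes' values, the diameter of the good clocks at least halves each round regardless of what the faulty nodes send, giving the linear-time convergence flavor claimed for the protocol.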
Lee, Boon-Ooi; Kirmayer, Laurence J.; Groleau, Danielle
2016-01-01
This study focuses on the therapeutic process and perceived helpfulness of dang-ki, a form of Chinese shamanistic healing, in Singapore. It aims to understand the healing symbols employed in dang-ki, whether or not patients find them helpful and whether their perceived helpfulness can be explained by the symbolic healing model (Dow, Am Anthropol 88(1):56–69, 1986; Levi-Strauss, Structural anthropology. Basic Books, New York, 1963). Although many researchers have applied this model to explain the efficacy of shamanistic healings, they did not directly provide empirical support. Furthermore, the therapeutic process of a shared clinical reality as proposed by the model may be achievable in small-scale traditional societies that are culturally more homogeneous than in contemporary societies that are culturally more diversified due to globalization and immigration. Patients may hold multidimensional health belief systems, as biomedicine and alternative healing systems coexist. Thus, it would be interesting to see the relevance and applicability of the symbolic healing model to shamanistic healing in contemporary societies. In this study, ethnographic interviews were conducted with 21 patients over three stages: immediately before and after the healing and approximately 1 month later. The dang-ki healing symbols were identified by observing the healing sessions with video recording. Results show that dang-kis normally applied more than one method to treat a given problem. These methods included words, talismans and physical manipulations. Overall, 11 patients perceived their consultations as helpful, 4 perceived their consultations as helpful but were unable to follow all recommendations, 5 were not sure of the outcome because they had yet to see any concrete results and only 1 patient considered his consultation unhelpful. 
Although the symbolic healing model provides a useful framework to understand perceived helpfulness, processes such as enactment of a common meaning system and symbolic transformation are complex and dynamic, and may be carried over several healing sessions. PMID:20012176
Bond graph modelling of multibody dynamics and its symbolic scheme
NASA Astrophysics Data System (ADS)
Kawase, Takehiko; Yoshimura, Hiroaki
A bond graph method for modeling multibody dynamics is demonstrated. Specifically, a symbolic generation scheme that fully utilizes the bond graph information is presented. It is also demonstrated that structural understanding and representation in bond graph theory are quite powerful for modeling such large-scale systems, and that the nonenergic multiport of the junction structure, which is a multiport expression of the system structure, plays an important role, as first suggested by Paynter. The principal part of the proposed symbolic scheme, namely the elimination of excess variables, is carried out through tearing and interconnection in the sense of Kron, using newly defined causal and causal coefficient arrays.
Effects of Symbolic Modeling on Children's Interpersonal Aggression.
ERIC Educational Resources Information Center
Liebert, Robert M.; Baron, Robert A.
Does exposure to symbolically modeled aggression (aggression in cartoons, movies, stories and simulated television programs) increase children's willingness to engage in behavior which might actually harm another human being? This paper presents a summary of three recent experiments offering affirmative answers to the question. A fourth experiment…
Toward a self-organizing pre-symbolic neural model representing sensorimotor primitives.
Zhong, Junpei; Cangelosi, Angelo; Wermter, Stefan
2014-01-01
The acquisition of symbolic and linguistic representations of sensorimotor behavior is a cognitive process performed by an agent when it is executing and/or observing own and others' actions. According to Piaget's theory of cognitive development, these representations develop during the sensorimotor stage and the pre-operational stage. We propose a model that relates the conceptualization of the higher-level information from visual stimuli to the development of ventral/dorsal visual streams. This model employs neural network architecture incorporating a predictive sensory module based on an RNNPB (Recurrent Neural Network with Parametric Biases) and a horizontal product model. We exemplify this model through a robot passively observing an object to learn its features and movements. During the learning process of observing sensorimotor primitives, i.e., observing a set of trajectories of arm movements and its oriented object features, the pre-symbolic representation is self-organized in the parametric units. These representational units act as bifurcation parameters, guiding the robot to recognize and predict various learned sensorimotor primitives. The pre-symbolic representation also accounts for the learning of sensorimotor primitives in a latent learning context.
NASA Astrophysics Data System (ADS)
Kuvich, Gary
2003-08-01
Vision is part of a larger information system that converts visual information into knowledge structures. These structures drive the vision process, resolve ambiguity and uncertainty via feedback projections, and provide image understanding, that is, an interpretation of visual information in terms of such knowledge models. The human brain is found to emulate knowledge structures in the form of network-symbolic models, which implies an important paradigm shift in our understanding of the brain, from neural networks to "cortical software". Symbols, predicates, and grammars naturally emerge in such active multilevel hierarchical networks, and logic is simply a way of restructuring such models. The brain analyzes an image as a graph-type decision structure created via multilevel hierarchical compression of visual information. Mid-level vision processes such as clustering, perceptual grouping, and separation of figure from ground are special kinds of graph/network transformations. They convert low-level image structure into a set of more abstract structures that represent objects and the visual scene, making them easier to analyze with higher-level knowledge structures. Higher-level vision phenomena are the results of such analysis. Composition of network-symbolic models works similarly to frames and agents, combining learning, classification, and analogy with higher-level model-based reasoning in a single framework. Such models do not require supercomputers. Based on these principles, and using methods of computational intelligence, an image understanding system can convert images into network-symbolic knowledge models and effectively resolve uncertainty and ambiguity, providing a unifying representation for perception and cognition. This allows the creation of new intelligent computer vision systems for the robotics and defense industries.
NASA Astrophysics Data System (ADS)
Banou, Emilia
In this article a previously proposed interpretation of the Minoan 'horns of consecration' as a symbol of the sun is reexamined. A clay model of 'horns of consecration' from the peak sanctuary of Petsophas, the results of astronomical research on Minoan peak sanctuaries, the idols of the so-called 'Goddess with Upraised Arms', and a clay model of 'horns of consecration' from the Mycenaean cemetery of Tanagra are put forward as evidence for a possible adoption by the Minoans (and by the Mycenaeans, at least after 1400 B.C.), or a parallel development under the influence of adjacent cultures, of religious notions related to the Egyptian symbols of the 'mountain' and the 'horizon', both connected with the Sun in Egyptian cosmology and religion. It is concluded that the 'horns of consecration' may represent a practical device as well as an abstract symbol of the Sun, a symbol of catholic importance, which embraced many aspects of Minoan religious activities as represented in Minoan iconography.
Nigam, Ravi; Schlosser, Ralf W; Lloyd, Lyle L
2006-09-01
Matrix strategies employing parts of speech arranged in systematic language matrices and milieu language teaching strategies have been successfully used to teach word combining skills to children who have cognitive disabilities and some functional speech. The present study investigated the acquisition and generalized production of two-term semantic relationships in a new population using new types of symbols. Three children with cognitive disabilities and little or no functional speech were taught to combine graphic symbols. The matrix strategy and the mand-model procedure were used concomitantly as intervention procedures. A multiple probe design across sets of action-object combinations with generalization probes of untrained combinations was used to teach the production of graphic symbol combinations. Results indicated that two of the three children learned the early syntactic-semantic rule of combining action-object symbols and demonstrated generalization to untrained action-object combinations and generalization across trainers. The results and future directions for research are discussed.
Linguistic Mediation of Children's Performance in a New Symbolic Understanding Task
ERIC Educational Resources Information Center
Homer, Bruce D.; Petroff, Natalya; Hayward, Elizabeth O.
2013-01-01
The effects of language on symbolic functioning were examined using the "boxes task," a new symbolic understanding task based on DeLoache's model task. Children ("N" = 32; ages 2;4--3;8) observed an object being hidden in a stack of four boxes and were then asked to retrieve a similar object in the same location from a set of…
1986-06-01
model of the self-evaluation process as it differs from the evaluation process used by superiors. Symbolic Interactionism One view of self-assessment is...supplied by the symbolic interactionists (Cooley, 1902; Mead, 1934), who state that self perceptions are generated largely from individuals...disagreements remained even immediately after an appraisal interview in which a great deal of feedback was given. Research on the symbolic interactionist
Schemas in Problem Solving: An Integrated Model of Learning, Memory, and Instruction
1992-01-01
article: "Hybrid Computation in Cognitive Science: Neural Networks and Symbols" (J. A. Anderson, 1990). And, Marvin Minsky echoes the sentiment in his...distributed processing: A handbook of models, programs, and exercises. Cambridge, MA: The MIT Press. Minsky, M. (1991). Logical versus analogical or symbolic
Trudeau, Natacha; Sutton, Ann; Morford, Jill P
2014-09-01
While research on spoken language has a long tradition of studying and contrasting language production and comprehension, the study of graphic symbol communication has focused more on production than comprehension. As a result, the relationships between the ability to construct and to interpret graphic symbol sequences are not well understood. This study explored the use of graphic symbol sequences in children without disabilities aged 3;0 to 6;11 (years; months) (n=111). Children took part in nine tasks that systematically varied input and output modalities (speech, action, and graphic symbols). Results show that in 3- and 4-year-olds, attributing meaning to a sequence of symbols was particularly difficult even when the children knew the meaning of each symbol in the sequence. Similarly, while even 3- and 4-year-olds could produce a graphic symbol sequence following a model, transposing a spoken sentence into a graphic sequence was more difficult for them. Representing an action with graphic symbols was difficult even for 5-year-olds. Finally, the ability to comprehend graphic-symbol sequences preceded the ability to produce them. These developmental patterns, as well as memory-related variables, should be taken into account in choosing intervention strategies with young children who use AAC.
Wolfberg, Pamela; DeWitt, Mila; Young, Gregory S; Nguyen, Thanh
2015-03-01
Children with autism spectrum disorders (ASD) face pervasive challenges in symbolic and social play development. The Integrated Play Groups (IPG) model provides intensive guidance for children with ASD to participate with typical peers in mutually engaging experiences in natural settings. This study examined the effects of a 12-week IPG intervention on the symbolic and social play of 48 children with ASD using a repeated measures design. The findings revealed significant gains in symbolic and social play that generalized to unsupported play with unfamiliar peers. Consistent with prior studies, the outcomes provide robust and compelling evidence that further validate the efficacy of the IPG model. Theoretical and practical implications for maximizing children's developmental potential and social inclusion in play are discussed.
Huffman scanning: using language models within fixed-grid keyboard emulation☆
Roark, Brian; Beckley, Russell; Gibbons, Chris; Fried-Oken, Melanie
2012-01-01
Individuals with severe motor impairments commonly enter text using a single binary switch and symbol scanning methods. We present a new scanning method, Huffman scanning, which uses Huffman coding to select the symbols to highlight during scanning, thus minimizing the expected bits per symbol. With our method, the user can select the intended symbol even after switch activation errors. We describe two varieties of Huffman scanning, synchronous and asynchronous, and present experimental results demonstrating speedups over row/column and linear scanning. PMID:24244070
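The core idea of Huffman scanning, giving likelier symbols shorter highlight sequences, can be sketched with a textbook Huffman construction over symbol frequencies (a generic Python sketch; the paper's switch-error handling and scan-group details are not reproduced):

```python
import heapq
from collections import Counter

def huffman_codes(freqs):
    """Build a Huffman code (symbol -> bitstring) from symbol frequencies."""
    # Heap entries are (weight, tiebreak, tree); a tree is a symbol
    # or a (left, right) pair. The tiebreak keeps comparisons well defined.
    heap = [(w, i, s) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")  # left branch = first scan choice
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"
    walk(heap[0][2], "")
    return codes

freqs = Counter("the quick brown fox jumps over the lazy dog")
codes = huffman_codes(freqs)
# Expected switch activations (bits) per selected symbol:
avg_bits = sum(freqs[s] * len(codes[s]) for s in freqs) / sum(freqs.values())
```

Frequent symbols (the space, in this toy corpus) receive the shortest codes, i.e. the fewest expected switch activations, which is the effect the scanning method exploits.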
Color and symbology: symbolic systems of color ordering
NASA Astrophysics Data System (ADS)
Varela, Diana
2002-06-01
Color has been used symbolically in various fields, such as Heraldry, Music, Liturgy, Alchemy, Art and Literature. In this study, we shall investigate and analyse the structures of relationships that have taken shape as symbolic systems within each specific area of analysis. We shall discuss the most significant symbolic fields and their systems of color ordering, considering each one of them as a topological model based on a logic that determines the total organization, according to the scale of reciprocities applied, and the cultural context that gives it meaning.
Inducing any virtual two-dimensional movement in humans by applying muscle tendon vibration.
Roll, Jean-Pierre; Albert, Frédéric; Thyrion, Chloé; Ribot-Ciscar, Edith; Bergenheim, Mikael; Mattei, Benjamin
2009-02-01
In humans, tendon vibration evokes illusory sensation of movement. We developed a model mimicking the muscle afferent patterns corresponding to any two-dimensional movement and checked its validity by inducing writing illusory movements through specific sets of muscle vibrators. Three kinds of illusory movements were compared. The first was induced by vibration patterns copying the responses of muscle spindle afferents previously recorded by microneurography during imposed ankle movements. The two others were generated by the model. Sixteen different vibratory patterns were applied to 20 motionless volunteers in the absence of vision. After each vibration sequence, the participants were asked to name the corresponding graphic symbol and then to reproduce the illusory movement perceived. Results showed that the afferent patterns generated by the model were very similar to those recorded microneurographically during actual ankle movements (r=0.82). The model was also very efficient for generating afferent response patterns at the wrist level, if the preferred sensory directions of the wrist muscle groups were first specified. Using recorded and modeled proprioceptive patterns to pilot sets of vibrators placed at the ankle or wrist levels evoked similar illusory movements, which were correctly identified by the participants in three quarters of the trials. Our proprioceptive model, based on neurosensory data recorded in behaving humans, should then be a useful tool in fields of research such as sensorimotor learning, rehabilitation, and virtual reality.
Symbolic dynamics techniques for complex systems: Application to share price dynamics
NASA Astrophysics Data System (ADS)
Xu, Dan; Beck, Christian
2017-05-01
The symbolic dynamics technique is well known for low-dimensional dynamical systems and chaotic maps, and lies at the roots of the thermodynamic formalism of dynamical systems. Here we show that this technique can also be successfully applied to time series generated by complex systems of much higher dimensionality. Our main example is the investigation of share price returns in a coarse-grained way. A nontrivial spectrum of Rényi entropies is found. We study how the spectrum depends on the time scale of returns, the sector of stocks considered, as well as the number of symbols used for the symbolic description. Overall our analysis confirms that in the symbol space transition probabilities of observed share price returns depend on the entire history of previous symbols, thus emphasizing the need for a modelling based on non-Markovian stochastic processes. Our method allows for quantitative comparisons of entirely different complex systems, for example the statistics of symbol sequences generated by share price returns using 4 symbols can be compared with that of genomic sequences.
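The coarse-graining step described above can be illustrated generically: map a real-valued series to symbols via thresholds, then estimate Rényi entropies from the frequencies of symbol words (a minimal sketch; the authors' estimator, data, and threshold choices are not reproduced):

```python
import math
from collections import Counter

def symbolize(series, thresholds=(0.0,)):
    """Map a real-valued series to integer symbols via ordered thresholds."""
    return [sum(x > t for t in thresholds) for x in series]

def renyi_entropy(symbols, q, word_len=3):
    """Order-q Renyi entropy estimated from symbol-word frequencies."""
    words = Counter(tuple(symbols[i:i + word_len])
                    for i in range(len(symbols) - word_len + 1))
    n = sum(words.values())
    probs = [c / n for c in words.values()]
    if q == 1:  # Shannon limit of the Renyi family
        return -sum(p * math.log(p) for p in probs)
    return math.log(sum(p ** q for p in probs)) / (1 - q)

# Toy "returns": a strictly alternating series vs. a longer-period one.
flat = symbolize([(-1) ** i for i in range(200)])
varied = symbolize([((i * 37) % 11) - 5 for i in range(200)])
h_flat = renyi_entropy(flat, 2)      # only two distinct 3-symbol words
h_varied = renyi_entropy(varied, 2)  # richer word statistics
```

Scanning q traces out the entropy spectrum the abstract refers to; richer word statistics give higher entropy at every order.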
Symbolic modeling of human anatomy for visualization and simulation
NASA Astrophysics Data System (ADS)
Pommert, Andreas; Schubert, Rainer; Riemer, Martin; Schiemann, Thomas; Tiede, Ulf; Hoehne, Karl H.
1994-09-01
Visualization of human anatomy in a 3D atlas requires both spatial and more abstract symbolic knowledge. Within our 'intelligent volume' model which integrates these two levels, we developed and implemented a semantic network model for describing human anatomy. Concepts for structuring (abstraction levels, domains, views, generic and case-specific modeling, inheritance) are introduced. Model, tools for generation and exploration and applications in our 3D anatomical atlas are presented and discussed.
Decision-Tree Formulation With Order-1 Lateral Execution
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
A compact symbolic formulation enables mapping of an arbitrarily complex decision tree of a certain type into a highly computationally efficient multidimensional software object. The type of decision trees to which this formulation applies is that known in the art as the Boolean class of balanced decision trees. Parallel lateral slices of an object created by means of this formulation can be executed in constant time, considerably less time than would otherwise be required. Decision trees of various forms are incorporated into almost all large software systems. A decision tree is a way of hierarchically solving a problem, proceeding through a set of true/false responses to a conclusion. By definition, a decision tree has a tree-like structure, wherein each internal node denotes a test on an attribute, each branch from an internal node represents an outcome of a test, and leaf nodes represent classes or class distributions that, in turn, represent possible conclusions. The drawback of decision trees is that execution of them can be computationally expensive (and, hence, time-consuming) because each non-leaf node must be examined to determine whether to progress deeper into a tree structure or to examine an alternative. The present formulation was conceived as an efficient means of representing a decision tree and executing it in as little time as possible. The formulation involves the use of a set of symbolic algorithms to transform a decision tree into a multidimensional object, the rank of which equals the number of lateral non-leaf nodes. The tree can then be executed in constant time by means of an order-one table lookup. The sequence of operations performed by the algorithms is summarized as follows: 1. Determination of whether the tree under consideration can be encoded by means of this formulation. 2. Extraction of decision variables. 3. Symbolic optimization of the decision tree to minimize its form. 4. Expansion and transformation of all nested conjunctive-disjunctive paths to a flattened conjunctive form composed only of equality checks when possible. If each reduced conjunctive form contains only equality checks and all of these forms use the same variables, then the decision tree can be reduced to an order-one operation through a table lookup. The speedup to order one is accomplished by distributing each decision variable over a surface of a multidimensional object by mapping the equality constant to an index
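The flattening can be illustrated on a hypothetical balanced Boolean tree whose paths reduce to equality checks on the same variables; the tree then collapses into a table keyed on the tuple of equality constants, so execution becomes a single O(1) lookup (all names below are illustrative, not from the report):

```python
# The same balanced Boolean tree written twice: as nested conditionals,
# and flattened into an order-1 dictionary lookup keyed on the equality
# constants of each conjunctive path.

def classify_tree(mode, level):
    """Nested-conditional form: every non-leaf node is tested on the way down."""
    if mode == "auto":
        if level == 1:
            return "idle"
        return "run"
    if level == 1:
        return "standby"
    return "manual"

# Flattened form: path (mode == "auto") AND (level == 1) -> "idle", etc.
DECISION_TABLE = {
    ("auto", True): "idle",
    ("auto", False): "run",
    ("other", True): "standby",
    ("other", False): "manual",
}

def classify_table(mode, level):
    """Order-1 form: one tuple construction plus one hash lookup."""
    key = ("auto" if mode == "auto" else "other", level == 1)
    return DECISION_TABLE[key]
```

The lookup cost no longer depends on the tree's depth, which is the "order-1 lateral execution" the title refers to.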
ERIC Educational Resources Information Center
Anderson, Ronald B.
2000-01-01
Tests the impact of symbolic modeling and persuasive efficacy information on self-efficacy beliefs and intentions to perform breast self-examinations among 147 undergraduate students. Assesses the effects of these modes of efficacy induction on fear arousal and response-outcome expectations. Finds symbolic modeling engendered greater efficacy…
Kumst, S; Scarf, D
2015-01-01
The ability of children to delay gratification is correlated with a range of positive outcomes in adulthood, showing the potential impact of helping young children increase their competence in this area. This study investigated the influence of symbolic models on the self-control of 3-year old children. Eighty-three children were randomly assigned to one of three modelling conditions: personal storytelling, impersonal storytelling, and control. Children were tested on the delay-of-gratification maintenance paradigm both before and after being exposed to a symbolic model or control condition. Repeated measures ANOVA revealed no significant differences between the two storytelling groups and the control group, indicating that the symbolic models did not influence children's ability to delay gratification. A serendipitous finding showed a positive relationship between the ability of children to wait and their production and accurate use of temporal terms, which was more pronounced in girls than boys. This finding may be an indication that a higher temporal vocabulary is linked to a continuous representation of the self in time, facilitating a child's representation of the future-self receiving a larger reward than what the present-self could receive.
The concept of branding: is it relevant to nursing?
Dominiak, Mary C
2004-10-01
This concept exploration examines branding and its relevance to nursing. Branding is used to differentiate products through use of symbols. The symbols are the brands that are designed to communicate the value of products. Nursing has had many identifying symbols, such as the nurse's cap and the white uniform, but these symbols have failed to clearly communicate the essence of nursing. Lack of a distinct nursing brand has led to confusion about the discipline. The Roy adaptation model provides a view of branding as a process for clearly defining the profession, improving its image, and differentiating its role within the healthcare milieu.
Wang, Qing; Oostindjer, Marije; Amdam, Gro V; Egelandsdal, Bjørg
2016-02-01
Consumers tend to have the perception that healthy equals less tasty. This study aimed to identify whether information provided by the Keyhole symbol, a widely used front-of-package symbol in Nordic countries to indicate nutritional content, and percent daily values (%DVs) affect Norwegian adolescents' perception of the healthiness of snacks and their intention to buy them. Two tasks were used to evaluate adolescents' perception of snacks with the Keyhole symbol: with %DVs or with no nutrition label. A third task was used to test their ability to use %DVs (pairwise selections). A survey obtained personal attributes. Participants were 566 Norwegian adolescents. The main outcome measures were taste perception, health perception, and ability to use %DVs, analyzed with linear mixed models and logistic models that tested the effects of labels and personal attributes. The Keyhole symbol increased health perception without influencing taste perception of snacks. Norwegian adolescents had limited ability to use information from the %DVs correctly to identify healthier foods, but had a positive perception of the Keyhole symbols. Keyhole symbols, as a simple, heuristic front-of-package label, have potential as an information strategy that may influence self-efficacy in promoting healthy snack choices among adolescents. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
HMM-based lexicon-driven and lexicon-free word recognition for online handwritten Indic scripts.
Bharath, A; Madhvanath, Sriganesh
2012-04-01
Research on recognizing online handwritten words in Indic scripts is at an early stage compared to Latin and Oriental scripts. In this paper, we address this problem specifically for two major Indic scripts: Devanagari and Tamil. In contrast to previous approaches, the techniques we propose are largely data driven and script independent. We propose two different techniques for word recognition based on Hidden Markov Models (HMM): lexicon driven and lexicon free. The lexicon-driven technique models each word in the lexicon as a sequence of symbol HMMs according to a standard symbol writing order derived from the phonetic representation. The lexicon-free technique uses a novel Bag-of-Symbols representation of the handwritten word that is independent of symbol order and allows rapid pruning of the lexicon. On handwritten Devanagari word samples featuring both standard and nonstandard symbol writing orders, a combination of the lexicon-driven and lexicon-free recognizers significantly outperforms either of them used in isolation. In contrast, most Tamil word samples feature the standard symbol order, and the lexicon-driven recognizer outperforms the lexicon-free one as well as their combination. The best recognition accuracies obtained for 20,000-word lexicons are 87.13 percent for Devanagari when the two recognizers are combined, and 91.8 percent for Tamil using the lexicon-driven technique.
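Scoring a word modeled as a chain of symbol HMMs ultimately rests on standard Viterbi decoding; a minimal discrete-HMM sketch follows (toy states, transitions, and emissions, not the paper's models):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely state path for an observation sequence in a discrete HMM."""
    v = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        v.append({})
        back.append({})
        for s in states:
            # Best predecessor for state s at time t.
            best_prev = max(states, key=lambda p: v[t - 1][p] * trans_p[p][s])
            v[t][s] = v[t - 1][best_prev] * trans_p[best_prev][s] * emit_p[s][obs[t]]
            back[t][s] = best_prev
    last = max(states, key=lambda s: v[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):  # trace back pointers
        path.append(back[t][path[-1]])
    return list(reversed(path)), v[-1][last]

states = ("A", "B")
start_p = {"A": 0.6, "B": 0.4}
trans_p = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}}
emit_p = {"A": {"x": 0.9, "y": 0.1}, "B": {"x": 0.2, "y": 0.8}}
path, prob = viterbi(("x", "x", "y"), states, start_p, trans_p, emit_p)
```

In the lexicon-driven setting, each lexicon word would be one such chain, and the word with the highest path probability wins.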
Toward a Model Framework of Generalized Parallel Componential Processing of Multi-Symbol Numbers
ERIC Educational Resources Information Center
Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph
2015-01-01
In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining…
A/E/C CAD Standard, Release 5.0
2012-12-01
Excerpt: model file level/layer assignment tables for the Mechanical and Electrical disciplines, listing building systems (fire protection, halon, inert gas, smoke/pressurization control, natural gas, makeup air, miter gates) and electrical symbol element types.
File compression and encryption based on LLS and arithmetic coding
NASA Astrophysics Data System (ADS)
Yu, Changzhi; Li, Hengjian; Wang, Xiyu
2018-03-01
We propose a file compression model based on arithmetic coding. First, the original symbols to be encoded are input to the encoder one by one. We produce a set of chaotic sequences using the logistic and sine chaotic system (LLS), and the values of these chaotic sequences randomly modify the upper and lower limits of the current symbol's probability interval. To achieve encryption, we modify the upper and lower limits of all character probabilities when encoding each symbol. Experimental results show that the proposed model achieves data encryption while attaining almost the same compression efficiency as plain arithmetic coding.
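A minimal sketch of the two ingredients, assuming a simple composition of logistic and sine maps for the keystream (the paper's exact LLS construction and bound-modification rule are not reproduced):

```python
import math

def logistic_sine_stream(x0, r=3.99, n=16):
    """Keystream from composed logistic and sine maps (an assumed,
    simplified stand-in for the paper's LLS system)."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)              # logistic step
        x = abs(math.sin(math.pi * x))   # sine step
        xs.append(x)
    return xs

def perturb_bounds(low, high, key_value, eps=1e-6):
    """Shift an arithmetic-coding interval [low, high) by a tiny
    key-dependent amount while preserving its width inside [0, 1)."""
    width = high - low
    shift = (key_value - 0.5) * eps
    new_low = min(max(low + shift, 0.0), 1.0 - width)
    return new_low, new_low + width

stream = logistic_sine_stream(0.31415926)
lo, hi = perturb_bounds(0.25, 0.5, stream[0])
```

Because the interval width is preserved, the per-symbol code length (and hence compression efficiency) is essentially unchanged, while a decoder without the key x0 cannot reconstruct the perturbed intervals.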
Computerized symbolic manipulation in structural mechanics: Progress and potential
NASA Technical Reports Server (NTRS)
Noor, A. K.; Andersen, C. M.
1978-01-01
Status and recent applications of computerized symbolic manipulation to structural mechanics problems are summarized. The applications discussed include: (1) generation of characteristic arrays of finite elements; (2) evaluation of effective stiffness and mass coefficients of continuum models for repetitive lattice structures; and (3) application of the Rayleigh-Ritz technique to free vibration analysis of laminated composite elliptic plates. The major advantages of using computerized symbolic manipulation in each of these applications are outlined. A number of problem areas which limit the realization of the full potential of computerized symbolic manipulation in structural mechanics are examined, and some of the means of alleviating them are discussed.
NASA Astrophysics Data System (ADS)
Wiji, W.; Mulyani, S.
2018-05-01
The purpose of this study is to obtain a profile of students' mental models, misconceptions, troublesome knowledge, and threshold concepts in thermochemistry. The subjects of this study were 35 students. The method used was a descriptive method with the instrument Diagnostic Test of Mental Model - Prediction, Observation, and Explanation (DToM-POE). The results showed that the students' ability to predict, observe, and explain ΔH of the neutralization reaction of NaOH with HCl was still lacking. Most students tended to memorize chemical concepts related to the symbolic level without understanding the meaning of the symbols used. Furthermore, most students were unable to connect observations at the macroscopic level with the symbolic level to determine ΔH of the neutralization reaction of NaOH with HCl. Most students also tended to give an explanation via a net ionic equation or a chemical reaction equation at the symbolic level when asked to explain ΔH of the neutralization reaction at the submicroscopic level. In addition, students held seven misconceptions, three areas of troublesome knowledge, and three threshold concepts in thermochemistry.
On the symbolic manipulation and code generation for elasto-plastic material matrices
NASA Technical Reports Server (NTRS)
Chang, T. Y.; Saleeb, A. F.; Wang, P. S.; Tan, H. Q.
1991-01-01
A computerized procedure for symbolic manipulation and FORTRAN code generation of an elasto-plastic material matrix for finite element applications is presented. Special emphasis is placed on expression simplification during intermediate derivations, optimal code generation, and interfacing with the main program. A systematic procedure is outlined to avoid redundant algebraic manipulations. Symbolic expressions of the derived material stiffness matrix are automatically converted to RATFOR code, which is then translated into FORTRAN statements through a preprocessor. To minimize the interface problem with the main program, a template file is prepared so that the translated FORTRAN statements can be merged into the file to form a subroutine (or a submodule). Three constitutive models, namely von Mises plasticity, the Drucker-Prager model, and a concrete plasticity model, are used as illustrative examples.
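The derive-simplify-emit pipeline can be imitated with a modern computer algebra system. The sketch below uses SymPy's Fortran printer in place of the RATFOR preprocessor, and a toy plane-stress isotropic elasticity matrix in place of the paper's elasto-plastic matrix:

```python
from sympy import Matrix, fcode, simplify, symbols

E, nu = symbols("E nu", positive=True)
# Toy plane-stress isotropic elasticity matrix, standing in for the
# symbolically derived elasto-plastic matrix of the paper.
c = E / (1 - nu**2)
D = simplify(Matrix([[c,      c * nu, 0],
                     [c * nu, c,      0],
                     [0,      0,      c * (1 - nu) / 2]]))
# Emit one Fortran assignment per matrix entry (Fortran 95, free form),
# analogous to the RATFOR-to-FORTRAN translation step.
lines = [fcode(D[i, j], assign_to=f"D({i + 1},{j + 1})",
               standard=95, source_format="free")
         for i in range(3) for j in range(3)]
```

The generated statements could then be spliced into a subroutine template, mirroring the paper's template-file step.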
Chen, Xiurong; Zhao, Rubo
2017-01-01
In this paper, we study the cross-market effects of Brexit on the stock and bond markets of nine major countries. By incorporating information theory, we introduce time-varying impact weights based on symbolic transfer entropy to improve the traditional GARCH model. The empirical results show that under the influence of Brexit, flight-to-quality occurs not only between the stocks and bonds of each country but also simultaneously across countries. We also find that the accuracy of the time-varying symbolic transfer entropy GARCH model proposed in this paper is improved compared to the traditional GARCH model, indicating practical application value. PMID:28817712
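The core ingredient of the model above, symbolic transfer entropy, can be sketched independently of the GARCH machinery. The following is a minimal illustration, not the authors' implementation; the symbolization scheme (quantile binning), the plug-in estimator, and the simulated series standing in for bond and stock returns are all assumptions made for the example:

```python
import numpy as np
from collections import Counter

def symbolize(x, n_bins=3):
    """Map a real-valued series to discrete symbols by quantile binning."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(x, edges)

def transfer_entropy(x, y):
    """Plug-in estimate (in bits) of T_{Y->X} from symbol sequences x, y."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        # p(x1 | x0, y0) vs p(x1 | x0), weighted by the joint probability
        te += (c / n) * np.log2(
            (c / pairs_xy[(x0, y0)]) / (pairs_xx[(x1, x0)] / singles[x0])
        )
    return te

rng = np.random.default_rng(0)
bonds = rng.normal(size=2000)
stocks = np.roll(bonds, 1) + 0.5 * rng.normal(size=2000)  # bonds lead stocks
sb, ss = symbolize(bonds), symbolize(stocks)
te_bs = transfer_entropy(ss, sb)  # information flow bonds -> stocks
te_sb = transfer_entropy(sb, ss)  # information flow stocks -> bonds
print(round(te_bs, 3), round(te_sb, 3))
```

Directionality shows up as an asymmetry: the series that leads carries more information about the other's next symbol than vice versa. In the full model, such estimates would be computed on rolling windows to obtain time-varying impact weights.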
Dameron, O; Gibaud, B; Morandi, X
2004-06-01
Human cerebral cortical anatomy describes brain organization at the scale of gyri and sulci. These structures serve as landmarks for neurosurgery and as localization support for functional data analysis and inter-subject data comparison. Existing models of cortical anatomy either rely on image labeling but fail to represent variability and structural properties, or rely on a conceptual model but miss the inner 3D nature and relations of anatomical structures. This study was therefore conducted to propose a model of sulco-gyral anatomy for the healthy human brain. We hypothesized that both numeric knowledge (i.e., image-based) and symbolic knowledge (i.e., concept-based) have to be represented and coordinated. In addition, the representation of this knowledge should be application-independent in order to be usable in various contexts. Therefore, we devised a symbolic model describing the specialization, composition, and spatial organization of cortical anatomical structures. We also collected numeric knowledge, such as 3D models of shape and shape variation, about cortical anatomical structures. For each numeric piece of knowledge, a companion file describes the concept it refers to and the nature of the relationship. Demonstration software performs a mapping between the numeric and the symbolic aspects for browsing the knowledge base.
Trails of meaning construction: Symbolic artifacts engage the social brain.
Tylén, Kristian; Philipsen, Johanne Stege; Roepstorff, Andreas; Fusaroli, Riccardo
2016-07-01
Symbolic artifacts present a challenge to theories of neurocognitive processing due to their hybrid nature: they are at the same time physical objects and vehicles of intangible social meanings. While their physical properties can be read off their perceptual appearance, the meaning of symbolic artifacts depends on the perceiver's interpretative attitude and embeddedness in cultural practices. In this study, participants built models out of LEGO bricks to illustrate their understanding of abstract concepts. They were then scanned with fMRI while viewing photographs of their own and others' models. When participants attended to the meaning of the models rather than their bare physical properties, we observed activations in mPFC and TPJ, areas often associated with social cognition, and in IFG, possibly related to semantics. When contrasting own and others' models, we also found activations in precuneus, an area associated with autobiographical memory and agency, while looking at one's own collective models yielded interaction effects in rostral ACC, right IFG and left insula. Interestingly, variability in the insula was predicted by individual differences in participants' feeling of relatedness to their fellow group members during the LEGO construction activity. Our findings support a view of symbolic artifacts as neurocognitive trails of human social interactions. Copyright © 2016 Elsevier Inc. All rights reserved.
Spectral characteristics of convolutionally coded digital signals
NASA Technical Reports Server (NTRS)
Divsalar, D.
1979-01-01
The power spectral density of the output symbol sequence of a convolutional encoder is computed for two different input symbol stream source models, namely, an NRZ signaling format and a first order Markov source. In the former, the two signaling states of the binary waveform are not necessarily assumed to occur with equal probability. The effects of alternate symbol inversion on this spectrum are also considered. The mathematical results are illustrated with many examples corresponding to optimal performance codes.
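The kind of spectrum derived analytically in the paper can be checked numerically by simulating an encoder and estimating the PSD of its NRZ-mapped output. A small sketch, not the paper's derivation; the choice of the standard rate-1/2, constraint-length-3 code (generators 7 and 5 octal), the equiprobable source, and the averaged-periodogram estimator are all assumptions made for the example:

```python
import numpy as np

def conv_encode(bits, g=((1, 1, 1), (1, 0, 1))):
    """Rate-1/2 convolutional encoder (generators 7, 5 octal), zero initial state."""
    state = [0, 0]
    out = []
    for b in bits:
        regs = [b] + state                       # current input plus shift register
        for taps in g:
            out.append(sum(t * r for t, r in zip(taps, regs)) % 2)
        state = regs[:2]                          # shift the register
    return np.array(out)

def psd_estimate(symbols, seg=256):
    """Averaged-periodogram PSD of an NRZ (+/-1) mapped symbol stream."""
    x = 2.0 * symbols - 1.0
    segs = x[: len(x) // seg * seg].reshape(-1, seg)
    return np.mean(np.abs(np.fft.rfft(segs, axis=1)) ** 2, axis=0) / seg

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 4096)   # p = 0.5 source; unequal p changes the spectrum
spec = psd_estimate(conv_encode(bits))
print(spec.shape)
```

Replacing the source with a first-order Markov chain, or applying alternate symbol inversion to the encoder output, changes `spec` in ways that can be compared against the closed-form spectra in the paper.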
Addressing Dynamic Issues of Program Model Checking
NASA Technical Reports Server (NTRS)
Lerda, Flavio; Visser, Willem
2001-01-01
Model checking real programs has recently become an active research area. Programs however exhibit two characteristics that make model checking difficult: the complexity of their state and the dynamic nature of many programs. Here we address both these issues within the context of the Java PathFinder (JPF) model checker. Firstly, we will show how the state of a Java program can be encoded efficiently and how this encoding can be exploited to improve model checking. Next we show how to use symmetry reductions to alleviate some of the problems introduced by the dynamic nature of Java programs. Lastly, we show how distributed model checking of a dynamic program can be achieved, and furthermore, how dynamic partitions of the state space can improve model checking. We support all our findings with results from applying these techniques within the JPF model checker.
Symbolic Interactionism and Social Action Theory
ERIC Educational Resources Information Center
Morrione, Thomas J.
1975-01-01
An explanation and elaboration of existing theory on interaction, this article describes a point of convergence between Parsons' Voluntaristic Theory of Action and Blumer's conceptualization of Symbolic Interactionism and develops specific problems of divergence in these normative and interpretive models of interaction. (JC)
Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing
2017-01-01
Formal techniques have been devoted to analyzing whether network protocol specifications violate security policies; however, these methods cannot detect vulnerabilities in the implementations of the network protocols themselves. Symbolic execution can be used to analyze the paths of the network protocol implementations, but for stateful network protocols, it is difficult to reach the deep states of the protocol. This paper proposes a novel model-guided approach to detect vulnerabilities in network protocol implementations. Our method first abstracts a finite state machine (FSM) model, then utilizes the model to guide the symbolic execution. This approach achieves high coverage of both the code and the protocol states. The proposed method is implemented and applied to test numerous real-world network protocol implementations. The experimental results indicate that the proposed method is more effective than traditional fuzzing methods such as SPIKE at detecting vulnerabilities in the deep states of network protocol implementations.
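The model-guided idea can be illustrated with a toy finite state machine. Everything below is hypothetical; the protocol, its states, and its messages are invented for illustration. The point is only that enumerating FSM paths yields message prefixes that drive an implementation into deep states before symbolic exploration or mutation begins:

```python
import itertools

# Hypothetical FSM for a toy handshake protocol: (state, message) -> next state.
FSM = {
    ("INIT", "HELLO"): "GREETED",
    ("GREETED", "AUTH"): "AUTHED",
    ("AUTHED", "DATA"): "AUTHED",
    ("AUTHED", "QUIT"): "CLOSED",
}

def paths_to(target, max_len=4):
    """Enumerate message sequences driving the FSM from INIT to `target`."""
    msgs = sorted({m for (_, m) in FSM})
    found = []
    for n in range(1, max_len + 1):
        for seq in itertools.product(msgs, repeat=n):
            state = "INIT"
            for m in seq:
                state = FSM.get((state, m))
                if state is None:      # undefined transition: dead path
                    break
            if state == target:
                found.append(seq)
    return found

# Guidance: replay a valid prefix to reach a deep state, then explore from there.
deep = paths_to("AUTHED", max_len=3)
print(deep[0])
```

In the paper's setting the FSM is abstracted from the protocol rather than written by hand, and the prefixes seed symbolic execution instead of a replay loop, but the coverage argument is the same: without a state-reaching prefix, paths beyond the handshake are rarely explored.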
Busch, Fredric N
2017-01-01
In psychoanalytic theory, the importance of the actual neuroses, considered to be devoid of psychic content, diminished as Freud and subsequent analysts focused on unconscious intrapsychic conflict. This paper explores the relationship between actual neurotic and unrepresented states, which are believed to be best addressed through attention to countertransference, intersubjectivity, and enactments rather than through interpretation of intrapsychic conflict. Models suggesting how actual neurotic states and symbolized intrapsychic conflict may interact with each other and with environmental stressors are described. Symbolizing actual neurotic states and establishing meaningful linkages between somatic/affective experiences and intrapsychic conflict are viewed as necessary for effective treatment of many disorders. © 2017 The Psychoanalytic Quarterly, Inc.
ERIC Educational Resources Information Center
Bonnet, Lauren Kravetz
2012-01-01
This single-subject research study was designed to examine the effects of point-of-view video modeling (POVM) on the symbolic play actions and play-associated language of four preschool students with autism. A multiple baseline design across participants was conducted in order to evaluate the effectiveness of using POVM as an intervention for…
Program Model Checking: A Practitioner's Guide
NASA Technical Reports Server (NTRS)
Pressburger, Thomas T.; Mansouri-Samani, Masoud; Mehlitz, Peter C.; Pasareanu, Corina S.; Markosian, Lawrence Z.; Penix, John J.; Brat, Guillaume P.; Visser, Willem C.
2008-01-01
Program model checking is a verification technology that uses state-space exploration to evaluate large numbers of potential program executions. It provides improved coverage over testing by systematically evaluating all possible test inputs and all possible interleavings of threads in a multithreaded system. Model-checking algorithms use several classes of optimizations to reduce the time and memory requirements for analysis, as well as heuristics for meaningful analysis of partial areas of the state space. Our goal in this guidebook is to assemble, distill, and demonstrate emerging best practices for applying program model checking. We offer it as a starting point and introduction for those who want to apply model checking to software verification and validation. The guidebook does not discuss any specific tool in great detail, but we provide references for specific tools.
Symbol-and-Arrow Diagrams in Teaching Pharmacokinetics.
ERIC Educational Resources Information Center
Hayton, William L.
1990-01-01
Symbol-and-arrow diagrams are helpful adjuncts to equations derived from pharmacokinetic models. Both show relationships among dependent and independent variables. Diagrams show only qualitative relationships, but clearly show which variables are dependent and which are independent, helping students understand complex but important functional…
Use of symbolic computation in robotics education
NASA Technical Reports Server (NTRS)
Vira, Naren; Tunstel, Edward
1992-01-01
An application of symbolic computation in robotics education is described. A software package is presented which combines generality, user interaction, and user-friendliness with the systematic use of symbolic computation and artificial intelligence techniques. The software utilizes MACSYMA, a LISP-based symbolic algebra language, to automatically generate closed-form expressions representing forward and inverse kinematics solutions, the Jacobian transformation matrices, robot pose error-compensation model equations, and the Lagrange dynamics formulation for N-degree-of-freedom, open-chain robotic manipulators. The goal of such a package is to aid faculty and students in robotics courses by removing the burdensome tasks of mathematical manipulation. The software package has been successfully tested for accuracy using commercially available robots.
[The concept of risk and its estimation].
Zocchetti, C; Della Foglia, M; Colombi, A
1996-01-01
The concept of risk, in relation to human health, is a topic of primary interest for occupational health professionals. New legislation recently established in Italy (626/94), following European Community directives in the field of preventive medicine, called attention to this topic, and in particular to risk assessment and evaluation. Motivated by this context and by the impression that the concept of risk is frequently misunderstood, the present paper has two aims: to identify the different meanings of the term "risk" in the new Italian legislation and critically discuss some commonly used definitions; and to propose a general definition, with a mathematical expression for quantitative risk estimation. The term risk (and risk estimation, assessment, or evaluation) has mainly referred to three different contexts: hazard identification, exposure assessment, and the occurrence of adverse health effects. Unfortunately, there are passages in the legislation in which it is difficult to identify the intended meaning of the term. This can lead to equivocal interpretations and erroneous applications of the law, because hazard evaluation, exposure assessment, and adverse health effects identification are distinct topics that require integrated but separate approaches to risk management. As far as a quantitative definition of risk is concerned, we suggest an algorithm that connects the three basic risk elements (hazard, exposure, adverse health effects) by means of their probabilities of occurrence: the probability of being exposed (to a definite dose) given that a specific hazard is present, Pr(e|p), and the probability of occurrence of an adverse health effect as a consequence of that exposure, Pr(d|e). 
Using these quantitative components, risk can be defined as a sequence of measurable events that starts with hazard identification and terminates with disease occurrence; therefore, the following formal definition of risk is proposed: the probability of occurrence, in a given period of time, of an adverse health effect as a consequence of the existence of a hazard. In formula: R(d|p) = Pr(e|p) × Pr(d|e). While Pr(e|p) (exposure given hazard) must be evaluated in the situation under study, two alternatives exist for estimating the occurrence of adverse health effects, Pr(d|e): a "direct" estimation of the damage through formal epidemiologic studies conducted in the situation under observation; and an "indirect" estimation using information taken from the scientific literature (epidemiologic evaluations, dose-response relationships, extrapolations, ...). Both are presented along with their respective advantages, disadvantages, and uncertainties. The usefulness of the proposed algorithm is discussed with respect to common applications of risk assessment in occupational medicine; the relevance of time for risk estimation (duration of observation, duration of exposure, and latency of effect) is briefly explained; and it is highlighted how the proposed algorithm accounts (in terms of prevention and public health) for both the etiologic relevance of the exposure and the consequences of exposure removal. 
As a last comment, it is suggested that the widespread application of good work practices (technical, behavioral, organizational, ...), or the exhaustive use of checklists, can be relevant for improving the efficacy of prevention, but does not constitute a quantitative procedure of risk assessment, which, in any circumstance, must be considered the elective approach to the prevention of adverse health effects.
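The proposed multiplicative estimate is straightforward to compute. A minimal sketch with invented numbers, not taken from the paper:

```python
def risk(p_exposure_given_hazard, p_damage_given_exposure):
    """R(d|p) = Pr(e|p) * Pr(d|e): probability of the adverse health effect
    given that the hazard is present."""
    return p_exposure_given_hazard * p_damage_given_exposure

# Illustrative (assumed) numbers: a hazard is present in a plant, there is a
# 30% chance a worker receives a relevant dose, and a 2% chance that dose
# leads to disease over the period considered.
print(risk(0.30, 0.02))  # ≈ 0.006
```

The decomposition makes the two prevention levers explicit: reducing Pr(e|p) (exposure control) and reducing Pr(d|e) (dose-response mitigation) both scale the overall risk linearly.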
A New Program Structuring Mechanism Based on Layered Graphs.
1984-12-01
which is a single-page diagram. Diagrams are constructed from some 40 symbols, chiefly A-boxes, arrows and annotations. A single model specifies a...are identified and used in describing it. The symbol "G" derives from the original use of the term "group" for "object slice". Since "G" is already an overloaded mathematical symbol, retaining "G" seems as good as any alternative. The names object slices and views reflect the interpretation placed
Approaches to the study of intelligence
NASA Technical Reports Server (NTRS)
Norman, Donald A.
1991-01-01
A survey and an evaluation are conducted for the Rosenbloom et al. (1991) 'SOAR' model of intelligence, both as found in humans and in prospective AI systems, which views it as a representational system for goal-oriented symbolic activity based on a physical symbol system. Attention is given to SOAR's implications for semantic and episodic memory, symbol processing, and search within a uniform problem space; also noted are the relationships of SOAR to competing AI schemes, and its potential usefulness as a theoretical tool for cognitive psychology.
Szardenings, Carsten; Kuhn, Jörg-Tobias; Ranger, Jochen; Holling, Heinz
2017-01-01
The respective roles of the approximate number system (ANS) and an access deficit (AD) in developmental dyscalculia (DD) are not well-known. Most studies rely on response times (RTs) or accuracy (error rates) separately. We analyzed the results of two samples of elementary school children in symbolic magnitude comparison (MC) and non-symbolic MC using a diffusion model. This approach uses the joint distribution of both RTs and accuracy in order to synthesize measures closer to ability and response caution or response conservatism. The latter can be understood in the context of the speed-accuracy tradeoff: It expresses how much a subject trades in speed for improved accuracy. We found significant effects of DD on both ability (negative) and response caution (positive) in MC tasks and a negative interaction of DD with symbolic task material on ability. These results support that DD subjects suffer from both an impaired ANS and an AD and in particular support that slower RTs of children with DD are indeed related to impaired processing of numerical information. An interaction effect of symbolic task material and DD (low mathematical ability) on response caution could not be refuted. However, in a sample more representative of the general population we found a negative association of mathematical ability and response caution in symbolic but not in non-symbolic task material. The observed differences in response behavior highlight the importance of accounting for response caution in the analysis of MC tasks. The results as a whole present a good example of the benefits of a diffusion model analysis.
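A diffusion-model analysis of this kind can be approximated with the closed-form EZ-diffusion equations (Wagenmakers et al., 2007), which recover drift rate (ability), boundary separation (response caution), and non-decision time from accuracy, RT variance, and mean RT. The sketch below uses hypothetical cell means and is not the authors' (likely more elaborate) fitting procedure:

```python
import math

def ez_diffusion(pc, vrt, mrt, s=0.1):
    """EZ-diffusion: recover drift rate v (ability), boundary separation a
    (response caution), and non-decision time Ter from proportion correct pc,
    RT variance vrt, and mean RT mrt (correct responses, seconds)."""
    L = math.log(pc / (1 - pc))                      # logit of accuracy
    x = L * (L * pc**2 - L * pc + pc - 0.5) / vrt
    v = math.copysign(s * x**0.25, pc - 0.5)         # drift rate
    a = s**2 * L / v                                 # boundary separation
    y = -v * a / s**2
    mdt = (a / (2 * v)) * (1 - math.exp(y)) / (1 + math.exp(y))  # mean decision time
    return v, a, mrt - mdt

# Hypothetical cell means for a symbolic magnitude-comparison condition:
v, a, ter = ez_diffusion(pc=0.92, vrt=0.09, mrt=0.72)
print(round(v, 3), round(a, 3), round(ter, 3))
```

Comparing `v` and `a` across groups (DD vs. controls) and task materials (symbolic vs. non-symbolic) separates slower responding due to lower ability from slower responding due to higher response caution, which is the distinction the study turns on.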
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.
1986-01-01
The process of performing an automated stability analysis for an elastic-bladed helicopter rotor is discussed. A symbolic manipulation program, written in FORTRAN, is used to aid in the derivation of the governing equations of motion for the rotor. The blades undergo coupled bending and torsional deformations. Two-dimensional quasi-steady aerodynamics below stall are used. Although reversed flow effects are neglected, unsteady effects, modeled as dynamic inflow, are included. Using a Lagrangian approach, the governing equations are derived in generalized coordinates with the symbolic program. The program generates the steady and perturbed equations and writes them into subroutines to be called by numerical routines. The symbolic program can operate on both expressions and matrices. For the case of hovering flight, the blade and dynamic inflow equations are converted to equations in a multiblade coordinate system by rearranging the coefficients of the equations. For the case of forward flight, the multiblade equations are obtained through the symbolic program. The final multiblade equations can accommodate any number of elastic blade modes. The computer implementation of this procedure consists of three stages: (1) the symbolic derivation of the equations; (2) the coding of the equations into subroutines; and (3) the numerical study after identifying mass, damping, and stiffness coefficients. Damping results are presented in hover and in forward flight, with and without dynamic inflow effects, for various rotor blade models, including rigid blade lag-flap, elastic flap-lag, flap-lag-torsion, and quasi-static torsion. Results for dynamic inflow effects, obtained from a lift deficiency function for a quasi-static inflow model in hover, are also presented.
Toward improved design of check dam systems: A case study in the Loess Plateau, China
NASA Astrophysics Data System (ADS)
Pal, Debasish; Galelli, Stefano; Tang, Honglei; Ran, Qihua
2018-04-01
Check dams are one of the most common strategies for controlling sediment transport in erosion-prone areas, along with soil and water conservation measures. However, existing mathematical models that simulate sediment production and delivery are often unable to simulate how the storage capacity of check dams varies with time. To explicitly account for this process, and to support the design of check dam systems, we developed a modelling framework consisting of two components, namely (1) the spatially distributed Soil Erosion and Sediment Delivery Model (WaTEM/SEDEM), and (2) a network-based model of check dam storage dynamics. The two models are run sequentially, with the second model receiving the initial sediment input to check dams from WaTEM/SEDEM. The framework is first applied to Shejiagou catchment, a 4.26 km2 area located in the Loess Plateau, China, where we study the effect of the existing check dam system on sediment dynamics. Results show that the deployment of check dams significantly altered the sediment delivery ratio of the catchment. Furthermore, the network-based model reveals a large variability in the life expectancy of check dams and abrupt changes in their filling rates. The application of the framework to six alternative check dam deployment scenarios is then used to illustrate its usefulness for planning purposes, and to derive some insights on the effect of key decision variables, such as the number, size, and siting of check dams. Simulation results suggest that better performance, in terms of life expectancy and sediment delivery ratio, could have been achieved with an alternative deployment strategy.
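The network-based storage component can be caricatured with a simple downstream routing loop. A sketch under strong assumptions (a single chain of dams, constant annual sediment input, complete trapping until capacity is reached), not the WaTEM/SEDEM-coupled model itself:

```python
def route_sediment(capacities, annual_input, years):
    """Route sediment down a chain of check dams: each dam traps incoming load
    until its remaining capacity is exhausted, then passes the excess on.
    Returns per-dam fill levels and total sediment exported from the catchment."""
    fill = [0.0] * len(capacities)
    exported = 0.0
    for _ in range(years):
        load = annual_input
        for i, cap in enumerate(capacities):
            trapped = min(load, cap - fill[i])
            fill[i] += trapped
            load -= trapped
        exported += load          # whatever passes the last dam leaves the catchment
    return fill, exported

# Assumed numbers: three dams (10^3 m^3 capacity) and a constant annual yield.
fill, exported = route_sediment([50.0, 120.0, 80.0], annual_input=30.0, years=10)
print(fill, exported)
```

Even this toy version reproduces the qualitative behaviours reported: dams fill sequentially, a dam's filling rate jumps abruptly when the dam upstream of it saturates, and catchment export stays at zero until the system's total capacity is exhausted.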
Students' different understandings of class diagrams
NASA Astrophysics Data System (ADS)
Boustedt, Jonas
2012-03-01
The software industry needs well-trained software designers, and one important aspect of software design is the ability to model software designs visually and understand what visual models represent. However, previous research indicates that software design is a difficult task for many students. This article reports empirical findings from a phenomenographic investigation of how students understand class diagrams, Unified Modeling Language (UML) symbols, and relations to object-oriented (OO) concepts. The informants were 20 Computer Science students from four different universities in Sweden. The results show qualitatively different ways of understanding and describing UML class diagrams and the "diamond symbols" representing aggregation and composition. The purpose of class diagrams was understood in varied ways, from describing them as documentation to a more advanced view relating them to communication. The descriptions of class diagrams varied from seeing them as a specification of classes to a more advanced view in which they were described as showing hierarchical structures of classes and relations. The diamond symbols were seen as "relations"; a more advanced understanding distinguished the white and the black diamonds as different symbols for aggregation and composition. As a consequence of the results, it is recommended that UML be adopted in courses. It is briefly indicated how the phenomenographic results, in combination with variation theory, can be used by teachers to enhance students' possibilities of reaching an advanced understanding of phenomena related to UML class diagrams. Moreover, it is recommended that teachers put more effort into assessing skills in the proper usage of the basic symbols and models, and that students be provided with opportunities to practise collaborative design, e.g. using whiteboards.
Implementing Model-Check for Employee and Management Satisfaction
NASA Technical Reports Server (NTRS)
Jones, Corey; LaPha, Steven
2013-01-01
This presentation will discuss methods by which ModelCheck can be implemented to not only improve model quality but also satisfy both employees and management through different sets of quality checks. This approach allows a standard set of modeling practices to be upheld throughout a company, with minimal interaction required by the end user. The presenter will demonstrate how to create multiple ModelCheck standards, preventing users from evading the system, and how it can improve the quality of drawings and models.
Common world model for unmanned systems: Phase 2
NASA Astrophysics Data System (ADS)
Dean, Robert M. S.; Oh, Jean; Vinokurov, Jerry
2014-06-01
The Robotics Collaborative Technology Alliance (RCTA) seeks to provide adaptive robot capabilities which move beyond traditional metric algorithms to include cognitive capabilities. Key to this effort is the Common World Model, which moves beyond the state of the art by representing the world using semantic and symbolic as well as metric information. It joins these layers of information to define objects in the world. These objects may be reasoned upon jointly using traditional geometric algorithms, symbolic cognitive algorithms, and new computational nodes formed by the combination of these disciplines to address Symbol Grounding and Uncertainty. The Common World Model must understand how these objects relate to each other. It includes the concept of Self-Information about the robot. By encoding current capability, component status, task execution state, and their histories, we track information which enables the robot to reason about and adapt its performance using Meta-Cognition and Machine Learning principles. The world model also includes models of how entities in the environment behave, which enable prediction of future world states. To manage complexity, we have adopted a phased implementation approach. Phase 1, published in these proceedings in 2013 [1], presented the approach for linking metric with symbolic information and interfaces for traditional planners and cognitive reasoning. Here we discuss the design of "Phase 2" of this world model, which extends the Phase 1 design, API, and data structures, and reviews the use of the Common World Model as part of a semantic navigation use case.
Finding Feasible Abstract Counter-Examples
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Dwyer, Matthew B.; Visser, Willem; Clancy, Daniel (Technical Monitor)
2002-01-01
A strength of model checking is its ability to automate the detection of subtle system errors and produce traces that exhibit those errors. Given the high computational cost of model checking most researchers advocate the use of aggressive property-preserving abstractions. Unfortunately, the more aggressively a system is abstracted the more infeasible behavior it will have. Thus, while abstraction enables efficient model checking it also threatens the usefulness of model checking as a defect detection tool, since it may be difficult to determine whether a counter-example is feasible and hence worth developer time to analyze. We have explored several strategies for addressing this problem by extending an explicit-state model checker, Java PathFinder (JPF), to search for and analyze counter-examples in the presence of abstractions. We demonstrate that these techniques effectively preserve the defect detection ability of model checking in the presence of aggressive abstraction by applying them to check properties of several abstracted multi-threaded Java programs. These new capabilities are not specific to JPF and can be easily adapted to other model checking frameworks; we describe how this was done for the Bandera toolset.
Code of Federal Regulations, 2014 CFR
2014-01-01
... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... in different states or check processing regions)]. If you make the deposit in person to one of our...] Substitute Checks and Your Rights What Is a Substitute Check? To make check processing faster, federal law...
Modelling protein functional domains in signal transduction using Maude
NASA Technical Reports Server (NTRS)
Sriram, M. G.
2003-01-01
Modelling of protein-protein interactions in signal transduction is receiving increased attention in computational biology. This paper describes recent research in the application of Maude, a symbolic language founded on rewriting logic, to the modelling of functional domains within signalling proteins. Protein functional domains (PFDs) are a critical focus of modern signal transduction research. In general, Maude models can simulate biological signalling networks and produce specific testable hypotheses at various levels of abstraction. Developing symbolic models of signalling proteins containing functional domains is important because of the potential to generate analyses of complex signalling networks based on structure-function relationships.
Symbolic interactionism: a framework for the care of parents of preterm infants.
Edwards, L D; Saunders, R B
1990-04-01
Because of stressors surrounding preterm birth, parents can be expected to have difficulty in early interactions with their preterm infants. Care givers who work with preterm infants and their parents can positively affect the early parental experiences of these mothers and fathers. If care givers are consciously guided by a conceptual model, therapeutic care for distressed parents is more likely to be provided. A logical framework, such as symbolic interactionism, helps care givers to proceed systematically in assessing parental behaviors, in intervening appropriately, and in evaluating both the process and outcome of the care. Selected aspects of the symbolic interaction model are described in this article and applied to the care of parents of preterm infants.
Knowledge Representation and Ontologies
NASA Astrophysics Data System (ADS)
Grimm, Stephan
Knowledge representation and reasoning aims at designing computer systems that reason about a machine-interpretable representation of the world. Knowledge-based systems have a computational model of some domain of interest in which symbols serve as surrogates for real-world domain artefacts, such as physical objects, events, relationships, etc. [1]. The domain of interest can cover any part of the real world or any hypothetical system about which one desires to represent knowledge for computational purposes. A knowledge-based system maintains a knowledge base, which stores the symbols of the computational model in the form of statements about the domain, and it performs reasoning by manipulating these symbols. Applications can base their decisions on answers to domain-relevant questions posed to a knowledge base.
The Mediating Relation between Symbolic and Nonsymbolic Foundations of Math Competence
Price, Gavin R.; Fuchs, Lynn S.
2016-01-01
This study investigated the relation between symbolic and nonsymbolic magnitude processing abilities with 2 standardized measures of math competence (WRAT Arithmetic and KeyMath Numeration) in 150 3rd-grade children (mean age 9.01 years). Participants compared sets of dots and pairs of Arabic digits with numerosities 1–9 for relative numerical magnitude. In line with previous studies, performance on both symbolic and nonsymbolic magnitude processing was related to math ability. Performance metrics combining reaction time and accuracy, as well as Weber fractions, were entered into mediation models with standardized math test scores. Results showed that symbolic magnitude processing ability fully mediates the relation between nonsymbolic magnitude processing and math ability, regardless of the performance metric or standardized test. PMID:26859564
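The mediation logic described in the abstract can be sketched in a few lines: a total effect of X (nonsymbolic skill) on Y (math score) shrinks toward zero once the mediator M (symbolic skill) is controlled for. This is a hedged Baron–Kenny-style illustration on synthetic data, not the authors' analysis.

```python
import random

def ols(X, y):
    """Least-squares coefficients via normal equations (tiny Gaussian elimination)."""
    k, n = len(X[0]), len(X)
    A = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)] for a in range(k)]
    v = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    for col in range(k):                      # forward elimination
        for row in range(col + 1, k):
            f = A[row][col] / A[col][col]
            for j in range(col, k):
                A[row][j] -= f * A[col][j]
            v[row] -= f * v[col]
    beta = [0.0] * k
    for row in reversed(range(k)):            # back substitution
        beta[row] = (v[row] - sum(A[row][j] * beta[j] for j in range(row + 1, k))) / A[row][row]
    return beta

random.seed(0)
n = 500
x = [random.gauss(0, 1) for _ in range(n)]          # nonsymbolic magnitude skill
m = [0.8 * xi + random.gauss(0, 1) for xi in x]     # symbolic skill (path a)
y = [0.7 * mi + random.gauss(0, 1) for mi in m]     # math score (path b, no direct path)

c_total = ols([[1.0, xi] for xi in x], y)[1]                  # X -> Y alone
c_prime = ols([[1.0, xi, mi] for xi, mi in zip(x, m)], y)[1]  # X -> Y controlling for M
print(round(c_total, 2), round(c_prime, 2))  # direct effect shrinks toward 0
```

Full mediation corresponds to a sizable total effect with a near-zero direct effect once the mediator enters the model.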
Modelling dynamics with context-free grammars
NASA Astrophysics Data System (ADS)
García-Huerta, Juan-M.; Jiménez-Hernández, Hugo; Herrera-Navarro, Ana-M.; Hernández-Díaz, Teresa; Terol-Villalobos, Ivan
2014-03-01
This article presents a strategy to model the dynamics performed by vehicles on a freeway. The proposal consists of encoding the movement as a set of finite states. A watershed-based segmentation is used to localize regions with a high probability of motion. Each state represents a proportion of a camera projection in a two-dimensional space, and each state is associated with a symbol, so that any combination of symbols is expressed as a language. Starting from a sequence of symbols, a context-free grammar is inferred through a linear algorithm. This grammar represents a hierarchical view of common sequences observed in the scene. The most probable grammar rules express rules associated with normal movement behavior; less probable rules quantify uncommon behaviors that may need more attention. Finally, any sequence of symbols that does not match the grammar rules may express uncommon (abnormal) behavior. The grammar inference is built from several sequences of images taken from a freeway. The testing process uses the sequence of symbols emitted by the scenario, matching the grammar rules with common freeway behaviors. Detecting abnormal/normal behaviors is thus managed as the task of verifying whether a word generated by the scenario is recognized by the grammar.
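The recognition step can be illustrated with a standard CYK parser: a symbol sequence is "normal" iff the grammar recognizes it. The grammar and the motion symbols below (enter/cruise/exit) are invented for illustration; the paper infers its rules from freeway video.

```python
def cyk(word, rules, start="S"):
    """CYK recognizer for a grammar in Chomsky normal form.
    rules: (lhs, rhs) pairs; rhs is a 1-tuple terminal or a 2-tuple of nonterminals."""
    n = len(word)
    T = [[set() for _ in range(n)] for _ in range(n)]
    for i, sym in enumerate(word):                       # terminal rules
        T[i][i] = {lhs for lhs, rhs in rules if rhs == (sym,)}
    for span in range(2, n + 1):                         # binary rules, bottom-up
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):
                for lhs, rhs in rules:
                    if len(rhs) == 2 and rhs[0] in T[i][k] and rhs[1] in T[k + 1][j]:
                        T[i][j].add(lhs)
    return start in T[0][n - 1]

# Toy grammar: a vehicle enters (e), cruises (c) one or more steps, exits (x).
rules = [("S", ("E", "R")), ("R", ("C", "X")), ("R", ("C", "R")),
         ("E", ("e",)), ("C", ("c",)), ("X", ("x",))]
print(cyk(list("eccx"), rules))  # normal trajectory
print(cyk(list("exce"), rules))  # not derivable -> flagged as abnormal
```

Any observed word the grammar rejects is a candidate abnormal behavior, mirroring the paper's detection criterion.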
2004-05-01
grounded in structuration theory (Giddens, 1984), social information processing theory (Salancik and Pfeffer, 1978) and symbolic interactionism (Manis...and B. N. Meltzer. Symbolic interaction: A reader in social psychology. Boston: Allyn & Bacon. 1978 Mcpherson, J. M. and L. Smith-Lovin
Toward synthesizing executable models in biology.
Fisher, Jasmin; Piterman, Nir; Bodik, Rastislav
2014-01-01
Over the last decade, executable models of biological behaviors have repeatedly provided new scientific discoveries, uncovered novel insights, and directed new experimental avenues. These models are computer programs whose execution mechanistically simulates aspects of the cell's behaviors. If the observed behavior of the program agrees with the observed biological behavior, then the program explains the phenomena. This approach has proven beneficial for gaining new biological insights and directing new experimental avenues. One advantage of this approach is that techniques for analysis of computer programs can be applied to the analysis of executable models. For example, one can confirm that a model agrees with experiments for all possible executions of the model (corresponding to all environmental conditions), even if there are a huge number of executions. Various formal methods have been adapted for this context, for example, model checking or symbolic analysis of state spaces. To avoid manual construction of executable models, one can apply synthesis, a method to produce programs automatically from high-level specifications. In the context of biological modeling, synthesis would correspond to extracting executable models from experimental data. We survey recent results about the usage of the techniques underlying synthesis of computer programs for the inference of biological models from experimental data. We describe synthesis of biological models from curated mutation experiment data, inferring network connectivity models from phosphoproteomic data, and synthesis of Boolean networks from gene expression data. While much work has been done on automated analysis of similar datasets using machine learning and artificial intelligence, using synthesis techniques provides new opportunities such as efficient computation of disambiguating experiments, as well as the ability to produce different kinds of models automatically from biological data.
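The synthesis idea can be shown in miniature: enumerate candidate Boolean update rules and keep those consistent with observed state transitions. The gene names, candidate set, and observations below are invented; the surveyed methods use far more scalable solvers than brute force.

```python
# Observed transitions: inputs (a, b) -> next value of gene c (e.g. expression data).
observations = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

# Candidate Boolean update functions for gene c.
CANDIDATES = {
    "a AND b": lambda a, b: a & b,
    "a OR b":  lambda a, b: a | b,
    "NOT a":   lambda a, b: 1 - a,
    "a XOR b": lambda a, b: a ^ b,
}

# Synthesis-as-search: keep every rule that explains all experiments.
consistent = [name for name, f in CANDIDATES.items()
              if all(f(*inp) == out for inp, out in observations)]
print(consistent)
```

When several rules survive, computing a disambiguating experiment means finding an input on which the surviving rules disagree, which is one of the advantages the survey highlights.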
Developing pretend play in children with autism: a case study.
Sherratt, Dave
2002-06-01
A classroom-based intervention study aimed to explore whether it was possible to teach children with autism and additional learning difficulties to use symbolic pretend play. Five children with autism were involved in a 4-month intervention that used structure, affect and repetition. The intervention progressively faded out the structuring over three phases. All the children were able to use some symbolic acts within play. The study suggests that some of the symbolic play was not the result of replicating previously modelled examples but was spontaneous and novel.
Prediction of dynamical systems by symbolic regression
NASA Astrophysics Data System (ADS)
Quade, Markus; Abel, Markus; Shafi, Kamran; Niven, Robert K.; Noack, Bernd R.
2016-07-01
We study the modeling and prediction of dynamical systems based on conventional models derived from measurements. Such algorithms are highly desirable in situations where the underlying dynamics are hard to model from physical principles or simplified models need to be found. We focus on symbolic regression methods as a part of machine learning. These algorithms are capable of learning an analytically tractable model from data, a highly valuable property. Symbolic regression methods can be considered as generalized regression methods. We investigate two particular algorithms: the so-called fast function extraction, which is a generalized linear regression algorithm, and genetic programming, which is a very general method. Both are able to combine functions in a certain way such that a good model for the prediction of the temporal evolution of a dynamical system can be identified. We illustrate the algorithms by finding a prediction for the evolution of a harmonic oscillator based on measurements, by detecting an arriving front in an excitable system, and, as a real-world application, by predicting solar power production based on energy production observations at a given site together with the weather forecast.
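A hedged illustration of the "fast function extraction" flavor of symbolic regression: fit a linear model over a library of candidate basis functions and keep the one that explains the data. The data here are synthetic samples of a harmonic oscillator, not the paper's measurements, and real FFX uses much richer libraries with sparsity control.

```python
import math

xs = [0.1 * i for i in range(100)]
ys = [2.0 * math.cos(x) for x in xs]        # "measured" oscillator position

# Candidate basis functions the regression may combine.
library = {"x": lambda x: x, "x^2": lambda x: x * x,
           "sin(x)": math.sin, "cos(x)": math.cos}

def fit(fname, f):
    """Least-squares coefficient for y ~ c*f(x), plus the residual error."""
    fx = [f(x) for x in xs]
    c = sum(a * b for a, b in zip(fx, ys)) / sum(a * a for a in fx)
    err = sum((y - c * fxi) ** 2 for y, fxi in zip(ys, fx))
    return err, fname, c

err, best, coef = min(fit(n, f) for n, f in library.items())
print(best, round(coef, 3))  # recovers an analytically tractable model
```

The selected basis function and coefficient form a human-readable model, which is the property that distinguishes symbolic regression from black-box regression.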
Application of conditional moment tests to model checking for generalized linear models.
Pan, Wei
2002-06-01
Generalized linear models (GLMs) are increasingly being used in daily data analysis. However, model checking for GLMs with correlated discrete response data remains difficult. In this paper, through a case study on marginal logistic regression using a real data set, we illustrate the flexibility and effectiveness of using conditional moment tests (CMTs), along with other graphical methods, to do model checking for generalized estimating equation (GEE) analyses. Although CMTs provide an array of powerful diagnostic tests for model checking, they were originally proposed in the econometrics literature and, to our knowledge, have never been applied to GEE analyses. CMTs cover many existing tests, including the (generalized) score test for an omitted covariate, as special cases. In summary, we believe that CMTs provide a class of useful model checking tools.
Take the Reins on Model Quality with ModelCHECK and Gatekeeper
NASA Technical Reports Server (NTRS)
Jones, Corey
2012-01-01
Model quality and consistency have been an issue for us due to the diverse experience level and imaginative modeling techniques of our users. Fortunately, setting up ModelCHECK and Gatekeeper to enforce our best practices has helped greatly, but it wasn't easy. There were many challenges associated with setting up ModelCHECK and Gatekeeper including: limited documentation, restrictions within ModelCHECK, and resistance from end users. However, we consider ours a success story. In this presentation we will describe how we overcame these obstacles and present some of the details of how we configured them to work for us.
Nonlinear, nonbinary cyclic group codes
NASA Technical Reports Server (NTRS)
Solomon, G.
1992-01-01
New cyclic group codes of length 2(exp m) - 1 over (m - j)-bit symbols are introduced. These codes can be systematically encoded and decoded algebraically. The code rates are very close to Reed-Solomon (RS) codes and are much better than Bose-Chaudhuri-Hocquenghem (BCH) codes (a former alternative). The binary (m - j)-tuples are identified with a subgroup of the binary m-tuples which represents the field GF(2 exp m). Encoding is systematic and involves a two-stage procedure consisting of the usual linear feedback register (using the division or check polynomial) and a small table lookup. For low rates, a second shift-register encoding operation may be invoked. Decoding uses the RS error-correcting procedures for the m-tuple codes for m = 4, 5, and 6.
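The first encoding stage described above, a linear feedback register performing polynomial division, can be shown in a simplified binary setting. This is a hedged sketch using the classic (7,4) cyclic code with generator x^3 + x + 1, not the nonbinary group codes the paper constructs.

```python
def cyclic_encode(msg_bits, gen=(1, 0, 1, 1), r=3):
    """Systematic cyclic encoding over GF(2): append r parity bits equal to
    the remainder of msg(x) * x^r divided by the generator polynomial.
    gen lists the generator's coefficients from x^3 down to x^0."""
    reg = list(msg_bits) + [0] * r           # message shifted up by x^r
    for i in range(len(msg_bits)):           # long division, one LFSR step per bit
        if reg[i]:
            for j, g in enumerate(gen):
                reg[i + j] ^= g              # XOR in the generator
    return list(msg_bits) + reg[-r:]         # systematic: message bits, then parity

print(cyclic_encode([1, 0, 0, 0]))
```

Because the encoding is systematic, the message appears verbatim in the codeword, matching the two-stage structure (division register plus table lookup) the abstract describes for the nonbinary case.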
[American participation in the creation of a nurse model in Brazilian society in the 1920's].
Santos, Tânia Cristina Franco; Barreira, Ieda de Alencar; da Fonte, Aline Silva; de Oliveira, Alexandre Barbosa
2011-08-01
The objectives of this historical-social study are to describe the circumstances that determined the participation of North American nurses in the formation of the Brazilian nurse; to analyse the process of implementing institutional rituals as a strategy of symbolic struggle to confer visibility on the nursing profession; and to discuss the symbolic effects of institutional rituals in consecrating a nurse model for Brazilian society at the time. The primary sources consist of written and photographic documents pertaining to the studied theme. Through a reading of the documentary corpus, an analysis was made of the symbols that distinguished and established hierarchies of action, as well as of the strategies undertaken by the North American nurses to implement a new model of nursing in Brazilian society, consistent with the model of the North American schools of nursing. Institutional rituals, conducted or witnessed by prestigious figures in the history of Brazil and of nursing, were fundamental to the construction of professional identity.
Fission and Properties of Neutron-Rich Nuclei
NASA Astrophysics Data System (ADS)
Hamilton, Joseph H.; Ramayya, A. V.; Carter, H. K.
2008-08-01
Opening session. Nuclear processes in stellar explosions / M. Wiescher. In-beam [symbol]-ray spectroscopy of neutron-rich nuclei at NSCL / A. Gade.
Nuclear structure I. Shell-model structure of neutron-rich nuclei beyond [symbol]Sn / A. Covello ... [et al.]. Shell structure and evolution of collectivity in nuclei above the [symbol]Sn core / S. Sarkar and M. S. Sarkar. Heavy-ion fusion using density-constrained TDHF / A. S. Umar and V. E. Oberacker. Towards an extended microscopic theory for upper-fp shell nuclei / K. P. Drumev. Properties of the Zr and Pb isotopes near the drip-line / V. N. Tarasov ... [et al.]. Identification of high spin states in [symbol]Cs nuclei and shell model calculations / K. Li ... [et al.]. Recent measurements of spherical and deformed isomers using the Lohengrin fission-fragment spectrometer / G. S. Simpson ... [et al.].
Nuclear structure II. Nuclear structure investigation with rare isotope spectroscopic investigations at GSI / P. Boutachkov. Exploring the evolution of the shell structures by means of deep inelastic reactions / G. de Angelis. Probing shell closures in neutron-rich nuclei / R. Krücken for the S277 and REX-ISOLDE-MINIBALL collaborations. Structure of Fe isotopes at the limits of the pf-shell / N. Hoteling ... [et al.]. Spectroscopy of K isomers in shell-stabilized trans-fermium nuclei / S. K. Tandel ... [et al.].
Radioactive ion beam facilities. SPIRAL2 at GANIL: a world leading ISOL facility for the next decade / S. Gales. New physics at the International Facility for Antiproton and Ion Research (FAIR) next to GSI / I. Augustin ... [et al.]. Radioactive beams from a high powered ISOL system / A. C. Shotter. RIKEN RI beam factory / T. Motobayashi. NSCL - ongoing activities and future perspectives / C. K. Gelbke. Rare isotope beams at Argonne / W. F. Henning. HRIBF: scientific highlights and future prospects / J. R. Beene. Radioactive ion beam research done in Dubna / G. M. Ter-Akopian ... [et al.].
Fission I. Fission-fragment spectroscopy with STEFF / A. G. Smith ... [et al.]. Gamma ray multiplicity of [symbol]Cf spontaneous fission using LiBerACE / D. L. Bleuel ... [et al.]. Excitation energy dependence of fragment mass and total kinetic energy distributions in proton-induced fission of light actinides / I. Nishinaka ... [et al.]. A dynamical calculation of multi-modal nuclear fission / T. Wada and T. Asano. Structure of fission potential energy surfaces in ten-dimensional spaces / V. V. Pashkevich, Y. K. Pyatkov and A. V. Unzhakova. A possible enhancement of nuclear fission in scattering with low energy charged particles / V. Gudkov. Dynamical multi-break processes in the [symbol]Sn + [symbol]Ni system at 35 MeV/nucleon / M. Papa and the ISOSPIN-REVERSE collaboration.
New experimental techniques. MTOF - a high resolution isobar separator for studies of exotic decays / A. Piechaczek ... [et al.]. Development of ORRUBA: a silicon array for the measurement of transfer reactions in inverse kinematics / S. D. Pain ... [et al.]. Indian national gamma array: present & future / R. K. Bhowmik. Absolute intensities of [symbol] rays emitted in the decay of [symbol]U / H. C. Griffin.
Superheavy elements. Theory and experiments / M. G. Itkis ... [et al.]. Study of superheavy elements at SHIP / S. Hofmann. Heaviest nuclei from [symbol]Ca-induced reactions / Yu. Ts. Oganessian. Superheavy nuclei and giant nuclear systems / W. Greiner and V. Zagrebaev. Fission approach to alpha-decay of superheavy nuclei / D. N. Poenaru and W. Greiner. Superheavy elements in the Magic Islands / C. Samanta. Relativistic mean field studies of superheavy nuclei / A. V. Afanasjev. Understanding the synthesis of the heaviest nuclei / W. Loveland.
Mass measurements and g-factors. G factor measurements in neutron-rich [symbol]Cf fission fragments, measured using the gammasphere array / R. Orlandi ... [et al.]. Technique for measuring angular correlations and g-factors in neutron rich nuclei produced by the spontaneous fission of [symbol]Cf / A. V. Daniel ... [et al.]. Magnetic moment measurements in a radioactive beam environment / N. Benczer-Koller and G. Kumbartzki. g-Factor measurements of picosecond states: opportunities and limitations of the recoil-in-vacuum method / N. J. Stone ... [et al.]. Precision mass measurements and trap-assisted spectroscopy of fission products from Ni to Pd / A. Jokinen.
Fission II. Fission research at IRMM / F.-J. Hambsch. Fission yield measurements at the IGISOL facility with JYFLTRAP / H. Penttilä ... [et al.]. Fission of radioactive beams and dissipation in nuclear matter / A. Heinz (for the CHARMS collaboration). Fission of [symbol]U at 80 MeV/u and search for new neutron-rich isotopes / C. M. Folden III ... [et al.]. Measurement of the average energy and multiplicity of prompt-fission neutrons and gamma rays from [symbol], [symbol], and [symbol] for incident neutron energies of 1 to 200 MeV / R. C. Haight ... [et al.]. Fission measurements with DANCE / M. Jandel ... [et al.]. Measured and calculated neutron-induced fission cross sections of [symbol]Pu / F. Tovesson and T. S. Hill. The fission barrier landscape / L. Phair and L. G. Moretto. Fast neutron-induced fission of some actinides and sub-actinides / A. B. Lautev ... [et al.].
Fission III/Nuclear structure III. Complex structure in even-odd staggering of fission fragment yields / M. Caamaño and F. Rejmund. The surrogate method: past, present and future / S. R. Lesher ... [et al.]. Effects of nuclear incompressibility on heavy-ion fusion / H. Esbensen and Ş. Mişicu. High spin states in [symbol]Pm / A. Dhal ... [et al.]. Structure of [symbol]Sm, spherical vibrator versus softly deformed rotor / J. B. Gupta.
Astrophysics. Measuring the astrophysical S-factor in plasmas / A. Bonasera ... [et al.]. Is there shell quenching or shape coexistence in Cd isotopes near N = 82? / J. K. Hwang, A. V. Ramayya and J. H. Hamilton. Spectroscopy of neutron-rich palladium and cadmium isotopes near A = 120 / M. A. Stoyer and W. B. Walters.
Nuclear structure IV. First observation of new neutron-rich magnesium, aluminum and silicon isotopes / A. Stolz ... [et al.]. Spectroscopy of [symbol]Na: evolution of shell structure with isospin / V. Tripathi ... [et al.]. Rearrangement of proton single particle orbitals in neutron-rich potassium isotopes - spectroscopy of [symbol]K / W. Królas ... [et al.]. Laser spectroscopy and the nature of the shape transition at N [symbol] 60 / B. Cheal ... [et al.]. Study of nuclei near stability as fission fragments following heavy-ion reactions / N. Fotiadis. [symbol]C and [symbol]N: lifetime measurements of their first-excited states / M. Wiedeking ... [et al.].
Nuclear astrophysics. Isomer spectroscopy near [symbol]Sn - first observation of excited states in [symbol]Cd / M. Pfitzner ... [et al.]. Nuclear masses and what they imply for the structures of neutron rich nuclei / A. Aprahamian and A. Teymurazyan. Multiple nucleosynthesis processes in the early universe / F. Montes. Single-neutron structure of neutron-rich nuclei near N = 50 and N = 82 / J. A. Cizewski ... [et al.]. [symbol]Cadmium: ugly duckling or young swan / W. B. Walters ... [et al.].
Nuclear structure V. Evidence for chiral doublet bands in [symbol]Ru / Y. X. Luo ... [et al.]. Unusual octupole shape deformation terms and K-mixing / J. O. Rasmussen ... [et al.]. Spin assignments, mixing ratios, and g-factors in neutron rich [symbol]Cf fission products / C. Goodin ... [et al.]. Level structures and double [symbol]-bands in [symbol]Mo, [symbol]Mo and [symbol]Ru / S. J. Zhu ... [et al.].
Nuclear theory. Microscopic dynamics of shape coexistence phenomena around [symbol]Se and [symbol]Kr / N. Hinohara ... [et al.]. Nuclear structure, double beta decay and test of physics beyond the standard model / A. Faessler. Collective modes in elastic nuclear matter / Ş. Mişicu and S. Bastrukov. From N = Z to neutron rich: magnetic moments of Cu isotopes at and above the [symbol]Ni and [symbol]Ni double shell closures - what next? / N. J. Stone, J. R. Stone and U. Köster.
Nuclear structure VI. Decay studies of nuclei near [symbol]Ni / R. Grzywacz. Weakening of the [symbol]Ni core for Z > 28, N > 50? / J. A. Winger ... [et al.]. Coulomb excitation of the odd-A [symbol]Cu isotopes with MINIBALL and REX-ISOLDE / I. Stefanescu ... [et al.]. Neutron single particle states and isomers in odd mass nickel isotopes near [symbol]Ni / M. M. Rajabali ... [et al.]. [symbol] and [symbol]-delayed neutron decay studies of [symbol]Ch at the HRIBF / S. V. Ilyushkin ... [et al.].
Posters. Properties of Fe, Ni and Zn isotope chains near the drip-line / V. N. Tarasov ... [et al.]. Probing nuclear structure of [symbol]Xe / J. B. Gupta. Shape coexistence in [symbol]Zr and large deformation in [symbol]Zr / J. K. Hwang ... [et al.]. Digital electronics and their application to beta decay spectroscopy / S. N. Liddick, S. Padgett and R. Grzywacz. Nuclear shape and structure in neutron-rich [symbol]Tc / Y. X. Luo ... [et al.]. Speeding up the r-process: investigation of first forbidden [symbol] decays in N > 50 isotopes near [symbol]Ni / S. Padgett ... [et al.]. Yields of fission products from various actinide targets / E. H. Sveiewski ... [et al.].
Computational intelligence models to predict porosity of tablets using minimum features
Khalid, Mohammad Hassan; Kazemi, Pezhman; Perez-Gandarillas, Lucia; Michrafy, Abderrahim; Szlęk, Jakub; Jachowicz, Renata; Mendyk, Aleksander
2017-01-01
The effects of different formulations and manufacturing process conditions on the physical properties of a solid dosage form are of importance to the pharmaceutical industry. It is vital to have in-depth understanding of the material properties and governing parameters of its processes in response to different formulations. Understanding the mentioned aspects will allow tighter control of the process, leading to implementation of quality-by-design (QbD) practices. Computational intelligence (CI) offers an opportunity to create empirical models that can be used to describe the system and predict future outcomes in silico. CI models can help explore the behavior of input parameters, unlocking deeper understanding of the system. This research endeavor presents CI models to predict the porosity of tablets created by roll-compacted binary mixtures, which were milled and compacted under systematically varying conditions. CI models were created using tree-based methods, artificial neural networks (ANNs), and symbolic regression trained on an experimental data set and screened using root-mean-square error (RMSE) scores. The experimental data were composed of proportion of microcrystalline cellulose (MCC) (in percentage), granule size fraction (in micrometers), and die compaction force (in kilonewtons) as inputs and porosity as an output. The resulting models show impressive generalization ability, with ANNs (normalized root-mean-square error [NRMSE] = 1%) and symbolic regression (NRMSE = 4%) as the best-performing methods, also exhibiting reliable predictive behavior when presented with a challenging external validation data set (best achieved symbolic regression: NRMSE = 3%). Symbolic regression demonstrates the transition from the black-box modeling paradigm to more transparent predictive models. Predictive performance and feature selection behavior of CI models hint at the most important variables within this factor space. PMID:28138223
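The screening metric used above is easy to make concrete. This helper computes NRMSE as RMSE normalized by the observed range, expressed as a percentage; the paper's exact normalization convention may differ, and the porosity values below are toy numbers.

```python
import math

def nrmse(observed, predicted):
    """Root-mean-square error divided by the observed range, as a percentage."""
    rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted))
                     / len(observed))
    return 100.0 * rmse / (max(observed) - min(observed))

obs  = [10.0, 20.0, 30.0, 40.0]   # toy porosity values (%)
pred = [11.0, 19.0, 31.0, 39.0]   # toy model predictions
print(round(nrmse(obs, pred), 2))
```

Candidate models (trees, ANNs, symbolic regression) would be ranked by this score on a held-out validation set.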
Model checking for linear temporal logic: An efficient implementation
NASA Technical Reports Server (NTRS)
Sherman, Rivi; Pnueli, Amir
1990-01-01
This report provides evidence to support the claim that model checking for linear temporal logic (LTL) is practically efficient. Two implementations of a linear temporal logic model checker are described. One is based on transforming the model checking problem into a satisfiability problem; the other checks an LTL formula for a finite model by computing the cross-product of the finite state transition graph of the program with a structure containing all possible models of the property. An experiment was done with a set of mutual exclusion algorithms, testing safety and liveness under fairness for these algorithms.
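The second approach can be shown in miniature: explore the finite state graph of a program and check a safety property, here reduced to reachability of a bad state (full LTL checking composes the program with an automaton for the property). The toy two-process lock below is invented, not the report's benchmark.

```python
def reachable(init, successors):
    """Explicit-state exhaustive exploration of a finite transition graph."""
    seen, frontier = {init}, [init]
    while frontier:
        s = frontier.pop()
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

# Toy lock: each process cycles noncritical (n) -> trying (t) -> critical (c),
# may enter only when it holds the turn, and passes the turn on exit.
def successors(state):
    p0, p1, turn = state
    out = []
    for i, p in enumerate((p0, p1)):
        if p == "n":
            new_p, new_turn = "t", turn
        elif p == "t" and turn == i:
            new_p, new_turn = "c", turn
        elif p == "c":
            new_p, new_turn = "n", 1 - i
        else:
            continue
        s = list(state)
        s[i], s[2] = new_p, new_turn
        out.append(tuple(s))
    return out

# Safety (mutual exclusion): no reachable state has both processes critical.
bad = [s for s in reachable(("n", "n", 0), successors)
       if s[0] == "c" and s[1] == "c"]
print("safety holds" if not bad else "counterexample: %s" % (bad[0],))
```

A violated property would surface as a concrete counterexample state, which is the practical payoff of model checking over testing.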
A Self-Stabilizing Hybrid Fault-Tolerant Synchronization Protocol
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2015-01-01
This paper presents a strategy for solving the Byzantine general problem for self-stabilizing a fully connected network from an arbitrary state and in the presence of any number of faults with various severities, including any number of arbitrary (Byzantine) faulty nodes. The strategy consists of two parts: first, converting Byzantine faults into symmetric faults, and second, using a proven symmetric-fault-tolerant algorithm to solve the general case of the problem. A protocol (algorithm) that tolerates symmetric faults is also presented, provided that there are more good nodes than faulty ones. The solution applies to realizable systems, while allowing for differences in the network elements, provided that the number of arbitrary faults is not more than a third of the network size. The only constraint on the behavior of a node is that the interactions with other nodes are restricted to defined links and interfaces. The solution does not rely on assumptions about the initial state of the system, and no central clock nor centrally generated signal, pulse, or message is used. Nodes are anonymous, i.e., they do not have unique identities. A mechanical verification of a proposed protocol is also presented. A bounded model of the protocol is verified using the Symbolic Model Verifier (SMV). The model checking effort is focused on verifying correctness of the bounded model of the protocol as well as confirming claims of determinism and linear convergence with respect to the self-stabilization period.
The dilemma of the symbols: analogies between philosophy, biology and artificial life.
Spadaro, Salvatore
2013-01-01
This article analyzes some analogies going from Artificial Life questions about the symbol-matter connection to Artificial Intelligence questions about symbol-grounding. It focuses on the notion of the interpretability of syntax and how the symbols are integrated in a unity ("binding problem"). Utilizing the DNA code as a model, this paper discusses how syntactic features could be defined as high-grade characteristics of the non syntactic relations in a material-dynamic structure, by using an emergentist approach. This topic furnishes the ground for a confutation of J. Searle's statement that syntax is observer-relative, as he wrote in his book "Mind: A Brief Introduction". Moreover the evolving discussion also modifies the classic symbol-processing doctrine in the mind which Searle attacks as a strong AL argument, that life could be implemented in a computational mode. Lastly, this paper furnishes a new way of support for the autonomous systems thesis in Artificial Life and Artificial Intelligence, using, inter alia, the "adaptive resonance theory" (ART).
Zhao, Jiaduo; Gong, Weiguo; Tang, Yuzhen; Li, Weihong
2016-01-20
In this paper, we propose an effective human and nonhuman pyroelectric infrared (PIR) signal recognition method to reduce PIR detector false alarms. First, using the mathematical model of the PIR detector, we analyze the physical characteristics of the human and nonhuman PIR signals; second, based on the analysis results, we propose an empirical mode decomposition (EMD)-based symbolic dynamic analysis method for the recognition of human and nonhuman PIR signals. In the proposed method, first, we extract the detailed features of a PIR signal into five symbol sequences using an EMD-based symbolization method, then, we generate five feature descriptors for each PIR signal through constructing five probabilistic finite state automata with the symbol sequences. Finally, we use a weighted voting classification strategy to classify the PIR signals with their feature descriptors. Comparative experiments show that the proposed method can effectively classify the human and nonhuman PIR signals and reduce PIR detector's false alarms.
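The last two stages of the pipeline can be sketched generically: symbolize a 1-D signal by amplitude partitioning, then summarize it by its symbol-transition probabilities (the row-stochastic matrix of a probabilistic finite state automaton, used as a feature descriptor). The EMD step, the partition edges, and the sine stand-in signal are assumptions for illustration, not the paper's data.

```python
import math

def symbolize(signal, edges=(-0.5, 0.5)):
    """Map each sample to a symbol 0..len(edges) by amplitude partitioning."""
    return [sum(x > e for e in edges) for x in signal]

def transition_matrix(symbols, k=3):
    """Row-normalized symbol-transition counts: a PFSA-style feature descriptor."""
    counts = [[0] * k for _ in range(k)]
    for a, b in zip(symbols, symbols[1:]):
        counts[a][b] += 1
    probs = []
    for row in counts:
        s = sum(row)
        probs.append([c / s if s else 0.0 for c in row])
    return probs

sig = [math.sin(0.3 * i) for i in range(200)]   # stand-in for one decomposed mode
feat = transition_matrix(symbolize(sig))
print([round(p, 2) for p in feat[1]])           # descriptor row for symbol 1
```

A classifier (the paper uses weighted voting over five such descriptors, one per symbol sequence) then separates human from nonhuman signatures.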
Symbolic inversion of control relationships in model-based expert systems
NASA Technical Reports Server (NTRS)
Thomas, Stan
1988-01-01
Symbolic inversion is examined from several perspectives. First, a number of symbolic algebra and mathematical tool packages were studied in order to evaluate their capabilities and methods, specifically with respect to symbolic inversion. Second, the KATE system (without hardware interface) was ported to a Zenith Z-248 microcomputer running Golden Common Lisp. The interesting thing about the port is that it allows the user to have measurements vary and components fail in a non-deterministic manner, based upon random values from probability distributions. Third, INVERT was studied as currently implemented in KATE, its operation documented, some of its weaknesses identified, and corrections made to it. The corrections and enhancements are primarily in the way that logical conditions involving AND's and OR's and inequalities are processed. In addition, the capability to handle equalities was also added. Suggestions were also made regarding the handling of ranges in INVERT. Last, other approaches to the inversion process were studied and recommendations were made as to how future versions of KATE should perform symbolic inversion.
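A small, generic sketch of what symbolic inversion means here (not KATE's INVERT code): solve an equation tree for a target variable by peeling inverse operations off the side that contains it. Expressions are nested tuples, an assumption made for the example.

```python
def contains(e, v):
    """Does expression e mention variable v?"""
    return e == v or (isinstance(e, tuple) and any(contains(c, v) for c in e[1:]))

def invert(expr, target, rhs):
    """Solve expr == rhs for target. expr is ("op", a, b) or an atom."""
    if expr == target:
        return rhs
    op, a, b = expr
    if contains(a, target):
        # Move b to the other side: e.g. a + b = rhs  ->  a = rhs - b.
        new = {"+": ("-", rhs, b), "-": ("+", rhs, b),
               "*": ("/", rhs, b), "/": ("*", rhs, b)}[op]
        return invert(a, target, new)
    # Target is in b; mind non-commutative ops: a - b = rhs  ->  b = a - rhs.
    new = {"+": ("-", rhs, a), "-": ("-", a, rhs),
           "*": ("/", rhs, a), "/": ("/", a, rhs)}[op]
    return invert(b, target, new)

# Solve 2*x + 3 == y for x, yielding x = (y - 3) / 2.
eq = ("+", ("*", 2, "x"), 3)
print(invert(eq, "x", "y"))
```

INVERT additionally handles logical conditions (AND/OR) and inequalities, which this arithmetic-only sketch omits.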
Generalized memory associativity in a network model for the neuroses
NASA Astrophysics Data System (ADS)
Wedemann, Roseli S.; Donangelo, Raul; de Carvalho, Luís A. V.
2009-03-01
We review concepts introduced in earlier work, where a neural network mechanism describes some mental processes in neurotic pathology and psychoanalytic working-through, as associative memory functioning, according to the findings of Freud. We developed a complex network model, where modules corresponding to sensorial and symbolic memories interact, representing unconscious and conscious mental processes. The model illustrates Freud's idea that consciousness is related to symbolic and linguistic memory activity in the brain. We have introduced a generalization of the Boltzmann machine to model memory associativity. Model behavior is illustrated with simulations and some of its properties are analyzed with methods from statistical mechanics.
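Associative recall, the core mechanism the model builds on, can be illustrated with a deterministic Hopfield-style network (a simpler cousin of the generalized Boltzmann machine the paper uses): a stored pattern is retrieved from a corrupted cue. The pattern and network size are invented for the example.

```python
def train(patterns, n):
    """Hebbian weights: W[i][j] accumulates co-activation of units i and j."""
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j] / len(patterns)
    return W

def recall(W, cue, steps=5):
    """Synchronous threshold updates drive the state toward a stored memory."""
    s = list(cue)
    for _ in range(steps):
        s = [1 if sum(W[i][j] * s[j] for j in range(len(s))) >= 0 else -1
             for i in range(len(s))]
    return s

pattern = [1, -1, 1, -1, 1, -1]
W = train([pattern], 6)
print(recall(W, [1, -1, 1, -1, -1, -1]))  # noisy cue converges to the memory
```

The Boltzmann machine generalizes this by making updates stochastic, which is what lets the paper's model explore associative chains rather than settle deterministically.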
Socialization and Adolescent Self-Esteem: Symbolic Interaction and Social Learning Explanations.
ERIC Educational Resources Information Center
Openshaw, D. Kim; And Others
1983-01-01
Investigated the effects of social learning and symbolic interaction on adolescent self-esteem. Adolescents (N=368) and their parents completed measures of self-esteem, parental behavior and parental power. Results suggested adolescent self-esteem is more a function of social interaction and the reflected appraisals of others than a modeling of…
ERIC Educational Resources Information Center
Claybrook, Billy G.
A new heuristic factorization scheme uses learning to improve the efficiency of determining the symbolic factorization of multivariable polynomials with integer coefficients and an arbitrary number of variables and terms. The factorization scheme makes extensive use of artificial intelligence techniques (e.g., model-building, learning, and…
ERIC Educational Resources Information Center
Thadison, Felicia Culver
2011-01-01
Explanations of chemical phenomena rely on understanding the behavior of submicroscopic particles. Because this level is "invisible," it is described using symbols such as models, diagrams and equations. For this reason, students often view chemistry as a "difficult" subject. The laboratory offers a unique opportunity for the students to…
ERIC Educational Resources Information Center
Khattab, Ali-Maher; Michael, William B.
1986-01-01
Based on reanalyses of correlational data obtained from the University of Southern California Aptitudes Research Project, this investigation examined the extent to which two higher order factors of semantic content and symbolic content from Guilford's structure-of-intellect model reflected distinct constructs. (Author/LMO)
Human Symbol Manipulation within an Integrated Cognitive Architecture
ERIC Educational Resources Information Center
Anderson, John R.
2005-01-01
This article describes the Adaptive Control of Thought-Rational (ACT-R) cognitive architecture (Anderson et al., 2004; Anderson & Lebiere, 1998) and its detailed application to the learning of algebraic symbol manipulation. The theory is applied to modeling the data from a study by Qin, Anderson, Silk, Stenger, & Carter (2004) in which children…
Using hidden Markov models to align multiple sequences.
Mount, David W
2009-07-01
A hidden Markov model (HMM) is a probabilistic model of a multiple sequence alignment (msa) of proteins. In the model, each column of symbols in the alignment is represented by a frequency distribution of the symbols (called a "state"), and insertions and deletions are represented by other states. One moves through the model along a particular path from state to state in a Markov chain (i.e., random choice of next move), trying to match a given sequence. The next matching symbol is chosen from each state, recording its probability (frequency) and also the probability of going to that state from a previous one (the transition probability). State and transition probabilities are multiplied to obtain a probability of the given sequence. The hidden nature of the HMM is due to the lack of information about the value of a specific state, which is instead represented by a probability distribution over all possible values. This article discusses the advantages and disadvantages of HMMs in msa and presents algorithms for calculating an HMM and the conditions for producing the best HMM.
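The state/transition bookkeeping described above can be sketched with a toy forward-algorithm scorer. This is a hedged illustration only: the two states, the emission tables, and all probabilities below are invented for demonstration, not taken from any real profile HMM.

```python
# Toy forward algorithm: score a sequence against a 2-state HMM by
# multiplying emission (state) probabilities and transition probabilities,
# summing over all state paths, as the abstract describes.

def forward(seq, states, start_p, trans_p, emit_p):
    """Return P(seq | model) by summing over all state paths."""
    # alpha[s] = probability of emitting seq[:t+1] and ending in state s
    alpha = {s: start_p[s] * emit_p[s][seq[0]] for s in states}
    for sym in seq[1:]:
        alpha = {
            s: sum(alpha[prev] * trans_p[prev][s] for prev in states)
               * emit_p[s][sym]
            for s in states
        }
    return sum(alpha.values())

# Invented model: a "match" state with a biased frequency distribution
# and an "insert" state with a flat one.
states = ("match", "insert")
start_p = {"match": 0.8, "insert": 0.2}
trans_p = {"match": {"match": 0.9, "insert": 0.1},
           "insert": {"match": 0.4, "insert": 0.6}}
emit_p = {"match": {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
          "insert": {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}}

p = forward("ACA", states, start_p, trans_p, emit_p)
```

A full msa HMM adds delete states and per-column emission distributions, but the probability bookkeeping is the same.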
The influence of math anxiety on symbolic and non-symbolic magnitude processing.
Dietrich, Julia F; Huber, Stefan; Moeller, Korbinian; Klein, Elise
2015-01-01
Deficits in basic numerical abilities have been investigated repeatedly as potential risk factors of math anxiety. Previous research suggested that also a deficient approximate number system (ANS), which is discussed as being the foundation for later math abilities, underlies math anxiety. However, these studies examined this hypothesis by investigating ANS acuity using a symbolic number comparison task. Recent evidence questions the view that ANS acuity can be assessed using a symbolic number comparison task. To investigate whether there is an association between math anxiety and ANS acuity, we employed both a symbolic number comparison task and a non-symbolic dot comparison task, which is currently the standard task to assess ANS acuity. We replicated previous findings regarding the association between math anxiety and the symbolic distance effect for response times. High math anxious individuals showed a larger distance effect than less math anxious individuals. However, our results revealed no association between math anxiety and ANS acuity assessed using a non-symbolic dot comparison task. Thus, our results did not provide evidence for the hypothesis that a deficient ANS underlies math anxiety. Therefore, we propose that a deficient ANS does not constitute a risk factor for the development of math anxiety. Moreover, our results suggest that previous interpretations regarding the interaction of math anxiety and the symbolic distance effect have to be updated. We suggest that impaired number comparison processes in high math anxious individuals might account for the results rather than deficient ANS representations. Finally, impaired number comparison processes might constitute a risk factor for the development of math anxiety. Implications for current models regarding the origins of math anxiety are discussed.
Supernova 2007bi as a pair-instability explosion.
Gal-Yam, A; Mazzali, P; Ofek, E O; Nugent, P E; Kulkarni, S R; Kasliwal, M M; Quimby, R M; Filippenko, A V; Cenko, S B; Chornock, R; Waldman, R; Kasen, D; Sullivan, M; Beshore, E C; Drake, A J; Thomas, R C; Bloom, J S; Poznanski, D; Miller, A A; Foley, R J; Silverman, J M; Arcavi, I; Ellis, R S; Deng, J
2009-12-03
Stars with initial masses such that 10 M☉ …
Code of Federal Regulations, 2010 CFR
2010-01-01
... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy Disclosure and Notices C Appendix C to Part 229 Banks and... OF FUNDS AND COLLECTION OF CHECKS (REGULATION CC) Pt. 229, App. C Appendix C to Part 229—Model...
Vanbinst, Kiran; Ceulemans, Eva; Peters, Lien; Ghesquière, Pol; De Smedt, Bert
2018-02-01
Although symbolic numerical magnitude processing skills are key for learning arithmetic, their developmental trajectories remain unknown. Therefore, we delineated during the first 3 years of primary education (5-8 years of age) groups with distinguishable developmental trajectories of symbolic numerical magnitude processing skills using a model-based clustering approach. Three clusters were identified and were labeled as inaccurate, accurate but slow, and accurate and fast. The clusters did not differ in age, sex, socioeconomic status, or IQ. We also tested whether these clusters differed in domain-specific (nonsymbolic magnitude processing and digit identification) and domain-general (visuospatial short-term memory, verbal working memory, and processing speed) cognitive competencies that might contribute to children's ability to (efficiently) process the numerical meaning of Arabic numerical symbols. We observed minor differences between clusters in these cognitive competencies except for verbal working memory, for which no differences were observed. Follow-up analyses further revealed that the above-mentioned cognitive competencies did not merely account for the cluster differences in children's development of symbolic numerical magnitude processing skills, suggesting that other factors account for these individual differences. On the other hand, the three trajectories of symbolic numerical magnitude processing revealed remarkable and stable differences in children's arithmetic fact retrieval, which stresses the importance of symbolic numerical magnitude processing for learning arithmetic. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Macala, G. A.
1983-01-01
A computer program is described that can automatically generate symbolic equations of motion for systems of hinge-connected rigid bodies with tree topologies. The dynamical formulation underlying the program is outlined, and examples are given to show how a symbolic language is used to code the formulation. The program is applied to generate the equations of motion for a four-body model of the Galileo spacecraft. The resulting equations are shown to be a factor of three faster in execution time than conventional numerical subroutines.
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Warmbrodt, W.
1985-01-01
The combined effects of blade torsion and dynamic inflow on the aeroelastic stability of an elastic rotor blade in forward flight are studied. The governing sets of equations of motion (fully nonlinear, linearized, and multiblade equations) used in this study are derived symbolically using a program written in FORTRAN. Stability results are presented for different structural models with and without dynamic inflow. A combination of symbolic and numerical programs at the proper stage in the derivation process makes the obtainment of final stability results an efficient and straightforward procedure.
Dill: an algorithm and a symbolic software package for doing classical supersymmetry calculations
NASA Astrophysics Data System (ADS)
Luc̆ić, Vladan
1995-11-01
An algorithm is presented that formalizes the different steps in a classical supersymmetric (SUSY) calculation. Based on this algorithm, Dill, a symbolic software package that can perform the calculations, was developed in the Mathematica programming language. While the algorithm is quite general, the package is created for the 4-D, N = 1 model. Nevertheless, with little modification, the package could be used for other SUSY models. The package has been tested and some of the results are presented.
The influence of social anxiety on the body checking behaviors of female college students.
White, Emily K; Warren, Cortney S
2014-09-01
Social anxiety and eating pathology frequently co-occur. However, there is limited research examining the relationship between anxiety and body checking, aside from one study in which social physique anxiety partially mediated the relationship between body checking cognitions and body checking behavior (Haase, Mountford, & Waller, 2007). In an independent sample of 567 college women, we tested the fit of Haase and colleagues' foundational model but did not find evidence of mediation. Thus we tested the fit of an expanded path model that included eating pathology and clinical impairment. In the best-fitting path model (CFI=.991; RMSEA=.083) eating pathology and social physique anxiety positively predicted body checking, and body checking positively predicted clinical impairment. Therefore, women who endorse social physique anxiety may be more likely to engage in body checking behaviors and experience impaired psychosocial functioning. Published by Elsevier Ltd.
ERIC Educational Resources Information Center
Sartore, Melanie L.; Cunningham, George B.
2007-01-01
Research suggests that females are vastly under-represented in the upper echelons of sport organizations. As such, the purpose of the current article was to apply a symbolic interactionist perspective to the lacking presence of women in leadership positions of sport organizations. The model proposes that gender-role meanings and stereotypes…
Code of Federal Regulations, 2012 CFR
2012-01-01
... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...
Code of Federal Regulations, 2011 CFR
2011-01-01
... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...
Code of Federal Regulations, 2013 CFR
2013-01-01
... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...
Determination of MLC model parameters for Monaco using commercial diode arrays.
Kinsella, Paul; Shields, Laura; McCavana, Patrick; McClean, Brendan; Langan, Brian
2016-07-08
Multileaf collimators (MLCs) need to be characterized accurately in treatment planning systems to facilitate accurate intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT). The aim of this study was to examine the use of MapCHECK 2 and ArcCHECK diode arrays for optimizing MLC parameters in Monaco X-ray voxel Monte Carlo (XVMC) dose calculation algorithm. A series of radiation test beams designed to evaluate MLC model parameters were delivered to MapCHECK 2, ArcCHECK, and EBT3 Gafchromic film for comparison. Initial comparison of the calculated and ArcCHECK-measured dose distributions revealed it was unclear how to change the MLC parameters to gain agreement. This ambiguity arose due to an insufficient sampling of the test field dose distributions and unexpected discrepancies in the open parts of some test fields. Consequently, the XVMC MLC parameters were optimized based on MapCHECK 2 measurements. Gafchromic EBT3 film was used to verify the accuracy of MapCHECK 2 measured dose distributions. It was found that adjustment of the MLC parameters from their default values resulted in improved global gamma analysis pass rates for MapCHECK 2 measurements versus calculated dose. The lowest pass rate of any MLC-modulated test beam improved from 68.5% to 93.5% with 3% and 2 mm gamma criteria. Given the close agreement of the optimized model to both MapCHECK 2 and film, the optimized model was used as a benchmark to highlight the relatively large discrepancies in some of the test field dose distributions found with ArcCHECK. Comparison between the optimized model-calculated dose and ArcCHECK-measured dose resulted in global gamma pass rates which ranged from 70.0%-97.9% for gamma criteria of 3% and 2 mm. The simple square fields yielded high pass rates. The lower gamma pass rates were attributed to the ArcCHECK overestimating the dose in-field for the rectangular test fields whose long axis was parallel to the long axis of the ArcCHECK. 
Considering ArcCHECK measurement issues and the lower gamma pass rates for the MLC-modulated test beams, it was concluded that MapCHECK 2 was a more suitable detector than ArcCHECK for the optimization process. © 2016 The Authors
Propel: Tools and Methods for Practical Source Code Model Checking
NASA Technical Reports Server (NTRS)
Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem
2003-01-01
The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset is called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.
Program Model Checking as a New Trend
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Visser, Willem; Clancy, Daniel (Technical Monitor)
2002-01-01
This paper introduces a special section of STTT (International Journal on Software Tools for Technology Transfer) containing a selection of papers that were presented at the 7th International SPIN workshop, Stanford, August 30 - September 1, 2000. The workshop was named SPIN Model Checking and Software Verification, with an emphasis on model checking of programs. The paper outlines the motivation for stressing software verification, rather than only design and model verification, by presenting the work done in the Automated Software Engineering group at NASA Ames Research Center within the last 5 years. This includes work in software model checking, testing like technologies and static analysis.
UTP and Temporal Logic Model Checking
NASA Astrophysics Data System (ADS)
Anderson, Hugh; Ciobanu, Gabriel; Freitas, Leo
In this paper we give an additional perspective to the formal verification of programs through temporal logic model checking, which uses Hoare and He's Unifying Theories of Programming (UTP). Our perspective emphasizes the use of UTP designs, an alphabetised relational calculus expressed as a pre/post condition pair of relations, to verify state or temporal assertions about programs. The temporal model checking relation is derived from a satisfaction relation between the model and its properties. The contribution of this paper is that it shows a UTP perspective to temporal logic model checking. The approach includes the notion of efficiency found in traditional model checkers, which reduce the state-explosion problem through the use of efficient data structures.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-21
... pressurise the hydraulic reservoirs, due to leakage of the Crissair reservoir air pressurisation check valves. * * * The leakage of the check valves was caused by an incorrect spring material. The affected Crissair check valves * * * were then replaced with improved check valves P/N [part number] 2S2794-1 * * *. More...
a Model Study of Small-Scale World Map Generalization
NASA Astrophysics Data System (ADS)
Cheng, Y.; Yin, Y.; Li, C. M.; Wu, W.; Guo, P. P.; Ma, X. L.; Hu, F. M.
2018-04-01
With globalization and rapid development, every field is taking an increasing interest in physical geography and human economics, and there is a surging demand worldwide for small-scale world maps in large formats. Further study of automated mapping technology, especially the production of small-scale maps at a global extent, is a key problem the cartographic field needs to solve. In light of this, this paper adopts an improved model (with map and data separated) for map generalization, which separates geographic data from mapping data and mainly comprises a cross-platform symbol library and an automatic map-making knowledge engine. In the cross-platform symbol library, each cartographic symbol and its physical counterpart in the geographic information are configured at all scale levels. The automatic map-making knowledge engine consists of 97 types, 1086 subtypes, 21,845 basic algorithms, and over 2500 related functional modules. To evaluate the accuracy and visual effect of our model on topographic and thematic maps, we take world-map generalization at small scale as an example. After the generalization process, combining and simplifying the scattered islands makes the map more explicit at the 1:2.1 billion scale, and the map features become more complete and accurate. The model not only enhances map generalization significantly at various scales but also achieves integration among map products at various scales, suggesting that it provides a reference for cartographic generalization across scales.
Schober, Jennifer; Schleicher, Dominik; Federrath, Christoph; Klessen, Ralf; Banerjee, Robi
2012-02-01
The small-scale dynamo is a process by which turbulent kinetic energy is converted into magnetic energy, and thus it is expected to depend crucially on the nature of the turbulence. In this paper, we present a model for the small-scale dynamo that takes into account the slope of the turbulent velocity spectrum, v(ℓ) ∝ ℓ^ϑ, where ℓ and v(ℓ) are the size of a turbulent fluctuation and the typical velocity on that scale. The time evolution of the fluctuation component of the magnetic field, i.e., the small-scale field, is described by the Kazantsev equation. We solve this linear differential equation for its eigenvalues with the quantum-mechanical WKB approximation. The validity of this method is estimated as a function of the magnetic Prandtl number Pm. We calculate the minimal magnetic Reynolds number for dynamo action, Rm_crit, using our model of the turbulent velocity correlation function. For Kolmogorov turbulence (ϑ = 1/3), we find that the critical magnetic Reynolds number is Rm_crit(K) ≈ 110, and for Burgers turbulence (ϑ = 1/2), Rm_crit(B) ≈ 2700. Furthermore, we derive that the growth rate of the small-scale magnetic field for a general type of turbulence is Γ ∝ Re^((1−ϑ)/(1+ϑ)) in the limit of infinite magnetic Prandtl number. For decreasing magnetic Prandtl number (down to Pm ≳ 10), the growth rate of the small-scale dynamo decreases. The details of this drop depend on the WKB approximation, which becomes invalid for a magnetic Prandtl number of about unity.
NASA Astrophysics Data System (ADS)
He, Jing; Wen, Xuejie; Chen, Ming; Chen, Lin; Su, Jinshu
2015-01-01
To improve the transmission performance of multiband orthogonal frequency division multiplexing (MB-OFDM) ultra-wideband (UWB) over optical fiber, a pre-coding scheme based on low-density parity-check (LDPC) codes is adopted and experimentally demonstrated in an intensity-modulation, direct-detection MB-OFDM UWB-over-fiber system. Meanwhile, a symbol synchronization and pilot-aided channel estimation scheme is implemented at the receiver. The experimental results show that the LDPC pre-coding scheme works effectively in the MB-OFDM UWB-over-fiber system. After 70 km of standard single-mode fiber (SSMF) transmission, at a bit error rate of 1 × 10-3, the receiver sensitivity is improved by about 4 dB when the LDPC code rate is 75%.
Discussion on LDPC Codes and Uplink Coding
NASA Technical Reports Server (NTRS)
Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio
2007-01-01
This slide presentation reviews the progress that the workgroup on Low-Density Parity-Check (LDPC) for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.
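The parity-check principle underlying LDPC decoding can be shown in miniature: a received word is a valid codeword exactly when every parity constraint is satisfied, i.e., the syndrome H·c mod 2 is all zeros. The 3×6 matrix below is a toy example for illustration, not one of the codes discussed in the presentation:

```python
# Syndrome computation for a toy binary parity-check matrix H.
# Each row of H defines one parity constraint over the codeword bits;
# a nonzero syndrome flags which constraints a received word violates.
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def syndrome(codeword):
    """Return H @ c mod 2; an all-zero vector means every check passes."""
    return H @ np.asarray(codeword) % 2

valid = syndrome([1, 1, 0, 0, 1, 1])      # satisfies all three checks
corrupted = syndrome([0, 1, 0, 0, 1, 1])  # first bit flipped
```

Real LDPC codes use much larger, sparse H matrices and iterative message-passing decoders, but this syndrome test is the condition those decoders drive toward.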
An investigation of phonology and orthography in spoken-word recognition.
Slowiaczek, Louisa M; Soltano, Emily G; Wieting, Shani J; Bishop, Karyn L
2003-02-01
The possible influence of initial phonological and/or orthographic information on spoken-word processing was examined in six experiments modelled after and extending the work of Jakimik, Cole, and Rudnicky (1985). Following Jakimik et al., Experiment 1 used polysyllabic primes with monosyllabic targets (e.g., BUCKLE-BUCK; MYSTERY-MISS). Experiments 2, 3, and 4 used polysyllabic primes and polysyllabic targets whose initial syllables shared phonological information (e.g., NUISANCE-NOODLE), orthographic information (e.g., RATIO-RATIFY), both (e.g., FUNNEL-FUNNY), or were unrelated (e.g., SERMON-NOODLE). Participants engaged in a lexical decision (Experiments 1, 3, and 4) or a shadowing (Experiment 2) task with a single-trial (Experiments 2 and 3) or subsequent-trial (Experiments 1 and 4) priming procedure. Experiment 5 tested primes and targets that varied in the number of shared graphemes while holding shared phonemes constant at one. Experiment 6 used the procedures of Experiment 2 but a low proportion of related trials. Results revealed that response times were facilitated for prime-target pairs that shared initial phonological and orthographic information. These results were confirmed under conditions in which strategic processing was greatly reduced, suggesting that phonological and orthographic information is automatically activated during spoken-word processing.
Smolensky, Paul; Goldrick, Matthew; Mathis, Donald
2014-08-01
Mental representations have continuous as well as discrete, combinatorial properties. For example, while predominantly discrete, phonological representations also vary continuously; this is reflected by gradient effects in instrumental studies of speech production. Can an integrated theoretical framework address both aspects of structure? The framework we introduce here, Gradient Symbol Processing, characterizes the emergence of grammatical macrostructure from the Parallel Distributed Processing microstructure (McClelland, Rumelhart, & The PDP Research Group, 1986) of language processing. The mental representations that emerge, Distributed Symbol Systems, have both combinatorial and gradient structure. They are processed through Subsymbolic Optimization-Quantization, in which an optimization process favoring representations that satisfy well-formedness constraints operates in parallel with a distributed quantization process favoring discrete symbolic structures. We apply a particular instantiation of this framework, λ-Diffusion Theory, to phonological production. Simulations of the resulting model suggest that Gradient Symbol Processing offers a way to unify accounts of grammatical competence with both discrete and continuous patterns in language performance. Copyright © 2013 Cognitive Science Society, Inc.
Images of Leadership and their Effect Upon School Principals' Performance
NASA Astrophysics Data System (ADS)
Gaziel, Haim
2003-09-01
The purpose of the present study is to identify how school principals perceive their world and how their perceptions influence their effectiveness as managers and leaders. The principals' views of their world were categorised into four different metaphorical ways of describing the workings of organisations: (1) the structural model (organisations as machines); (2) the human-resource model (organisations as organisms); (3) the political model (organisations as political systems); (4) the symbolic model (organisations as cultural patterns and clusters of myths and symbols). The results reveal that the best predictors of school principals' effectiveness as managers, according to their own assessments and teachers' reports, are the structural and human resource models, while the best predictors of effective leadership are the political and human-resource models.
A Clash of Symbols: An Analysis of Competing Images and Arguments in the AIDS Controversy.
ERIC Educational Resources Information Center
Gilder, Eric
Efforts to contain the spread of Acquired Immune Deficiency Syndrome (AIDS) have been slowed by numerous arguing factions, political, religious, and medical, all of which perceive the AIDS epidemic through a different set of symbols. The images can be more easily understood using Kenneth Boulding's Threat, Integrative, and Exchange (or TIE) model. The…
Binary Disassembly Block Coverage by Symbolic Execution vs. Recursive Descent
2012-03-01
Explores the effectiveness of symbolic execution on packed or obfuscated samples of the same binaries to generate a model-based evaluation of success. Topics include packing techniques and the inner workings of UPX (Universal Packer for eXecutables), a common packing tool, on a Windows binary.
Marković, Slobodan
2012-01-01
In this paper aesthetic experience is defined as an experience qualitatively different from everyday experience and similar to other exceptional states of mind. Three crucial characteristics of aesthetic experience are discussed: fascination with an aesthetic object (high arousal and attention), appraisal of the symbolic reality of an object (high cognitive engagement), and a strong feeling of unity with the object of aesthetic fascination and aesthetic appraisal. In a proposed model, two parallel levels of aesthetic information processing are proposed. On the first level two sub-levels of narrative are processed, story (theme) and symbolism (deeper meanings). The second level includes two sub-levels, perceptual associations (implicit meanings of object's physical features) and detection of compositional regularities. Two sub-levels are defined as crucial for aesthetic experience, appraisal of symbolism and compositional regularities. These sub-levels require some specific cognitive and personality dispositions, such as expertise, creative thinking, and openness to experience. Finally, feedback of emotional processing is included in our model: appraisals of everyday emotions are specified as a matter of narrative content (e.g., empathy with characters), whereas the aesthetic emotion is defined as an affective evaluation in the process of symbolism appraisal or the detection of compositional regularities. PMID:23145263
Assessment of check-dam groundwater recharge with water-balance calculations
NASA Astrophysics Data System (ADS)
Djuma, Hakan; Bruggeman, Adriana; Camera, Corrado; Eliades, Marinos
2017-04-01
Studies on the enhancement of groundwater recharge by check-dams in arid and semi-arid environments mainly focus on deriving water infiltration rates from the check-dam ponding areas. This is usually achieved by applying simple water balance models, more advanced models (e.g., two-dimensional groundwater models) and field tests (e.g., infiltrometer or soil pit tests). Recharge behind the check-dam can be affected by the build-up of sediment resulting from erosion in the upstream watershed area. This natural process can increase the uncertainty in the estimates of the recharged water volume, especially for water balance calculations. Few water balance field studies of individual check-dams have been presented in the literature, and none of them reported the associated uncertainties of their estimates. The objectives of this study are i) to assess the effect of a check-dam on groundwater recharge from an ephemeral river; and ii) to assess annual sedimentation at the check-dam during a 4-year period. The study was conducted on a check-dam on the semi-arid island of Cyprus. Field campaigns were carried out to measure water flow, water depth and check-dam topography in order to establish check-dam water height, volume, evaporation, outflow and recharge relations. Topographic surveys were repeated at the end of consecutive hydrological years to estimate the sediment build-up in the reservoir area of the check-dam. Also, sediment samples were collected from the check-dam reservoir area for bulk-density analyses. To quantify the groundwater recharge, a water balance model was applied at two locations: at the check-dam and corresponding reservoir area, and at a 4-km stretch of the river bed without a check-dam. Results showed that a check-dam with a storage capacity of 25,000 m3 was able to recharge the aquifer, over four years, with a total of 12 million m3 out of the 42 million m3 of measured (or modelled) streamflow. 
Recharge from the analyzed 4-km long river section without a check-dam was estimated to be 1 million m3. Upper and lower limits of prediction intervals were computed to assess the uncertainties of the results. The model was rerun with these values and resulted in recharge values of 0.4 m3 (lower limit) and 38 million m3 (upper limit). The sediment survey in the check-dam reservoir area showed that the reservoir area was filled with 2,000 to 3,000 tons of sediment after one rainfall season. This amount of sediment corresponds to a sediment yield of 0.2 to 2 t ha-1 y-1 at the watershed level and reduces the check-dam storage capacity by approximately 10%. Results indicate that check-dams are valuable structures for increasing groundwater resources, but special attention should be given to soil erosion occurring in the upstream area and the resulting sediment build-up in the check-dam reservoir area. This study has received funding from the EU FP7 RECARE Project (GA 603498).
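The water-balance bookkeeping described in this record can be sketched in a few lines; the function, variable names and figures below are illustrative assumptions, not the study's actual data or model:

```python
# Illustrative daily water balance for a check-dam reservoir.
# Storage change dS = inflow - evaporation - outflow - recharge,
# so recharge is inferred as the residual of the balance.

def daily_recharge(inflow, evaporation, outflow, storage_start, storage_end):
    """All quantities in m3 per day; recharge inferred as the residual."""
    d_storage = storage_end - storage_start
    return inflow - evaporation - outflow - d_storage

# Hypothetical day: 5,000 m3 flows in, 200 m3 evaporates, 1,000 m3
# spills over the dam, and stored water rises by 1,800 m3.
recharge = daily_recharge(5000, 200, 1000, 20000, 21800)  # → 2000 m3
```

Summing such daily residuals over a hydrological year yields an annual recharge estimate of the kind reported above.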
A Self-Stabilizing Distributed Clock Synchronization Protocol for Arbitrary Digraphs
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2011-01-01
This report presents a self-stabilizing distributed clock synchronization protocol in the absence of faults in the system. It focuses on distributed clock synchronization of an arbitrary, non-partitioned digraph, ranging from fully connected to 1-connected networks of nodes, while allowing for differences in the network elements. The protocol does not rely on assumptions about the initial state of the system, other than the presence of at least one node, and no central clock or centrally generated signal, pulse, or message is used. Nodes are anonymous, i.e., they do not have unique identities. There is no theoretical limit on the maximum number of participating nodes. The only constraint on the behavior of the nodes is that interactions with other nodes are restricted to defined links and interfaces. We present an outline of a deductive proof of the correctness of the protocol. A model of the protocol was mechanically verified using the Symbolic Model Verifier (SMV) for a variety of topologies, and results of this mechanical proof of correctness are provided. The model checking results verify the correctness of the protocol as it applies to networks with unidirectional and bidirectional links. In addition, the results confirm the claims of determinism and linear convergence. As a result, we conjecture that the protocol solves the general case of this problem. We also present several variations of the protocol and argue that this synchronization protocol is indeed an emergent system.
Efficient model checking of network authentication protocol based on SPIN
NASA Astrophysics Data System (ADS)
Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan
2013-03-01
Model checking is a very useful technique for verifying network authentication protocols. In order to improve the efficiency of modeling and verifying such protocols with model checking technology, this paper first proposes a universal formal description method for these protocols. Combined with the model checker SPIN, the method can conveniently verify protocol properties. Through several model-simplification strategies, we can model several protocols efficiently and reduce the state space of the models. Compared with the previous literature, this paper achieves a higher degree of automation and better verification efficiency. Finally, based on the described method, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective and is also applicable to other authentication protocols.
Entropy and long-range memory in random symbolic additive Markov chains
NASA Astrophysics Data System (ADS)
Melnik, S. S.; Usatenko, O. V.
2016-06-01
The goal of this paper is to develop an estimate for the entropy of random symbolic sequences with elements belonging to a finite alphabet. As a plausible model, we use the high-order additive stationary ergodic Markov chain with long-range memory. Supposing that the correlations between random elements of the chain are weak, we express the conditional entropy of the sequence by means of the symbolic pair correlation function. We also examine an algorithm for estimating the conditional entropy of finite symbolic sequences. We show that the entropy contains two contributions, i.e., the correlation and the fluctuation. The obtained analytical results are used for numerical evaluation of the entropy of written English texts and DNA nucleotide sequences. The developed theory opens the way for constructing a more consistent and sophisticated approach to describe the systems with strong short-range and weak long-range memory.
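The conditional entropy this record estimates can be illustrated with a naive plug-in estimator from bigram frequencies; this is only a first-order estimate under assumed notation, not the correlation-function expansion developed in the paper:

```python
from collections import Counter
from math import log2

def conditional_entropy(seq):
    """Plug-in estimate of H(X_t | X_{t-1}) for a symbolic sequence,
    via the identity H(X_t | X_{t-1}) = H(pairs) - H(singles)."""
    n = len(seq) - 1  # number of bigrams
    pairs = Counter(zip(seq, seq[1:]))
    singles = Counter(seq[:-1])
    h_pairs = -sum(c / n * log2(c / n) for c in pairs.values())
    h_singles = -sum(c / n * log2(c / n) for c in singles.values())
    return h_pairs - h_singles

# A strictly periodic sequence is fully predictable from one symbol:
print(conditional_entropy("abababab"))  # → 0.0
```

Higher-order (longer-memory) variants condition on longer prefixes, which is where the paper's weak-correlation expansion becomes necessary in practice.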
Prior familiarity with components enhances unconscious learning of relations.
Scott, Ryan B; Dienes, Zoltan
2010-03-01
The influence of prior familiarity with components on the implicit learning of relations was examined using artificial grammar learning. Prior to training on grammar strings, participants were familiarized with either the novel symbols used to construct the strings or with irrelevant geometric shapes. Participants familiarized with the relevant symbols showed greater accuracy when judging the correctness of new grammar strings. Familiarity with elemental components did not increase conscious awareness of the basis for discriminations (structural knowledge) but increased accuracy even in its absence. The subjective familiarity of test strings predicted grammaticality judgments. However, prior exposure to relevant symbols did not increase overall test string familiarity or reliance on familiarity when making grammaticality judgments. Familiarity with the symbols increased the learning of relations between them (bigrams and trigrams) thus resulting in greater familiarity for grammatical versus ungrammatical strings. The results have important implications for models of implicit learning.
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
Caricati, Luca; Mancini, Tiziana; Marletta, Giuseppe
2017-01-01
This research investigated the relationship among perception of ingroup threats (realistic and symbolic), conservative ideologies (social dominance orientation [SDO] and right-wing authoritarianism [RWA]), and prejudice against immigrants. Data were collected with a cross-sectional design in two samples: non-student Italian adults (n = 223) and healthcare professionals (n = 679). Results were similar in both samples and indicated that symbolic and realistic threats, as well as SDO and RWA, positively and significantly predicted anti-immigrant prejudice. Moreover, the model considering SDO and RWA as mediators of threats' effects on prejudice showed a better fit than the model in which ingroup threats mediated the effects of SDO and RWA on prejudice against immigrants. Accordingly, SDO and RWA partially mediated the effect of both symbolic and realistic threats, which maintained a significant effect on prejudice against immigrants, however.
A voice-actuated wind tunnel model leak checking system
NASA Technical Reports Server (NTRS)
Larson, William E.
1989-01-01
A computer program has been developed that improves the efficiency of wind tunnel model leak checking. The program uses a voice recognition unit to relay a technician's commands to the computer. The computer, after receiving a command, can respond to the technician via a voice response unit. Information about the model pressure orifice being checked is displayed on a gas-plasma terminal. On command, the program records up to 30 seconds of pressure data. After the recording is complete, the raw data and a straight line fit of the data are plotted on the terminal. This allows the technician to make a decision on the integrity of the orifice being checked. All results of the leak check program are stored in a database file that can be listed on the line printer for record keeping purposes or displayed on the terminal to help the technician find unchecked orifices. This program allows one technician to check a model for leaks instead of the two or three previously required.
Chambert, Thierry; Rotella, Jay J; Higgs, Megan D
2014-01-01
The investigation of individual heterogeneity in vital rates has recently received growing attention among population ecologists. Individual heterogeneity in wild animal populations has been accounted for and quantified by including individually varying effects in models for mark–recapture data, but the real need for underlying individual effects to account for observed levels of individual variation has recently been questioned by the work of Tuljapurkar et al. (Ecology Letters, 12, 93, 2009) on dynamic heterogeneity. Model-selection approaches based on information criteria or Bayes factors have been used to address this question. Here, we suggest that, in addition to model-selection, model-checking methods can provide additional important insights to tackle this issue, as they allow one to evaluate a model's misfit in terms of ecologically meaningful measures. Specifically, we propose the use of posterior predictive checks to explicitly assess discrepancies between a model and the data, and we explain how to incorporate model checking into the inferential process used to assess the practical implications of ignoring individual heterogeneity. Posterior predictive checking is a straightforward and flexible approach for performing model checks in a Bayesian framework that is based on comparisons of observed data to model-generated replications of the data, where parameter uncertainty is incorporated through use of the posterior distribution. If discrepancy measures are chosen carefully and are relevant to the scientific context, posterior predictive checks can provide important information allowing for more efficient model refinement. We illustrate this approach using analyses of vital rates with long-term mark–recapture data for Weddell seals and emphasize its utility for identifying shortfalls or successes of a model at representing a biological process or pattern of interest. 
We show how posterior predictive checks can be used to strengthen inferences in ecological studies. We demonstrate the application of this method on analyses dealing with the question of individual reproductive heterogeneity in a population of Antarctic pinnipeds. PMID:24834335
Pârvu, Ovidiu; Gilbert, David
2016-01-01
Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However for gaining a systems level understanding of how biological organisms function it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). 
Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle and the acute inflammation of the gut and lung. Our methodology and software will enable computational biologists to efficiently develop reliable multilevel computational models of biological systems. PMID:27187178
MatchingTools: A Python library for symbolic effective field theory calculations
NASA Astrophysics Data System (ADS)
Criado, Juan C.
2018-06-01
MatchingTools is a Python library for doing symbolic calculations in effective field theory. It provides the tools to construct general models by defining their field content and their interaction Lagrangian. Once a model is given, the heavy particles can be integrated out at the tree level to obtain an effective Lagrangian in which only the light particles appear. After integration, some of the terms of the resulting Lagrangian might not be independent. MatchingTools contains functions for transforming these terms to rewrite them in terms of any chosen set of operators.
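The tree-level integrating-out step that the library automates can be illustrated on a toy model: a heavy real scalar S of mass M coupled to a light-field operator O. This sketch is generic textbook matching, not MatchingTools' own API or notation:

```latex
% Heavy scalar with mass M coupled to a light operator O:
\mathcal{L} \supset -\tfrac{1}{2} M^2 S^2 + g\, S\, \mathcal{O}
% Equation of motion, neglecting derivatives of the heavy field:
M^2 S = g\, \mathcal{O} \quad\Longrightarrow\quad S = \frac{g\, \mathcal{O}}{M^2}
% Substituting back gives the tree-level effective interaction:
\mathcal{L}_{\mathrm{eff}} = \frac{g^2\, \mathcal{O}^2}{2 M^2}
```

Operators obtained this way may still be redundant; rewriting them in a chosen operator basis is the second step the library performs.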
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao Yitian; Tian Bo; State Key Laboratory of Software Development Environment, Beijing University of Aeronautics and Astronautics, Beijing 100083
2006-11-15
The spherical modified Kadomtsev-Petviashvili (smKP) model is hereby derived with symbolic computation for the dust-ion-acoustic waves with zenith-angle perturbation in a cosmic dusty plasma. Formation and properties of both dark and bright smKP nebulons are obtained and discussed. The relevance of those smKP nebulons to the supernova shells and Saturn's F-ring is pointed out, and possibly observable nebulonic effects for the future cosmic plasma experiments are proposed. The difference of the smKP nebulons from other types of nebulons is also analyzed.
Model Checking Temporal Logic Formulas Using Sticker Automata
Feng, Changwei; Wu, Huanmei
2017-01-01
As an important complex problem, the temporal logic model checking problem is still far from being fully resolved under the circumstance of DNA computing, especially Computation Tree Logic (CTL), Interval Temporal Logic (ITL), and Projection Temporal Logic (PTL), because there is still a lack of approaches for DNA model checking. To address this challenge, a model checking method is proposed for checking the basic formulas in the above three temporal logic types with DNA molecules. First, one-type single-stranded DNA molecules are employed to encode the Finite State Automaton (FSA) model of the given basic formula so that a sticker automaton is obtained. On the other hand, other single-stranded DNA molecules are employed to encode the given system model so that the input strings of the sticker automaton are obtained. Next, a series of biochemical reactions are conducted between the above two types of single-stranded DNA molecules. It can then be decided whether the system satisfies the formula or not. As a result, we have developed a DNA-based approach for checking all the basic formulas of CTL, ITL, and PTL. The simulated results demonstrate the effectiveness of the new method. PMID:29119114
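The underlying check, running a system's behaviour through the finite state automaton of a formula, can be sketched in ordinary software; the DNA encoding is the paper's contribution, and the automaton and formula below are hypothetical illustrations only:

```python
def accepts(transitions, start, accepting, word):
    """Run a deterministic finite automaton over a word of symbols.
    transitions maps (state, symbol) -> next state; missing entries reject."""
    state = start
    for sym in word:
        if (state, sym) not in transitions:
            return False
        state = transitions[(state, sym)]
    return state in accepting

# Hypothetical automaton for "p holds until q is observed":
# loop in state 0 on 'p', move to accepting state 1 once 'q' is read.
fsa = {(0, 'p'): 0, (0, 'q'): 1, (1, 'p'): 1, (1, 'q'): 1}
print(accepts(fsa, 0, {1}, "pppq"))  # → True
print(accepts(fsa, 0, {1}, "ppp"))   # → False
```

In the paper's setting, the transition table is encoded in sticker-automaton strands and the word is the encoded system model, so acceptance is decided by biochemical reactions rather than a loop.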
Foundations of the Bandera Abstraction Tools
NASA Technical Reports Server (NTRS)
Hatcliff, John; Dwyer, Matthew B.; Pasareanu, Corina S.; Robby
2003-01-01
Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking. The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera's data abstraction mechanism which is used to reduce the cardinality (and the program's state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java which has a rich set of linguistic features.
Full implementation of a distributed hydrological model based on check dam trapped sediment volumes
NASA Astrophysics Data System (ADS)
Bussi, Gianbattista; Francés, Félix
2014-05-01
Lack of hydrometeorological data is one of the most compelling limitations to the implementation of distributed environmental models. Mediterranean catchments, in particular, are characterised by high spatial variability of meteorological phenomena and soil characteristics, which may prevent transferring model calibrations from a fully gauged catchment to a totally or partially ungauged one. For this reason, new sources of data are required in order to extend the use of distributed models to non-monitored or low-monitored areas. An important source of information on the hydrological and sediment cycles is the sediment deposits accumulated at the bottom of reservoirs. Since the 1960s, reservoir sedimentation volumes have been used as proxy data for the estimation of inter-annual total sediment yield rates or, in more recent years, as a reference measure of sediment transport for sediment model calibration and validation. Nevertheless, the possibility of using such data to constrain the calibration of a hydrological model has not been exhaustively investigated so far. In this study, the use of nine check-dam reservoir sedimentation volumes for hydrological and sedimentological model calibration and spatio-temporal validation was examined. Check dams are common structures in Mediterranean areas and are a potential source of spatially distributed information on both the hydrological and the sediment cycle. In this case study, the TETIS hydrological and sediment model was implemented in a medium-size Mediterranean catchment (Rambla del Poyo, Spain) by taking advantage of sediment deposits accumulated behind the check dams located in the catchment headwaters. Reservoir trap efficiency was taken into account by coupling the TETIS model with a pond trap-efficiency model. The model was calibrated by adjusting some of its parameters in order to reproduce the total sediment volume accumulated behind a check dam. 
Then, the model was spatially validated by obtaining the simulated sedimentation volume at the other eight check dams and comparing it to the observed sedimentation volumes. Lastly, the simulated water discharge at the catchment outlet was compared with observed water discharge records in order to check the behaviour of the hydrological sub-model. Model results provided highly valuable information concerning the spatial distribution of soil erosion and sediment transport. Spatial validation of the sediment sub-model provided very good results at seven check dams out of nine. This study shows that check dams can also be a useful tool for constraining hydrological model calibration, as model results agree with water discharge observations; the hydrological model validation at a downstream water flow gauge obtained a Nash-Sutcliffe efficiency of 0.8. This technique is applicable to all catchments with check dams present, and only requires rainfall and temperature data and soil characteristics maps.
[Understanding King's model on the paradigm of symbolic interactionism].
Araújo, Iliana Maria de Almeida; Oliveira, Marcos Venícios; de Oliveira, Marcos Venícius; Fernandes, Ana Fátima Carvalho
2005-01-01
The aim was to reflect on King's theory from the perspective of symbolic interactionism and Meleis' model of theory analysis. To reach this objective, we read the three models mentioned above, looking for consistencies and discrepancies among their concepts and correlating them. The study allowed us to conclude that the theories agree in portraying the human being as one who reacts to and seeks to understand the meaning of the things around him or her, deriving and judging his or her own actions and those of others. Importantly, these meanings can be modified, and they lead to the elaboration of shared goals.
ERIC Educational Resources Information Center
Kuhlmeier, Valerie
2005-01-01
Many recent studies have explored young children's ability to use information from physical representations of space to guide search within the real world. In one commonly used procedure, children are asked to find a hidden toy in a room after observing a smaller toy being hidden in the analogous location in a scale model of the room.…
FPGA implementation of low complexity LDPC iterative decoder
NASA Astrophysics Data System (ADS)
Verma, Shivani; Sharma, Sanjay
2016-07-01
Low-density parity-check (LDPC) codes, proposed by Gallager, emerged as a class of codes which can yield very good performance on the additive white Gaussian noise channel as well as on the binary symmetric channel. LDPC codes have gained importance due to their capacity-achieving property and excellent performance over noisy channels. The belief propagation (BP) algorithm and its approximations, most notably min-sum, are popular iterative decoding algorithms used for LDPC and turbo codes. The trade-off between hardware complexity and decoding throughput is a critical factor in the implementation of a practical decoder. This article presents an introduction to LDPC codes and their various decoding algorithms, followed by the realisation of an LDPC decoder using a simplified message-passing algorithm and a partially parallel decoder architecture. The simplified message-passing algorithm is proposed as a trade-off between low decoding complexity and decoder performance; it greatly reduces the routing and check-node complexity of the decoder. The partially parallel decoder architecture offers high speed and reduced complexity. The improved design achieves a maximum symbol throughput of 92.95 Mbps with a maximum of 18 decoding iterations. The article presents an implementation of a 9216-bit, rate-1/2, (3, 6) LDPC decoder on a Xilinx XC3SD3400A device from the Spartan-3A DSP family.
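The min-sum approximation mentioned in this record, which replaces the check-node tanh rule of belief propagation with a sign product and a minimum magnitude, can be sketched in pure Python. This mirrors only the algorithmic idea, not the article's partially parallel hardware architecture, and any parity-check matrix used with it is a hypothetical toy code:

```python
def min_sum_decode(H, llr, iters=20):
    """Min-sum decoding of a binary LDPC code.
    H: parity-check matrix as a list of rows of 0/1.
    llr: channel log-likelihood ratios (positive = bit likely 0)."""
    m, n = len(H), len(llr)
    hard = [0 if x >= 0 else 1 for x in llr]
    # variable-to-check messages, initialised with the channel LLRs
    v2c = {(i, j): llr[j] for i in range(m) for j in range(n) if H[i][j]}
    c2v = {}
    for _ in range(iters):
        # check-node update: sign product and minimum magnitude
        for i in range(m):
            cols = [j for j in range(n) if H[i][j]]
            for j in cols:
                others = [v2c[(i, k)] for k in cols if k != j]
                sign = -1 if sum(x < 0 for x in others) % 2 else 1
                c2v[(i, j)] = sign * min(abs(x) for x in others)
        # variable-node update: channel LLR plus incoming check messages
        total = [llr[j] + sum(c2v[(i, j)] for i in range(m) if H[i][j])
                 for j in range(n)]
        hard = [0 if t >= 0 else 1 for t in total]
        if all(sum(H[i][j] * hard[j] for j in range(n)) % 2 == 0
               for i in range(m)):
            break  # all parity checks satisfied
        for (i, j) in v2c:
            v2c[(i, j)] = total[j] - c2v[(i, j)]
    return hard
```

For example, with the (7,4) Hamming parity-check matrix `[[1,1,0,1,1,0,0],[1,0,1,1,0,1,0],[0,1,1,1,0,0,1]]` and LLRs `[2, 2, 2, -0.5, 2, 2, 2]` (one weakly unreliable bit), the decoder corrects to the all-zero codeword.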
Sum of the Magnitude for Hard Decision Decoding Algorithm Based on Loop Update Detection.
Meng, Jiahui; Zhao, Danfeng; Tian, Hai; Zhang, Liang
2018-01-15
In order to improve the performance of the hard-decision decoding algorithm for non-binary low-density parity-check (LDPC) codes and to reduce decoding complexity, a sum-of-the-magnitude hard-decision decoding algorithm based on loop update detection is proposed. This also helps ensure the reliability, stability and high transmission rates of 5G mobile communication. The algorithm is based on the hard-decision decoding algorithm (HDA) and uses the soft information from the channel to calculate the reliability, while the sum of the variable nodes' (VN) magnitudes is excluded when computing the reliability of the parity checks. At the same time, the reliability information of the variable nodes is considered and a loop update detection algorithm is introduced. The bits corresponding to the erroneous codeword are flipped multiple times, searched in order of decreasing error probability, until the correct codeword is found. Simulation results show that the performance of one of the improved schemes is better than the weighted symbol flipping (WSF) algorithm, for different field orders, by about 2.2 dB and 2.35 dB at a bit error rate (BER) of 10^-5 over an additive white Gaussian noise (AWGN) channel, respectively. Furthermore, the average number of decoding iterations is significantly reduced.
NASA Astrophysics Data System (ADS)
Bautista, Nazan Uludag
2011-06-01
This study investigated the effectiveness of an Early Childhood Education science methods course that focused exclusively on providing various mastery (i.e., enactive, cognitive content, and cognitive pedagogical) and vicarious experiences (i.e., cognitive self-modeling, symbolic modeling, and simulated modeling) in increasing preservice elementary teachers' self-efficacy beliefs. Forty-four preservice elementary teachers participated in the study. Analysis of the quantitative (STEBI-b) and qualitative (informal surveys) data revealed that personal science teaching efficacy and science teaching outcome expectancy beliefs increased significantly over the semester. Enactive mastery, cognitive pedagogical mastery, symbolic modeling, and cognitive self-modeling were the major sources of self-efficacy. This list was followed by cognitive content mastery and simulated modeling. This study has implications for science teacher educators.
NASA Astrophysics Data System (ADS)
Obulesu, O.; Rama Mohan Reddy, A., Dr; Mahendra, M.
2017-08-01
Detecting regular and efficient cyclic models is a demanding task for data analysts, owing to the unstructured, dynamic, and enormous raw information produced from the web. Many existing approaches generate large numbers of candidate patterns in the presence of huge and complex databases. In this work, two novel algorithms are proposed and a comparative examination is performed with respect to scalability and performance. The first algorithm, EFPMA (Extended Regular Model Detection Algorithm), finds frequent sequential patterns in spatiotemporal datasets; the second, ETMA (Enhanced Tree-based Mining Algorithm), detects effective cyclic models using a symbolic database representation. EFPMA grows patterns from both ends (prefixes and suffixes) of detected patterns, which results in faster pattern growth because fewer levels of database projection are needed compared to existing approaches such as PrefixSpan and SPADE. ETMA uses distinct notions to store and manage transaction data horizontally, such as segments, sequences, and individual symbols, and exploits a partition-and-conquer method to find maximal patterns using symbolic notation. With this algorithm, cyclic models can be mined in full-series sequential patterns, including subsection series. ETMA reduces memory consumption and makes use of efficient symbolic operations. Furthermore, ETMA records time-series instances dynamically, in terms of character, series, and section approaches, respectively. Proving the efficiency of the reduction and retrieval techniques on synthetic and real datasets remains an open and challenging mining problem. These techniques are useful in data streams, traffic-risk analysis, medical diagnosis, DNA-sequence mining, and earthquake-prediction applications. Extensive experimental results illustrate that the algorithms outperform the ECLAT, STNR, and MAFIA approaches in efficiency and scalability.
NASA Astrophysics Data System (ADS)
Ahmad, Mohd Ali Khameini; Liao, Lingmin; Saburov, Mansoor
2018-06-01
We study the set of p-adic Gibbs measures of the q-state Potts model on the Cayley tree of order three. We prove the vastness of the set of periodic p-adic Gibbs measures for such a model by showing the chaotic behavior of the corresponding Potts-Bethe mapping over Q_p for the prime numbers p ≡ 1 (mod 3). In fact, for 0 < |θ-1|_p < |q|_p^2 < 1, where θ = exp_p(J) and J is a coupling constant, there exists a subsystem that is isometrically conjugate to the full shift on three symbols. Meanwhile, for 0 < |q|_p^2 ≤ |θ-1|_p < |q|_p < 1, there exists a subsystem that is isometrically conjugate to a subshift of finite type on r symbols, where r ≥ 4. However, these subshifts on r symbols are all topologically conjugate to the full shift on three symbols. The p-adic Gibbs measures of the same model for the prime numbers p = 2, 3 and the corresponding Potts-Bethe mapping are also discussed. On the other hand, for 0 < |θ-1|_p < |q|_p < 1, we remark that the Potts-Bethe mapping is not chaotic when p = 3 and p ≡ 2 (mod 3), and we could not conclude the vastness of the set of periodic p-adic Gibbs measures. In a forthcoming paper with the same title, we will treat the case 0 < |q|_p ≤ |θ-1|_p < 1 for all prime numbers p.
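The inequalities above are stated in terms of the p-adic absolute value |x|_p = p^(-v_p(x)). A minimal sketch of that norm (helper names are ours, not from the paper):

```python
from fractions import Fraction

def vp(n, p):
    """p-adic valuation of a nonzero integer n (how often p divides n)."""
    v = 0
    while n % p == 0:
        n //= p
        v += 1
    return v

def p_adic_abs(x, p):
    """p-adic absolute value |x|_p = p**(-v_p(x)) for rational x; |0|_p = 0."""
    x = Fraction(x)
    if x == 0:
        return 0.0
    return float(p) ** -(vp(x.numerator, p) - vp(x.denominator, p))

# For p = 7:  |49|_7 = 1/49 < |7|_7 = 1/7 < |3|_7 = 1.
# Conditions such as 0 < |x|_p < 1 thus pick out x divisible by p.
```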
Music viewed by its entropy content: A novel window for comparative analysis
Febres, Gerardo; Jaffe, Klaus
2017-01-01
Polyphonic music files were analyzed using the set of symbols that produced the Minimal Entropy Description, which we call the Fundamental Scale. This allowed us to create a novel space to represent music pieces by developing: (a) a method to adjust a textual description from its original scale of observation to an arbitrarily selected scale, (b) a method to model the structure of any textual description based on the shape of the symbol frequency profiles, and (c) the concept of higher order entropy as the entropy associated with the deviations of a frequency-ranked symbol profile from a perfect Zipfian profile. We call this diversity index the ‘2nd Order Entropy’. Applying these methods to a variety of musical pieces showed how the space of ‘symbolic specific diversity-entropy’ and that of ‘2nd order entropy’ capture characteristics that are unique to each music type, style, composer and genre. Some clustering of these properties around each musical category is shown. These methods allow us to visualize a historic trajectory of academic music across this space, from medieval to contemporary academic music. We show that the description of musical structures using entropy, symbol frequency profiles and specific symbolic diversity allows us to characterize traditional and popular expressions of music. These classification techniques promise to be useful in other disciplines for pattern recognition and machine learning. PMID:29040288
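The building blocks of this kind of analysis, a frequency-ranked symbol profile, its Shannon entropy, and a deviation-from-Zipf measure, can be sketched as follows. The `zipf_deviation` helper is our own rough stand-in, not the authors' exact '2nd Order Entropy' construction:

```python
import math
from collections import Counter

def freq_profile(seq):
    """Frequency-ranked symbol profile (descending relative frequencies)."""
    counts = sorted(Counter(seq).values(), reverse=True)
    total = sum(counts)
    return [c / total for c in counts]

def shannon_entropy(probs):
    """Shannon entropy in bits of a probability profile."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def zipf_profile(n):
    """Ideal Zipfian profile 1/rank, normalized over n ranks."""
    weights = [1 / r for r in range(1, n + 1)]
    z = sum(weights)
    return [w / z for w in weights]

def zipf_deviation(seq):
    """Mean absolute deviation of the observed profile from the Zipfian one."""
    obs = freq_profile(seq)
    ideal = zipf_profile(len(obs))
    return sum(abs(o - i) for o, i in zip(obs, ideal)) / len(obs)

profile = freq_profile("mississippi")   # [4/11, 4/11, 2/11, 1/11]
H = shannon_entropy(profile)            # ~1.823 bits
```

On real corpora the symbol set would come from the Fundamental Scale segmentation described in the abstract rather than from single characters.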
NASA Astrophysics Data System (ADS)
Khoa, Dao Tien; Egelhof, Peter; Gales, Sydney; Giai, Nguyen Van; Motobayashi, Tohru
2008-04-01
Studies at the RIKEN RI beam factory / T. Motobayashi -- Dilute nuclear states / M. Freer -- Studies of exotic systems using transfer reactions at GANIL / D. Beaumel et al. -- First results from the Magnex large-acceptance spectrometer / A. Cunsolo et al. -- The ICHOR project and spin-isospin physics with unstable beams / H. Sakai -- Structure and low-lying states of the [symbol]He exotic nucleus via direct reactions on proton / V. Lapoux et al. -- Shell gap below [symbol]Sn based on the excited states in [symbol]Cd and [symbol]In / M. Górska -- Heavy neutron-rich nuclei produced in the fragmentation of a [symbol]Pb beam / Zs. Podolyák et al. -- Breakup and incomplete fusion in reactions of weakly-bound nuclei / D.J. Hinde et al. -- Excited states of [symbol]B and [symbol]He and their cluster aspect / Y. Kanada-En'yo et al. -- Nuclear reactions with weakly-bound systems: the treatment of the continuum / C. H. Dasso, A. Vitturi -- Dynamic evolution of three-body decaying resonances / A. S. Jensen et al. -- Prerainbow oscillations in [symbol]He scattering from the Hoyle state of [symbol]C and alpha particle condensation / S. Ohkubo, Y. Hirabayashi -- Angular dispersion behavior in heavy ion elastic scattering / Q. Wang et al. -- Microscopic optical potential in relativistic approach / Z.Yu. Ma et al. -- Exotic nuclei studied in direct reactions at low momentum transfer - recent results and future perspectives at fair / P. Egelhof -- Isotopic temperatures and symmetry energy in spectator fragmentation / M. De Napoli et al. -- Multi-channel algebraic scattering theory and the structure of exotic compound nuclei / K. Amos et al. -- Results for the first feasibility study for the EXL project at the experimental storage ring at GSI / N. Kalantar-Nayestanaki et al. -- Coulomb excitation of ISOLDE neutron-rich beams along the Z = 28 chain / P. Van Duppen -- The gamma decay of the pygmy resonance far from stability and the GDR at finite temperature / G. Benzoni et al. 
-- Thermal pairing in nuclei / N. D. Dang -- Molecular-orbital and di-nuclei states in Ne and F isotopes / M. Kimura -- Low-momentum interactions for nuclei / A. Schwenk -- Nonrelativistic nuclear energy functionals including the tensor force / G. Colo et al. -- New aspects on dynamics in nuclei described by covariant density functional theory / P. Ring, D. Pena -- Theoretical studies on ground-state properties of superheavy nuclei / Z. Z. Ren et al. -- New results in the study of superfluid nuclei: many-body effects, spectroscopic factors / P. F. Bortignon et al. -- New Effective nucleon-nucleon interaction for the mean-field approximation / V. K. Au et al. -- Linear response calculations with the time-dependent Skyrme density functional / T. Nakatsukasa et al. -- Dissipative dynamics with exotic beams / M. Di Toro et al. -- Exploring the symmetry energy of asymmetric nuclear matter with heavy ion reactions / M. B. Tsang -- Invariant mass spectroscopy of halo nuclei / T. Nakamura et al. -- Core [symbol] structures in [symbol]C, [symbol]C and [symbol]C up to high excitation energies / H. G. Bohlen et al. -- Light neutron-rich nuclei studied by alpha-induced reactions / S. Shimoura -- Fusion and direct reactions around the Coulomb barrier for the system [symbol]He + [symbol]Zn / V. Scuderi et al. -- Analyzing power measurement for proton elastic scattering on [symbol]He / S. Sakaguchi et al. -- Knockout reaction spectroscopy of exotic nuclei / J. A. Tostevin -- Exotic nuclei, quantum phase transitions, and the evolution of structure / R. F. Casten -- Structure of exotic nuclei in the medium mass region / T. Otsuka -- Pairing correlations in halo nuclei / H. Sagawa, K. Hagino -- Experimental approach to high-temperature Stellar reactions with low-energy RI beams / S. Kubono et al. -- Transition to quark matter in neutron stars / G. X. Peng et al. -- Research at VATLY: main themes and recent results / P. N. Diep et al. 
-- Study of the astrophysical reaction [symbol]C([symbol], n)[symbol]O by the transfer reaction [symbol]C([symbol]Li, t)[symbol]O / F. Hammache et al. -- SPIRAL2 at GANIL: a world of leading ISOL facility for the physics of exotic nuclei / S. Gales -- Magnetic properties of light neutron-rich nuclei and shell evolution / T. Suzuki, T. Otsuka -- Multiple scattering effects in elastic and quasi free proton scattering from halo nuclei / R. Crespo et al. -- The dipole response of neutron halos and skins / T. Aumann -- Giant and pygmy resonances within axially-symmetric-deformed QRPA with the Gogny force / S. Péru, H. Goutte -- Soft K[symbol] = O+ modes unique to deformed neutron-rich unstable nuclei / K. Yoshida et al. -- Synthesis, decay properties, and identification of superheavy nuclei produced in [symbol]Ca-induced reactions / Yu. Ts. Oganessian et al. -- Highlights of the Brazilian RIB facility and its first results and hindrance of fusion cross section induced by [symbol]He / P. R. S. Gomes et al. -- Search for long fission times of super-heavy elements with Z = 114 / M. Morjean et al. -- Microscopic dynamics of shape coexistence phenomena around [symbol]Se and [symbol]Kr / N. Hinohara et al. -- [symbol]-cluster states and 4[symbol]-particle condensation in [symbol]O / Y. Funaki et al. -- Evolution of the N = 28 shell closure far from stability / O. Sorlin et al. -- Continuum QRPA approach and the surface di-neutron modes in nuclei near the neutron drip-line / M. Matsuo et al. -- Deformed relativistic Hartree-Bogoliubov model for exotic nuclei / S. G. Zhou et al. -- Two- and three-body correlations in three-body resonances and continuum states / K. Katō, K. Ikeda -- Pion- and Rho-Meson effects in relativistic Hartree-Fock and RPA / N. V. Giai et al. -- Study of the structure of neutron rich nuclei by using [symbol]-delayed neutron and gamma emission method / Y. Ye et al. 
-- Production of secondary radioactive [symbol] Na beam for the study of [symbol]Na([symbol], p)[symbol]Mg stellar reaction / D. N. Binh et al. -- Asymmetric nuclear matter properties within the Brueckner theory / W. Zuo et al. -- Study of giant dipole resonance in continuum relativistic random phase approximation / D. Yang et al. -- Chiral bands for quasi-proton and quasi-neutron coupling with a triaxial rotor / B. Qi et al. -- Continuum properties of the Hartree-Fock mean field with finite-range interactions / H. S. Than et al. -- A study of pairing interaction in a separable form / Y. Tian et al. -- Microscopic study of the inelastic [symbol]+[symbol]C scattering / D. C. Cuong, D. T. Khoa -- Probing the high density behavior of the symmetry energy / F. Zhang et al. -- Microscopic calculations based on a Skyrme functional plus the pairing contribution / J. Li et al. -- In-medium cross sections in Dirac-Brueckner-Hartree-Fock approach / L. Peiyan et al. -- The effect of the tensor force on single-particle states and on the isotope shift / W. Zou et al. -- [symbol]Ne excited states two-proton decay / M. De Napoli et al. -- The isomeric ratio and angular momentum of fragment [symbol]Xe in photofission of heavy nuclei / T. D. Thiep et al. -- Search for correlated two-nucleon systems in [symbol]Li and [symbol]He nuclei via one-nucleon exchange reaction / N. T. Khai et al. -- Summary talk of ISPUN07 / N. Alamanos.
Reading aloud in Persian: ERP evidence for an early locus of the masked onset priming effect.
Timmer, Kalinka; Vahid-Gharavi, Narges; Schiller, Niels O
2012-07-01
The current study investigates reading aloud words in Persian, a language that does not mark all its vowels in the script. Behaviorally, a masked onset priming effect (MOPE) was revealed for transparent words, with faster speech onset latencies in the phoneme-matching condition (i.e. phonological prime and target onset overlap; e.g. [symbol: see text] /sɒːl/; 'year' [symbol: see text] /sot/; 'voice') than in the phoneme-mismatching condition (e.g. [symbol: see text] /tɒːb/ 'swing' - [symbol: see text] /sot/; 'voice'). For opaque target words (e.g. [symbol: see text] /solh/; 'peace'), no such effect was found. However, event-related potentials (ERPs) did reveal an amplitude difference between the two prime conditions in the 80-160 ms time window for transparent as well as opaque words. Only for the former did this effect continue into the 300-480 ms time window. This finding constrains the time course of the MOPE and suggests the simultaneous activation of both the non-lexical grapheme-to-phoneme and the lexical route in the dual-route cascaded (DRC) model. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
2004-01-01
Two-dimensional data matrix symbols, which contain encoded letters and numbers, are permanently etched on items for identification. They can store up to 100 times more information than traditional bar codes. While the symbols provide several advantages over bar codes, once they are covered by paint they can no longer be read by optical scanners. Since most products are painted eventually, this presents a problem for industries relying on the symbols for identification and tracking. In 1987, NASA's Marshall Space Flight Center began studying direct parts marking with matrix symbols in order to track millions of Space Shuttle parts. Advances in the technology proved that by incorporating magnetic properties into the paints, inks, and pastes used to apply the matrix symbols, the codes could be read by a magnetic scanner even after being covered with paint or other coatings. NASA received a patent for such a scanner in 1998, but the system it used for development was not portable and was too costly. A prototype was needed as a lead-in to a production model. In the summer of 2000, NASA began seeking companies to build a hand-held scanner that would detect the Read Through Paint data matrix identification marks containing magnetic materials through coatings.
Compositional schedulability analysis of real-time actor-based systems.
Jaghoori, Mohammad Mahdi; de Boer, Frank; Longuet, Delphine; Chothia, Tom; Sirjani, Marjan
2017-01-01
We present an extension of the actor model with real-time, including deadlines associated with messages, and explicit application-level scheduling policies, e.g., "earliest deadline first", which can be associated with individual actors. Schedulability analysis in this setting amounts to checking whether, given a scheduling policy for each actor, every task is processed within its designated deadline. To check schedulability, we introduce a compositional automata-theoretic approach, based on maximal use of model checking combined with testing. Behavioral interfaces define what an actor expects from the environment, and the deadlines for messages given these assumptions. We use model checking to verify that actors match their behavioral interfaces. We extend timed automata refinement with the notion of deadlines and use it to define compatibility of actor environments with the behavioral interfaces. Model checking of compatibility is computationally hard, so we propose a special testing process. We show that the analyses are decidable and automate the process using the Uppaal model checker.
Influence of I-ching (Yijing, or The Book Of Changes) on Chinese medicine, philosophy and science.
Lu, Dominic P
2013-01-01
I-Ching or Yi-Jing ([see text] also known as The Book of Changes) is the earliest classic in China. It simply explained the formation of the universe and the relationship of man to the universe. Most, if not all, branches of knowledge, including traditional Chinese medicine, can trace their origin back to this Book, in which Fu Shi ([see text] 2852 B.C.) theorized how the universe was formed, through his keen observation of the environment and the orbits of the sun, moon and stars. He used symbols to represent his views. The essence of I-Ching is basically the expression and function of Yang, symbolized as "--" (from <---->), and Yin, symbolized as "- -" (from --><--), and [see text] Yin and Yang as the interaction and circulation of Yang and Yin. Both Yin and Yang were derived from the same origin, Tai-Chi. Fu Shi believed Yin and Yang were the two opposing background forces and energies that make the universe what it is. Yang and Yin manifest in a great variety of phenomena such as mind and body, masculine and feminine, sun and moon, hot and cold, heaven and earth, positive and negative electricity, etc. The entire theory of Chinese medicine is based on the theories of Yin and Yang as well as that of the 5 Element Cycles, which are also related to the orderly arrangement of the 8 trigrams ([see text]) by King Wen ([see text] 1099-1050 B.C.). The 5 Elements Theory explains the "check and balance" mechanism created by the background force of Yin and Yang Qi and illustrates the relationships that are either strengthened or weakened by "acting and controlling" among the 5 elements. I-Ching has exerted profound influence on some well-known European philosophers and scientists, notably Leibnitz and Hegel. Between I-Ching and modern cosmology and the physics of sub-atomic particles, there are some basic theories in common.
ERIC Educational Resources Information Center
Trimboli, Angela
This document argues that the Statue of Liberty has a lot to offer teachers who need to teach citizenship to elementary students. Among the symbols within the statue that have relevance to citizenship are: (1) the tablet; (2) the chains; (3) the step from the chains; (4) the torch; (5) the crown; (6) the face; and (7) the new infrastructure. The…
Enhancement of the Daytime MODIS Based Aircraft Icing Potential Algorithm Using Mesoscale Model Data
2006-03-01
[Front-matter excerpt] Figures 25 and 26: ROC curves using 3-hour PIREPs and the Alexander Tmap, with symbols plotted at the 0.5 threshold values. Table 4: results using T icing potential values from the Alexander Tmap and 3-hour PIREPs.
Creative brains: designing in the real world†
Goel, Vinod
2014-01-01
The process of designing artifacts is a creative activity. It is proposed that, at the cognitive level, one key to understanding design creativity is to understand the array of symbol systems designers utilize. These symbol systems range from being vague, imprecise, abstract, ambiguous, and indeterminate (like conceptual sketches), to being very precise, concrete, unambiguous, and determinate (like contract documents). The former types of symbol systems support associative processes that facilitate lateral (or divergent) transformations that broaden the problem space, while the latter types of symbol systems support inference processes facilitating vertical (or convergent) transformations that deepen the problem space. The process of artifact design requires the judicious application of both lateral and vertical transformations. This leads to a dual mechanism model of design problem-solving comprising an associative engine and an inference engine. It is further claimed that this dual mechanism model is supported by an interesting hemispheric dissociation in human prefrontal cortex. The associative engine and neural structures that support imprecise, ambiguous, abstract, indeterminate representations are lateralized in the right prefrontal cortex, while the inference engine and neural structures that support precise, unambiguous, determinate representations are lateralized in the left prefrontal cortex. At the brain level, successful design of artifacts requires a delicate balance between the two hemispheres of prefrontal cortex. PMID:24817846
ANSYS duplicate finite-element checker routine
NASA Technical Reports Server (NTRS)
Ortega, R.
1995-01-01
An ANSYS finite-element code routine to check for duplicated elements within the volume of a three-dimensional (3D) finite-element mesh was developed. The routine developed is used for checking floating elements within a mesh, identically duplicated elements, and intersecting elements with a common face. A space shuttle main engine alternate turbopump development high pressure oxidizer turbopump finite-element model check using the developed subroutine is discussed. Finally, recommendations are provided for duplicate element checking of 3D finite-element models.
Application of symbolic computations to the constitutive modeling of structural materials
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Tan, H. Q.; Dong, X.
1990-01-01
In applications involving elevated temperatures, the derivation of mathematical expressions (constitutive equations) describing the material behavior can be quite time consuming, involved and error-prone. Therefore intelligent application of symbolic systems to facilitate this tedious process can be of significant benefit. Presented here is a problem-oriented, self-contained symbolic expert system, named SDICE, which is capable of efficiently deriving potential-based constitutive models in analytical form. This package, running under DOE MACSYMA, has the following features: (1) potential differentiation (chain rule); (2) tensor computations (utilizing index notation), both algebraic and calculus-based; (3) efficient solution of sparse systems of equations; (4) automatic expression substitution and simplification; (5) back substitution of invariant and tensorial relations; (6) the ability to form the Jacobian and Hessian matrices; and (7) a relational database. Limited aspects of invariant theory were also incorporated into SDICE due to the utilization of potentials as a starting point and the desire for these potentials to be frame invariant (objective). The uniqueness of SDICE resides in its ability to manipulate expressions in a general yet pre-defined order and simplify expressions so as to limit expression growth. Results are displayed, when applicable, utilizing index notation. SDICE was designed to aid and complement the human constitutive model developer. A number of examples are utilized to illustrate the various features contained within SDICE. It is expected that this symbolic package can and will provide a significant incentive to the development of new constitutive theories.
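The core potential-differentiation feature, stress as the gradient of a potential and tangent stiffness as its Hessian, can be illustrated in a modern symbolic system. This sketch uses SymPy and a toy plane-strain linear-elastic potential, not SDICE's MACSYMA implementation or its viscoplastic potentials:

```python
import sympy as sp

# Components of a small 2x2 symmetric strain tensor, and Lame constants
e11, e22, e12 = sp.symbols('e11 e22 e12')
lam, mu = sp.symbols('lambda mu')

# Quadratic (linear-elastic) strain-energy potential W(eps)
tr = e11 + e22
W = sp.Rational(1, 2) * lam * tr**2 + mu * (e11**2 + e22**2 + 2 * e12**2)

# Stress as the potential's gradient: sigma_ij = dW/de_ij
s11 = sp.simplify(sp.diff(W, e11))       # lam*(e11+e22) + 2*mu*e11
s22 = sp.simplify(sp.diff(W, e22))
s12 = sp.simplify(sp.diff(W, e12) / 2)   # /2: e12 stands for two symmetric entries

# Material tangent stiffness as the Hessian of W
C = sp.hessian(W, (e11, e22, e12))
```

SDICE additionally handles index-notation tensors, sparse solves, and frame-invariance bookkeeping that this fragment omits.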
VEST: Abstract Vector Calculus Simplification in Mathematica
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Squire, J. Burby and H. Qin
2013-03-12
We present a new package, VEST (Vector Einstein Summation Tools), that performs abstract vector calculus computations in Mathematica. Through the use of index notation, VEST is able to reduce scalar and vector expressions of a very general type using a systematic canonicalization procedure. In addition, utilizing properties of the Levi-Civita symbol, the program can derive types of multi-term vector identities that are not recognized by canonicalization, subsequently applying these to simplify large expressions. In a companion paper [1], we employ VEST in the automation of the calculation of Lagrangians for the single particle guiding center system in plasma physics, a computation which illustrates its ability to handle very large expressions. VEST has been designed to be simple and intuitive to use, both for basic checking of work and more involved computations.
Understanding student use of mathematics in IPLS with the Math Epistemic Games Survey
NASA Astrophysics Data System (ADS)
Eichenlaub, Mark; Hemingway, Deborah; Redish, Edward F.
2017-01-01
We present the Math Epistemic Games Survey (MEGS), a new concept inventory on the use of mathematics in introductory physics for the life sciences. The survey asks questions that are often best answered via techniques commonly valued in physics instruction, including dimensional analysis, checking special or extreme cases, understanding scaling relationships, interpreting graphical representations, estimation, and mapping symbols onto physical meaning. MEGS questions are often rooted in quantitative biology. We present preliminary data on the validation and administration of the MEGS in a large, introductory physics for the life sciences course at the University of Maryland, as well as preliminary results on the clustering of questions and responses as a guide to student resource activation in problem solving. This material is based upon work supported by the US National Science Foundation under Award No. 15-04366.
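A dimensional-analysis check of the kind the survey probes can be sketched by tracking (mass, length, time) exponent tuples; this is a minimal illustration of the technique, not an MEGS item:

```python
def dim_mul(a, b):
    """Multiply two quantities: add their dimension exponents."""
    return tuple(x + y for x, y in zip(a, b))

def dim_pow(a, n):
    """Raise a quantity to an integer power: scale its exponents."""
    return tuple(x * n for x in a)

# Dimensions as (mass, length, time) exponent tuples
MASS, LENGTH, TIME = (1, 0, 0), (0, 1, 0), (0, 0, 1)
VELOCITY = dim_mul(LENGTH, dim_pow(TIME, -1))

# Check that kinetic energy (1/2) m v^2 has the dimensions of work F*d
kinetic = dim_mul(MASS, dim_pow(VELOCITY, 2))
force = dim_mul(MASS, dim_mul(LENGTH, dim_pow(TIME, -2)))
work = dim_mul(force, LENGTH)
assert kinetic == work == (1, 2, -2)   # both are M L^2 T^-2
```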
VEST: Abstract vector calculus simplification in Mathematica
NASA Astrophysics Data System (ADS)
Squire, J.; Burby, J.; Qin, H.
2014-01-01
We present a new package, VEST (Vector Einstein Summation Tools), that performs abstract vector calculus computations in Mathematica. Through the use of index notation, VEST is able to reduce three-dimensional scalar and vector expressions of a very general type to a well defined standard form. In addition, utilizing properties of the Levi-Civita symbol, the program can derive types of multi-term vector identities that are not recognized by reduction, subsequently applying these to simplify large expressions. In a companion paper Burby et al. (2013) [12], we employ VEST in the automation of the calculation of high-order Lagrangians for the single particle guiding center system in plasma physics, a computation which illustrates its ability to handle very large expressions. VEST has been designed to be simple and intuitive to use, both for basic checking of work and more involved computations.
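The multi-term identities VEST derives rest on contractions of the Levi-Civita symbol, such as eps_ijk eps_ilm = d_jl d_km - d_jm d_kl. That identity can be verified numerically (illustrative only; VEST itself works symbolically in Mathematica):

```python
import itertools

def eps(i, j, k):
    """Levi-Civita symbol for indices in {0, 1, 2}."""
    return ((i - j) * (j - k) * (k - i)) // 2

def delta(a, b):
    """Kronecker delta."""
    return 1 if a == b else 0

# Verify the contraction identity over all free-index combinations
for j, k, l, m in itertools.product(range(3), repeat=4):
    lhs = sum(eps(i, j, k) * eps(i, l, m) for i in range(3))
    rhs = delta(j, l) * delta(k, m) - delta(j, m) * delta(k, l)
    assert lhs == rhs
```

Identities like this are what let the package rewrite cross products of cross products into dot-product form.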
Pharmacist and Technician Perceptions of Tech-Check-Tech in Community Pharmacy Practice Settings.
Frost, Timothy P; Adams, Alex J
2018-04-01
Tech-check-tech (TCT) is a practice model in which pharmacy technicians with advanced training can perform final verification of prescriptions that have been previously reviewed for appropriateness by a pharmacist. Few states have adopted TCT, in part because of the common view that this model is controversial among members of the profession. This article aims to summarize the existing research on pharmacist and technician perceptions of community pharmacy-based TCT. A literature review was conducted using MEDLINE (January 1990 to August 2016) and Google Scholar (January 1990 to August 2016) using the terms "tech* and check," "tech-check-tech," "checking technician," and "accuracy checking tech*." Across the 7 studies identified, we found general agreement among both pharmacists and technicians that TCT in community pharmacy settings can be safely performed. This agreement persisted in studies of theoretical TCT models and in studies assessing participants in actual community-based TCT models. Pharmacists who had previously worked with a checking technician were generally more favorable toward TCT. Both pharmacists and technicians in community pharmacy settings generally perceived TCT to be safe, in both theoretical surveys and in surveys following actual TCT demonstration projects. These perceptions of safety align well with the actual outcomes achieved in community pharmacy TCT studies.
NASA Technical Reports Server (NTRS)
Arnold, S. M.; Saleeb, A. F.; Tan, H. Q.; Zhang, Y.
1993-01-01
The issue of developing effective and robust schemes to implement a class of the Ogden-type hyperelastic constitutive models is addressed. To this end, special purpose functions (running under MACSYMA) are developed for the symbolic derivation, evaluation, and automatic FORTRAN code generation of explicit expressions for the corresponding stress function and material tangent stiffness tensors. These explicit forms are valid over the entire deformation range, since the singularities resulting from repeated principal-stretch values have been theoretically removed. The required computational algorithms are outlined, and the resulting FORTRAN computer code is presented.
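The derive-then-generate workflow described above can be illustrated with a one-term Ogden strain-energy function in SymPy (our own simplified sketch: principal Kirchhoff stress tau_i = lambda_i dW/dlambda_i with incompressibility terms omitted; the paper's special-purpose functions run under MACSYMA and also remove the repeated-stretch singularities, which this fragment does not address):

```python
import sympy as sp
from sympy import fcode

# Principal stretches and one-term Ogden material parameters
l1, l2, l3, mu, alpha = sp.symbols('l1 l2 l3 mu alpha', positive=True)

# One-term Ogden strain-energy function (incompressibility terms omitted)
W = (mu / alpha) * (l1**alpha + l2**alpha + l3**alpha - 3)

# Principal Kirchhoff stress from the potential: tau_1 = l1 * dW/dl1
t1 = sp.simplify(l1 * sp.diff(W, l1))   # reduces to mu * l1**alpha

# Emit explicit Fortran for the derived expression
src = fcode(t1, assign_to='t1', standard=95)
```

The same pattern (differentiate the potential, simplify, print Fortran) extends to the material tangent stiffness via second derivatives.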
Performance of concatenated Reed-Solomon/Viterbi channel coding
NASA Technical Reports Server (NTRS)
Divsalar, D.; Yuen, J. H.
1982-01-01
The concatenated Reed-Solomon (RS)/Viterbi coding system is reviewed. The performance of the system is analyzed and results are derived with a new simple approach. A functional model for the input RS symbol error probability is presented. Based on this new functional model, we compute the performance of a concatenated system in terms of RS word error probability, output RS symbol error probability, bit error probability due to decoding failure, and bit error probability due to decoding error. Finally we analyze the effects of the noisy carrier reference and the slow fading on the system performance.
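Under the textbook assumptions of independent symbol errors and bounded-distance decoding, the RS word error probability is a binomial tail in the input symbol error probability; this is a generic sketch, not the paper's full functional model (which also separates decoding failures from decoding errors):

```python
from math import comb

def rs_word_error_prob(n, t, ps):
    """Probability that an (n, k) RS word is not decoded correctly,
    assuming independent symbol errors with probability ps and a
    bounded-distance decoder that fails once more than
    t = (n - k) // 2 symbols are in error."""
    return sum(comb(n, i) * ps**i * (1.0 - ps)**(n - i)
               for i in range(t + 1, n + 1))

# The (255, 223) RS code used in deep-space concatenated systems
# corrects up to t = 16 symbol errors per word
pw = rs_word_error_prob(255, 16, 1e-2)
```

In the concatenated system, ps would itself be the Viterbi decoder's output symbol error probability, and the interleaver is what makes the independence assumption reasonable.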
Advanced compilation techniques in the PARADIGM compiler for distributed-memory multicomputers
NASA Technical Reports Server (NTRS)
Su, Ernesto; Lain, Antonio; Ramaswamy, Shankar; Palermo, Daniel J.; Hodges, Eugene W., IV; Banerjee, Prithviraj
1995-01-01
The PARADIGM compiler project provides an automated means to parallelize programs, written in a serial programming model, for efficient execution on distributed-memory multicomputers. A previous implementation of the compiler, based on the PTD representation, allowed symbolic array sizes, affine loop bounds and array subscripts, and a variable number of processors, provided that arrays were single- or multi-dimensionally block distributed. The techniques presented here extend the compiler to also accept multidimensional cyclic and block-cyclic distributions within a uniform symbolic framework. These extensions demand more sophisticated symbolic manipulation capabilities. A novel aspect of our approach is to meet this demand by interfacing PARADIGM with a powerful off-the-shelf symbolic package, Mathematica. This paper describes some of the Mathematica routines that perform various transformations, shows how they are invoked and used by the compiler to overcome the new challenges, and presents experimental results for code involving cyclic and block-cyclic arrays as evidence of the feasibility of the approach.
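The index arithmetic such a compiler must manipulate symbolically can be sketched numerically for the block-cyclic case (helper names are ours; PARADIGM performs the equivalent reasoning on symbolic sizes via Mathematica):

```python
def block_cyclic_owner(i, b, p):
    """Owning process of global index i under a block-cyclic
    distribution with block size b over p processes."""
    return (i // b) % p

def global_to_local(i, b, p):
    """Local index of global element i on its owning process."""
    return (i // (p * b)) * b + i % b

# 8 elements, block size 2, 2 processes:
# global  0 1 | 2 3 | 4 5 | 6 7
# owner   0 0   1 1   0 0   1 1
owners = [block_cyclic_owner(i, 2, 2) for i in range(8)]
```

Block distribution and pure cyclic distribution are the special cases b = ceil(n/p) and b = 1 of the same formulas, which is why a uniform symbolic framework over b is attractive.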
Schlosser, Ralf W; Koul, Rajinder; Shane, Howard; Sorce, James; Brock, Kristofer; Harmon, Ashley; Moerlein, Dorothy; Hearn, Emilia
2014-10-01
The effects of animation on naming and identification of graphic symbols for verbs and prepositions were studied in 2 graphic symbol sets in preschoolers. Using a 2 × 2 × 2 × 3 completely randomized block design, preschoolers across three age groups were randomly assigned to combinations of symbol set (Autism Language Program [ALP] Animated Graphics or Picture Communication Symbols [PCS]), symbol format (animated or static), and word class (verbs or prepositions). Children were asked to name symbols and to identify a target symbol from an array given the spoken label. Animated symbols were more readily named than static symbols, although this was more pronounced for verbs than for prepositions. ALP symbols were named more accurately than PCS in particular with prepositions. Animation did not facilitate identification. ALP symbols for prepositions were identified better than PCS, but there was no difference for verbs. Finally, older children guessed and identified symbols more effectively than younger children. Animation improves the naming of graphic symbols for verbs. For prepositions, ALP symbols are named more accurately and are more readily identifiable than PCS. Naming and identifying symbols are learned skills that develop over time. Limitations and future research directions are discussed.
A human performance evaluation of graphic symbol-design features.
Samet, M G; Geiselman, R E; Landee, B M
1982-06-01
Sixteen subjects learned each of two tactical display symbol sets (conventional symbols and iconic symbols) in turn and were then shown a series of graphic displays containing various symbol configurations. For each display, the subject was asked questions corresponding to different behavioral processes relating to symbol use (identification, search, comparison, pattern recognition). The results indicated that (a) conventional symbols yielded faster pattern-recognition performance than iconic symbols, and iconic symbols did not yield faster identification than conventional symbols; and (b) the portrayal of additional feature information (through the use of perimeter density or vector projection coding) slowed processing of the core symbol information in four tasks, but certain symbol-design features created less perceptual interference and had greater correspondence with the portrayal of specific tactical concepts than others. The results are discussed in terms of the complexities involved in the selection of symbol-design features for use in graphic tactical displays.
System and method for forward error correction
NASA Technical Reports Server (NTRS)
Cole, Robert M. (Inventor); Bishop, James E. (Inventor)
2006-01-01
A system and method are provided for transferring a packet across a data link. The packet may include a stream of data symbols which is delimited by one or more framing symbols. Corruptions of the framing symbol which result in valid data symbols may be mapped to invalid symbols. If it is desired to transfer one of the valid data symbols that has been mapped to an invalid symbol, the data symbol may be replaced with an unused symbol. At the receiving end, these unused symbols are replaced with the corresponding valid data symbols. The data stream of the packet may be encoded with forward error correction information to detect and correct errors in the data stream.
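The substitution step can be sketched as follows. All symbol values below are hypothetical, chosen only for illustration; the patent does not specify a particular alphabet:

```python
# Hypothetical 8-bit symbol alphabet: FRAME delimits packets, and any
# data symbol that a corrupted FRAME could alias to is never sent raw;
# it is substituted with an otherwise-unused code symbol on transmit
# and restored on receive.  The receiver can then treat any ALIASED
# symbol it sees on the wire as a corrupted framing symbol.
FRAME   = 0x7E   # framing symbol (assumed value)
ALIASED = 0x7D   # data symbol reachable from FRAME by a plausible corruption
UNUSED  = 0x5D   # code symbol carrying no data of its own (assumed value)

def encode(data):
    """Replace the aliased data symbol with the unused code symbol."""
    return [UNUSED if s == ALIASED else s for s in data]

def decode(symbols):
    """Restore the original data symbol at the receiving end."""
    return [ALIASED if s == UNUSED else s for s in symbols]
```

The round trip is lossless because UNUSED carries no data of its own, so the mapping is invertible.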
System and method for transferring data on a data link
NASA Technical Reports Server (NTRS)
Cole, Robert M. (Inventor); Bishop, James E. (Inventor)
2007-01-01
A system and method are provided for transferring a packet across a data link. The packet may include a stream of data symbols which is delimited by one or more framing symbols. Corruptions of the framing symbol which result in valid data symbols may be mapped to invalid symbols. If it is desired to transfer one of the valid data symbols that has been mapped to an invalid symbol, the data symbol may be replaced with an unused symbol. At the receiving end, these unused symbols are replaced with the corresponding valid data symbols. The data stream of the packet may be encoded with forward error correction information to detect and correct errors in the data stream.
Solid State Ionics Advanced Materials for Emerging Technologies
NASA Astrophysics Data System (ADS)
Chowdari, B. V. R.; Careem, M. A.; Dissanayake, M. A. K. L.; Rajapakse, R. M. G.; Seneviratne, V. A.
2006-06-01
Keynote lecture. Challenges and opportunities of solid state ionic devices / W. Weppner -- pt. I. Ionically conducting inorganic solids. Invited papers. Multinuclear NMR studies of mass transport of phosphoric acid in water / J. R. P. Jayakody ... [et al.]. Crystalline glassy and polymeric electrolytes: similarities and differences in ionic transport mechanisms / J.-L. Souquet. 30 years of NMR/NQR experiments in solid electrolytes / D. Brinkmann. Analysis of conductivity and NMR measurements in Li[symbol]La[symbol]TiO[symbol] fast Li[symbol] ionic conductor: evidence for correlated Li[symbol] motion / O. Bohnké ... [et al.]. Transport pathways for ions in disordered solids from bond valence mismatch landscapes / S. Adams. Proton conductivity in condensed phases of water: implications on linear and ball lightning / K. Tennakone -- Contributed papers. Proton transport in nanocrystalline bioceramic materials: an investigative study of synthetic bone with that of natural bone / H. Jena, B. Rambabu. Synthesis and properties of the nanostructured fast ionic conductor Li[symbol]La[symbol]TiO[symbol] / Q. N. Pham ... [et al.]. Hydrogen production: ceramic materials for high temperature water electrolysis / A. Hammou. Influence of the sintering temperature on pH sensor ability of Li[symbol]La[symbol]TiO[symbol]. Relationship between potentiometric and impedance spectroscopy measurements / Q. N. Pham ... [et al.]. Microstructure characterization and ionic conductivity of nano-sized CeO[symbol]-Sm[symbol]O[symbol] system (x=0.05 - 0.2) prepared by combustion route / K. Singh, S. A. Acharya, S. S. Bhoga. Red soil in Northern Sri Lanka is a natural magnetic ceramic / K. Ahilan ... [et al.]. Neutron scattering of LiNiO[symbol] / K. Basar ... [et al.]. Preparation and properties of LiFePO[symbol] nanorods / L. Q. Mai ... [et al.]. Structural and electrochemical properties of monoclinic and orthorhombic MoO[symbol] phases / O. M. Hussain ... [et al.]. 
Preparation of Zircon (ZrSiO[symbol]) ceramics via solid state sintering of ZrO[symbol] and SiO[symbol] and the effect of dopants on the zircon yield / U. Dhanayake, B. S. B. Karunaratne. Preparation and properties of vanadium doped ZnTe cermet thin films / M. S. Hossain, R. Islam, K. A. Khan. Dynamical properties and electronic structure of lithium-ion conductor / M. Kobayashi ... [et al.]. Cuprous ion conducting Montmorillonite-Polypyrrole nanocomposites / D. M. M. Krishantha ... [et al.]. Frequency dependence of conductivity studies on a newly synthesized superionic solid solution/mixed system: [0.75AgI: 0.25AgCl] / R. K. Nagarch, R. Kumar. Diffuse X-ray and neutron scattering from Powder PbS / X. Lian ... [et al.]. Electron affinity and work function of Pyrolytic MnO[symbol] thin films prepared from Mn(C[symbol]H[symbol]O[symbol])[symbol].4H[symbol]) / A. K. M. Farid Ul Islam, R. Islam, K. A. Khan. Crystal structure and heat capacity of Ba[symbol]Ca[symbol]Nb[symbol]O[symbol] / T. Shimoyama ... [et al.]. XPS and impedance investigations on amorphous vanadium oxide thin films / M. Kamalanathan ... [et al.]. Sintering and mixed electronic-ionic conducting properties of La[symbol]Sr[symbol]NiO[symbol] derived from a polyaminocarboxylate complex precursor / D.-P. Huang ... [et al.]. Preparation and characteristics of ball milled MgH[symbol] + M (M= Fe, VF[symbol] and FeF[symbol]) nanocomposites for hydrogen storage / N. W. B. Balasooriya, Ch. Poinsignon. Structural studies of oxysulfide glasses by X-ray diffraction and molecular dynamics simulation / R. Prasada Rao, M. Seshasayee, J. Dheepa. Synthesis, sintering and oxygen ionic conducting properties of Bi[symbol]V[symbol]Cu[symbol]O[symbol] / F. Zhang ... [et al.]. Synthesis and transport characteristics of PbI[symbol]-Ag[symbol]O-Cr[symbol]O[symbol] superionic system / S. A. Suthanthiraraj, V. Mathew. Electronic conductivity of La[symbol]Sr[symbol]Ga[symbol]Mg[symbol]Co[symbol]O[symbol] electrolytes / K. 
Yamaji ... [et al.] -- pt. II. Electrode materials. Invited papers. Cathodic properties of Al-doped LiCoO[symbol] prepared by molten salt method for Li-ion batteries / M. V. Reddy, G. V. Subba Rao, B. V. R. Chowdari. Layered ion-electron conducting materials / M. A. Santa Ana, E. Benavente, G. González. LiNi[symbol]Co[symbol]O[symbol] cathode thin-film prepared by RF sputtering for all-solid-state rechargeable microbatteries / X. J. Zhu ... [et al.] -- Contributed papers. Nanocomposite cathode for SOFCs prepared by electrostatic spray deposition / A. Princivalle, E. Djurado. Effect of the addition of nanoporous carbon black on the cycling characteristics of Li[symbol]Co[symbol](MoO[symbol])[symbol] for lithium batteries / K. M. Begam, S. R. S. Prabaharan. Protonic conduction in TiP[symbol]O[symbol] / V. Nalini, T. Norby, A. M. Anuradha. Preparation and electrochemical LiMn[symbol]O[symbol] thin film by a solution deposition method / X. Y. Gan ... [et al.]. Synthesis and characterization of LiMPO[symbol] (M = Ni, Co) / T. Savitha, S. Selvasekarapandian, C. S. Ramya. Synthesis and electrical characterization of LiCoO[symbol], LiFeO[symbol] and NiO compositions / A. Wijayasinghe, B. Bergman. Natural Sri Lanka graphite as conducting enhancer in manganese dioxide (Emd type) cathode of alkaline batteries / N. W. B. Balasooriya ... [et al.]. Electrochemical properties of LiNi[symbol]Al[symbol]Zn[symbol]O[symbol] cathode material synthesized by emulsion method / B.-H. Kim ... [et al.]. LiNi[symbol]Co[symbol]O[symbol] cathode materials synthesized by particulate sol-gel method for lithium ion batteries / X. J. Zhu ... [et al.]. Pulsed laser deposition of highly oriented LiCoO[symbol] and LiMn[symbol]O[symbol] thin films for microbattery applications / O. M. Hussain ... [et al.]. Preparation of LiNi[symbol]Co[symbol]O[symbol] thin films by a sol-gel method / X. J. Zhu ... [et al.]. 
Electrochemical lithium insertion into a manganese dioxide electrode in aqueous solutions / M. Minakshi ... [et al.]. AC impedance spectroscopic analysis of thin film LiNiVO[symbol] prepared by pulsed laser deposition technique / S. Selvasekarapandian ... [et al.]. Synthesis and characterization of LiFePO[symbol] cathode materials by microwave processing / J. Zhou ... [et al.]. Characterization of Nd[symbol]Sr[symbol]CoO[symbol] including Pt second phase as the cathode material for low-temperature SOFCs / J. W. Choi ... [et al.]. Thermodynamic behavior of lithium intercalation into natural vein and synthetic graphite / N. W. B. Balasooriya, P. W. S. K. Bandaranayake, Ph. Touzain -- pt. III. Electroactive polymers. Invited papers. Organised or disorganised? looking at polymer electrolytes from both points of view / Y.-P. Liao ... [et al.]. Polymer electrolytes - simple low permittivity solutions? / I. Albinsson, B.-E. Mellander. Dependence of conductivity enhancement on the dielectric constant of the dispersoid in polymer-ferroelectric composite electrolytes / A. Chandra, P. K. Singh, S. Chandra. Design and application of boron compounds for high-performance polymer electrolytes / T. Fujinami. Structural, vibrational and AC impedance analysis of nano composite polymer electrolytes based on PVAC / S. Selvasekarapandian ... [et al.]. Absorption intensity variation with ion association in PEO based electrolytes / J. E. Furneaux ... [et al.]. Study of ion-polymer interactions in cationic and anionic ionomers from the dependence of conductivity on pressure and temperature / M. Duclot ... [et al.]. Triol based polyurethane gel electrolytes for electrochemical devices / A. R. Kulkarni. Contributed papers. Accurate conductivity measurements to solvation energies in Nafion / M. Maréchal, J.-L. Souquet. Ion conducting behaviour of composite polymer gel electrolyte: PEG-PVA-(NH[symbol]CH[symbol]CO[symbol])[symbol] system / S. L. Agrawal, A. Awadhia, S. K. Patel. 
Impedance spectroscopy and DSC studies of poly(vinyl alcohol)/silicotungstic acid crosslinked composite membranes / A. Anis, A. K. Banthia. (PEO)[symbol]:Na[symbol]P[symbol]O[symbol]: a report on complex formation / A. Bhide, K. Hariharan. Experimental studies on (PVC+LiClO[symbol]+DMP) polymer electrolyte systems for lithium battery / Ch. V. S. Reddy. Stability of the gel electrolyte, PAN: EC: PC: LiCF[symbol]SO[symbol] towards lithium / K. Perera ... [et al.]. Montmorillonite as a conductivity enhancer in (PEO)[symbol]LiCF[symbol]SO[symbol] polymer electrolyte / C. H. Manoratne ... [et al.]. Polymeric gel electrolytes for electrochemical capacitors / M. Morita ... [et al.]. Electrical conductivity studies on proton conducting polymer electrolytes based on poly(vinyl acetate) / D. Arun Kumar ... [et al.]. Conductivity and thermal studies on plasticized PEO:LiTf-Al[symbol]O[symbol] composite polymer electrolyte / H. M. J. C. Pitawala, M. A. K. L. Dissanayake, V. A. Seneviratne. Investigation of transport properties of a new biomaterial - gum mangosteen / S. S. Pradhan, A. Sarkar. Investigation of ionic conductivity of PEO-MgCl[symbol] based solid polymer electrolyte / M. Sundar ... [et al.]. [symbol]H NMR and Raman analysis of proton conducting polymer electrolytes based on partially hydrolyzed poly(vinyl alcohol) / G. Hirankumar ... [et al.]. Influence of Al[symbol]O[symbol] nanoparticles on the phase matrix of polyethylene oxide-silver triflate polymer electrolytes / S. Austin Suthanthiraraj, D. Joice Sheeba. Effect of different types of ceramic fillers on thermal, dielectric and transport properties of PEO[symbol]LiTf solid polymer electrolyte / K. Vignarooban ... [et al.]. Characterization of PVP based solid polymer electrolytes using spectroscopic techniques / C. S. Ramya ... [et al.]. Electrochemical and structural properties of poly vinylidene fluoride - silver triflate solid polymer electrolyte system / S. Austin Suthanthiraraj, B. Joseph Paul. 
Micro Raman, Li NMR and AC impedance analysis of PVAC:LiClO[symbol] solid polymer electrolytes / R. Baskaran ... [et al.]. Study of Na+ ion conduction in PVA-NaSCN solid polymer electrolytes / G. M. Brahmanandhan ... [et al.]. Effect of filler addition on plasticized polymer electrolyte systems / M. Sundar, S. Selladurai. Ionic motion in PEDOT and PPy conducting polymer bilayers / U. L. Zainudeen, S. Skaarup, M. A. Careem. Film formation mechanism and electrochemical characterization of V[symbol]O[symbol] xerogel intercalated by polyaniline / Q. Zhu ... [et al.]. Effect of NH[symbol]NO[symbol] concentration on the conductivity of PVA based solid polymer electrolyte / M. Hema ... [et al.]. Dielectric and conductivity studies of PVA-KSCN based solid polymer electrolytes / J. Malathi ... [et al.] -- pt. IV. Emerging applications. Invited papers. The use of solid state ionic materials and devices in medical applications / R. Linford. Development of all-solid-state lithium batteries / V. Thangadurai, J. Schwenzei, W. Weppner. Reversible intermediate temperature solid oxide fuel cells / B.-E. Mellander, I. Albinsson. Nano-size effects in lithium batteries / P. Balaya, Y. Hu, J. Maier. Electrochromics: fundamentals and applications / C. G. Granqvist. Electrochemical CO[symbol] gas sensor / K. Singh. Polypyrrole for artificial muscles: ionic mechanisms / S. Skaarup. Development and characterization of polyfluorene based light emitting diodes and their colour tuning using Forster resonance energy transfer / P. C. Mattur ... [et al.]. Mesoporous and nanoparticulate metal oxides: applications in new photocatalysis / C. Boxall. Proton Conducting (PC) perovskite membranes for hydrogen separation and PC-SOFC electrodes and electrolytes / H. Jena, B. Rambabu. Contributed papers. Electroceramic materials for the development of natural gas fuelled SOFC/GT plant in developing country (Trinidad and Tobago (T&T)) / R. Saunders, H. Jena, B. Rambabu. 
Thin film SOFC supported on nano-porous substrate / J. Hoon Joo, G. M. Choi. Characterization and fabrication of silver solid state battery Ag/AGI-AgPO[symbol]/I[symbol], C / E. Kartini ... [et al.]. Performance of lithium polymer cells with polyacrylonitrile based electrolyte / K. Perera ... [et al.]. Hydrothermal synthesis and electrochemical behavior of MoO[symbol] nanobelts for lithium batteries / Y. Qi ... [et al.]. Electrochemical behaviour of a PPy (DBS)/polyacrylonitrile: LiTF:EC:PC/Li cell / K. Vidanapathirana ... [et al.]. Characteristics of thick film CO[symbol] sensors based on NASICON using Li[symbol]CO[symbol]-CaCO[symbol] auxiliary phases / H. J. Kim ... [et al.]. Solid state battery discharge characteristic study on fast silver ion conducting composite system: 0.9[0.75AgI:0.25AgCl]: 0.1TiO[symbol] / R. K. Nagarch, R. Kumar, P. Rawat. Intercalating protonic solid-state batteries with series and parallel combination / K. Singh, S. S. Bhoga, S. M. Bansod. Synthesis and characterization of ZnO fiber by microwave processing / Lin Wang ... [et al.]. Preparation of Sn-Ge alloy coated Ge nanoparticles and Sn-Si alloy coated Si nanoparticles by ball-milling / J. K. D. S. Jayanett, S. M. Heald. Synthesis of ultrafine and crystallized TiO[symbol] by alkoxide-free polymerizable precursor method / M. Vijayakumar ... [et al.]. Development and characterization of polythiophene/fullerene composite solar cells and their degradation studies / P. K. Bhatnagar ... [et al.].
Exceptional M-brane sigma models and η-symbols
NASA Astrophysics Data System (ADS)
Sakatani, Yuho; Uehara, Shozo
2018-03-01
We develop the M-brane actions proposed in Y. Sakatani and S. Uehara, arXiv:1607.04265, by using η-symbols determined in Y. Sakatani and S. Uehara, arXiv:1708.06342. Introducing η-forms that are defined with the η-symbols, we present U-duality-covariant M-brane actions which describe the known brane worldvolume theories for Mp-branes with p=0,2,5. We show that the self-duality relation known in the double sigma model is naturally generalized to M-branes. In particular, for an M5-brane, the self-duality relation is nontrivially realized, where the Hodge star operator is defined with the familiar M5-brane metric while the η-form contains the self-dual three-form field strength. The action for a Kaluza-Klein monopole is also partially reproduced. Moreover, we explain how to treat type IIB branes in our general formalism. As a demonstration, we reproduce the known action for a (p,q)-string.
Code-Time Diversity for Direct Sequence Spread Spectrum Systems
Hassan, A. Y.
2014-01-01
Time diversity is achieved in direct-sequence spread spectrum by receiving different faded, delayed copies of the transmitted symbols from different uncorrelated channel paths when the transmission signal bandwidth is greater than the coherence bandwidth of the channel. In this paper, a new time diversity scheme, called code-time diversity, is proposed for spread spectrum systems. In this scheme, N spreading codes are used to transmit one data symbol over N successive symbol intervals. The diversity order of the proposed scheme equals the number of spreading codes N multiplied by the number of uncorrelated channel paths L. The paper presents the transmitted signal model. Two demodulator structures are proposed based on the received signal models for Rayleigh flat and frequency-selective fading channels. The probability of error of the proposed diversity scheme is also calculated for the same two fading channels. Finally, simulation results are presented and compared with those of maximal ratio combining (MRC) and multiple-input multiple-output (MIMO) systems. PMID:24982925
Exploring a Model of Symbolic Social Communication: The Case of ‘Magic’ Johnson
FLORA, JUNE A.; SCHOOLER, CAROLINE; MAYS, VICKIE M.; COCHRAN, SUSAN D.
2009-01-01
We propose a model of symbolic social communication to explain the process whereby sociocultural identity mediates relationships among receivers, sources and messages to shape message effects. This exploratory study examines how two at-risk groups of African American men responded to various HIV prevention messages delivered by celebrity and professional sources. We interviewed 47 men from a homeless shelter and 50 male college students. Members of both groups were likely to select Johnson as the best person to deliver HIV prevention messages among a list of African American celebrity and professional sources. Results suggest the symbolic meanings embedded in celebrities and message topics are important and enduring influences on message effects. The images and ideas that a source represents are transferred to the advocated behavior, attitude or knowledge change and thus shape how messages are interpreted and received. Further understanding of how culture influences the effects of persuasive messages is critical for the improvement of health-communication campaigns. PMID:22011997
NASA Astrophysics Data System (ADS)
Quintero-Quiroz, C.; Sorrentino, Taciano; Torrent, M. C.; Masoller, Cristina
2016-04-01
We study the dynamics of semiconductor lasers with optical feedback and direct current modulation, operating in the regime of low-frequency fluctuations (LFFs). In the LFF regime the laser intensity displays abrupt spikes: the intensity drops to zero and then gradually recovers. We focus on the inter-spike intervals (ISIs) and use a method of symbolic time-series analysis based on computing the probabilities of symbolic patterns. We show that the variation of the symbol probabilities with the modulation frequency and with the intrinsic spike rate of the laser allows us to identify different regimes of noisy locking. Simulations of the Lang-Kobayashi model are in good qualitative agreement with the experimental observations.
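The symbolic method described can be sketched with ordinal patterns, a common choice for symbolic time-series analysis of ISI sequences (the abstract does not fix this exact scheme, so treat it as an assumption): each window of D consecutive intervals is replaced by the permutation that sorts it, and pattern probabilities are estimated from relative frequencies:

```python
from collections import Counter
from itertools import permutations

def ordinal_pattern_probs(series, D=3):
    """Estimate the probabilities of the D! ordinal (rank-order)
    patterns over all windows of D consecutive values; ties are
    broken by position because Python's sort is stable."""
    n_windows = len(series) - D + 1
    counts = Counter(
        tuple(sorted(range(D), key=lambda j: series[i + j]))
        for i in range(n_windows)
    )
    return {p: counts.get(p, 0) / n_windows for p in permutations(range(D))}
```

Changes in these probabilities as a control parameter (here, the modulation frequency) is varied are what reveal the different locking regimes.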
Design and development of an AAC app based on a speech-to-symbol technology.
Radici, Elena; Bonacina, Stefano; De Leo, Gianluca
2016-08-01
The purpose of this paper is to present the design and development of an Augmentative and Alternative Communication (AAC) app that uses a speech-to-symbol technology to model language, i.e., to recognize speech and display the text or graphic content related to it. Our app is intended to be adopted by communication partners who want to engage in interventions focused on improving communication skills. Its goal is to translate simple spoken sentences into a set of symbols that are understandable by children with complex communication needs. We moderated a focus group among six AAC communication partners and then developed a prototype. We are currently beginning to test the app in an AAC Centre in Milan, Italy.
Stewart, Terrence C; Eliasmith, Chris
2013-06-01
Quantum probability (QP) theory can be seen as a type of vector symbolic architecture (VSA): mental states are vectors storing structured information and manipulated using algebraic operations. Furthermore, the operations needed by QP match those in other VSAs. This allows existing biologically realistic neural models to be adapted to provide a mechanistic explanation of the cognitive phenomena described in the target article by Pothos & Busemeyer (P&B).
2009-09-01
problems, to better model the problem solving of computer systems. This research brought about the intertwining of AI and cognitive psychology. Much of...where symbol sequences are sequential intelligent states of the network, and must be classified as normal, abnormal, or unknown. These symbols...is associated with abnormal behavior; and abcbc is associated with unknown behavior, as it fits no known behavior. Predicted outcomes from
Baikov, P A; Chetyrkin, K G; Kühn, J H
2006-01-13
We compute, for the first time, the absorptive part of the massless correlator of two quark scalar currents in five loops. As physical applications, we consider the O(α_s^4) corrections to the decay rate of the standard model Higgs boson into quarks, as well as the constraints on the strange quark mass following from QCD sum rules.
Joy Lo, Chih-Wei; Yien, Huey-Wen; Chen, I-Ping
2016-04-01
To evaluate the effectiveness of universal health symbol usage and to analyze the factors influencing the adoption of those symbols in Taiwan. Universal symbols are an important innovative tool for health facility wayfinding systems. Hablamos Juntos, a universal healthcare symbol system developed in the United States, is a thoughtful, well-designed, and thoroughly tested symbol system that facilitates communication across languages and cultures. We designed a questionnaire to test how well the selected graphic symbols were understood by Taiwanese participants and determined factors related to successful symbol decoding, including participant-related factors, stimulation factors, and the interaction between stimulation and participants. Additionally, we further established a design principle for future development of localized healthcare symbols. (1) Eleven symbols were identified as highly comprehensible and effective symbols that can be directly adopted in Taiwanese healthcare settings. Sixteen symbols were deemed incomprehensible or confusing and thus had to be redesigned. Finally, 14 were identified as relatively incomprehensible and could thus be redesigned and then have their effectiveness evaluated again. (2) Three factors were found to influence the participants' differing levels of comprehension of the Hablamos Juntos symbols. In order to prevent the three aforementioned factors from causing difficulty in interpreting symbols, we suggest that the local symbol designers should (1) use more iconic images, (2) carefully evaluate the indexical and symbolic meaning of graphic symbols, and (3) collect the consensus of Taiwanese people with different educational backgrounds. © The Author(s) 2016.
Gebuis, Titia; Herfs, Inkeri K; Kenemans, J Leon; de Haan, Edward H F; van der Smagt, Maarten J
2009-11-01
Infants can visually detect changes in numerosity, which suggests that a (non-symbolic) numerosity system is already present early in life. This non-symbolic system is hypothesized to serve as the basis for the later acquired symbolic system. Little is known about the processes underlying the transition from the non-symbolic to the symbolic code. In the current study we investigated the development of automatization of symbolic number processing in children from second (6.0 years) and fourth grade (8.0 years) and in adults, using a symbolic and non-symbolic size congruency task and event-related potentials (ERPs) as a measure. The comparison between symbolic and non-symbolic size congruency effects (SCEs) allowed us to disentangle processes necessary to perform the task from processes specific to numerosity notation. In contrast to previous studies, second graders already revealed a behavioral symbolic SCE similar to that of adults. In addition, the behavioral SCE increased for symbolic and decreased for non-symbolic notation with increasing age. For all age groups, the ERP data showed that the two magnitudes interfered at a level before selective activation of the response system, for both notations. However, only for the second graders were distinct processes recruited to perform the symbolic size comparison task. This shift in recruited processes, observed for the symbolic task only, might reflect the functional specialization of the parietal cortex.
Lytle, Nicole; London, Kamala; Bruck, Maggie
2015-01-01
In two experiments, we investigated 3- to 5-year-old children’s ability to use dolls and human figure drawings as symbols to map body touches. In Experiment 1 stickers were placed on different locations of children’s bodies, and they were asked to indicate the location of the sticker using three different symbols: a doll, a human figure drawing, and the adult researcher. Performance on the tasks increased with age, but many 5-year-olds did not attain perfect performance. Surprisingly, younger children made more errors on the 2D human figure drawing task compared to the 3D doll and adult tasks. In Experiment 2, we compared children’s ability to use 3D and 2D symbols to indicate body touch as well as to guide their search for a hidden object. We replicated the findings of Experiment 1 for the body touch task: for younger children, 3D symbols were easier to use than 2D symbols. However, the reverse pattern was found for the object locations task with children showing superior performance using 2D drawings over 3D models. Though children showed developmental improvements in using dolls and drawings to show where they were touched, less than two-thirds of the 5-year-olds performed perfectly on the touch tasks. Developmental as well as forensic implications of these results are discussed. PMID:25781003
Model Checking the Remote Agent Planner
NASA Technical Reports Server (NTRS)
Khatib, Lina; Muscettola, Nicola; Havelund, Klaus; Norvig, Peter (Technical Monitor)
2001-01-01
This work tackles the problem of using Model Checking to verify the HSTS (Scheduling Testbed System) planning system. HSTS is the planner and scheduler of the Remote Agent autonomous control system deployed on Deep Space One (DS1). Model Checking allows for the verification of domain models as well as planning entries. We have chosen the real-time model checker UPPAAL for this work. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we sketch the mapping of HSTS models into UPPAAL and present samples of plan model properties one may want to verify.
Random Testing and Model Checking: Building a Common Framework for Nondeterministic Exploration
NASA Technical Reports Server (NTRS)
Groce, Alex; Joshi, Rajeev
2008-01-01
Two popular forms of dynamic analysis, random testing and explicit-state software model checking, are perhaps best viewed as search strategies for exploring the state spaces introduced by nondeterminism in program inputs. We present an approach that enables this nondeterminism to be expressed in the SPIN model checker's PROMELA language, and then lets users generate either model checkers or random testers from a single harness for a tested C program. Our approach makes it easy to compare model checking and random testing for models with precisely the same input ranges and probabilities and allows us to mix random testing with model checking's exhaustive exploration of non-determinism. The PROMELA language, as intended in its design, serves as a convenient notation for expressing nondeterminism and mixing random choices with nondeterministic choices. We present and discuss a comparison of random testing and model checking. The results derive from using our framework to test a C program with an effectively infinite state space, a module in JPL's next Mars rover mission. More generally, we show how the ability of the SPIN model checker to call C code can be used to extend SPIN's features, and hope to inspire others to use the same methods to implement dynamic analyses that can make use of efficient state storage, matching, and backtracking.
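The contrast between exhaustive exploration and random sampling of nondeterministic choices can be sketched in miniature (a toy Python illustration, not the authors' SPIN/PROMELA harness; the buggy program and its failing path are invented for the example):

```python
import itertools
import random

def buggy_program(choices):
    """Toy system under test: each next(choices) resolves one
    nondeterministic bit; the assertion fails only on path (1, 1)."""
    x = next(choices)
    y = next(choices)
    return not (x == 1 and y == 1)   # True = assertion holds

def model_check(depth=2):
    """Model-checking style: enumerate every choice sequence."""
    return [seq for seq in itertools.product((0, 1), repeat=depth)
            if not buggy_program(iter(seq))]

def random_test(trials=100, depth=2, seed=0):
    """Random-testing style: sample choice sequences instead."""
    rng = random.Random(seed)
    failures = set()
    for _ in range(trials):
        seq = tuple(rng.randint(0, 1) for _ in range(depth))
        if not buggy_program(iter(seq)):
            failures.add(seq)
    return sorted(failures)
```

Both drivers consume the same nondeterminism interface, which is the point of the paper's single-harness approach: the search strategy is swapped while the tested program stays fixed.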
CheckMATE 2: From the model to the limit
NASA Astrophysics Data System (ADS)
Dercks, Daniel; Desai, Nishita; Kim, Jong Soo; Rolbiecki, Krzysztof; Tattersall, Jamie; Weber, Torsten
2017-12-01
We present the latest developments to the CheckMATE program that allows models of new physics to be easily tested against the recent LHC data. To achieve this goal, the core of CheckMATE now contains over 60 LHC analyses of which 12 are from the 13 TeV run. The main new feature is that CheckMATE 2 now integrates the Monte Carlo event generation via MadGraph5_aMC@NLO and Pythia 8. This allows users to go directly from a SLHA file or UFO model to the result of whether a model is allowed or not. In addition, the integration of the event generation leads to a significant increase in the speed of the program. Many other improvements have also been made, including the possibility to now combine signal regions to give a total likelihood for a model.
LDPC-PPM Coding Scheme for Optical Communication
NASA Technical Reports Server (NTRS)
Barsoum, Maged; Moision, Bruce; Divsalar, Dariush; Fitz, Michael
2009-01-01
In a proposed coding-and-modulation/demodulation-and-decoding scheme for a free-space optical communication system, an error-correcting code of the low-density parity-check (LDPC) type would be concatenated with a modulation code that consists of a mapping of bits to pulse-position-modulation (PPM) symbols. Hence, the scheme is denoted LDPC-PPM. This scheme could be considered a competitor of a related prior scheme in which an outer convolutional error-correcting code is concatenated with an interleaving operation, a bit-accumulation operation, and a PPM inner code. Both the prior and present schemes can be characterized as serially concatenated pulse-position modulation (SCPPM) coding schemes. Figure 1 represents a free-space optical communication system based on either the present LDPC-PPM scheme or the prior SCPPM scheme. At the transmitting terminal, the original data (u) are processed by an encoder into blocks of bits (a), and the encoded data are mapped to PPM of an optical signal (c). For the purpose of design and analysis, the optical channel in which the PPM signal propagates is modeled as a Poisson point process. At the receiving terminal, the arriving optical signal (y) is demodulated to obtain an estimate (â) of the coded data, which is then processed by a decoder to obtain an estimate (û) of the original data.
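The PPM mapping and the Poisson channel model can be illustrated with a toy Python sketch. The PPM order and the signal/background photon rates `ns` and `nb` below are illustrative assumptions, not values taken from the article:

```python
import math, random

random.seed(1)

M = 16                          # PPM order: log2(M) = 4 bits per symbol
BITS_PER_SYM = M.bit_length() - 1

def ppm_modulate(bits):
    """Map each group of log2(M) bits to a pulse position in [0, M)."""
    syms = []
    for i in range(0, len(bits), BITS_PER_SYM):
        group = bits[i:i + BITS_PER_SYM]
        syms.append(int("".join(map(str, group)), 2))
    return syms

def poisson_channel(symbol, ns=5.0, nb=0.2):
    """Photon counts per slot: the signal slot has mean ns + nb,
    the other slots mean nb (Poisson point process model)."""
    counts = []
    for slot in range(M):
        lam = ns + nb if slot == symbol else nb
        # Knuth's method for a Poisson draw (fine for small lambda)
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= random.random()
            if p <= L:
                break
            k += 1
        counts.append(k)
    return counts

def ppm_detect(counts):
    """Hard-decision detection: pick the brightest slot."""
    return max(range(M), key=lambda s: counts[s])

bits = [1, 0, 1, 1, 0, 0, 1, 0]
syms = ppm_modulate(bits)            # [11, 2]
decoded = [ppm_detect(poisson_channel(s)) for s in syms]
```

In the full scheme the demodulator would emit soft slot likelihoods to the LDPC decoder rather than the hard decision shown here.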
Coverage Metrics for Model Checking
NASA Technical Reports Server (NTRS)
Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)
2001-01-01
When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.
Learning abstract visual concepts via probabilistic program induction in a Language of Thought.
Overlan, Matthew C; Jacobs, Robert A; Piantadosi, Steven T
2017-11-01
The ability to learn abstract concepts is a powerful component of human cognition. It has been argued that variable binding is the key element enabling this ability, but the computational aspects of variable binding remain poorly understood. Here, we address this shortcoming by formalizing the Hierarchical Language of Thought (HLOT) model of rule learning. Given a set of data items, the model uses Bayesian inference to infer a probability distribution over stochastic programs that implement variable binding. Because the model makes use of symbolic variables as well as Bayesian inference and programs with stochastic primitives, it combines many of the advantages of both symbolic and statistical approaches to cognitive modeling. To evaluate the model, we conducted an experiment in which human subjects viewed training items and then judged which test items belong to the same concept as the training items. We found that the HLOT model provides a close match to human generalization patterns, significantly outperforming two variants of the Generalized Context Model, one variant based on string similarity and the other based on visual similarity using features from a deep convolutional neural network. Additional results suggest that variable binding happens automatically, implying that binding operations do not add complexity to people's hypothesized rules. Overall, this work demonstrates that a cognitive model combining symbolic variables with Bayesian inference and stochastic program primitives provides a new perspective for understanding people's patterns of generalization. Copyright © 2017 Elsevier B.V. All rights reserved.
Posterior Predictive Model Checking in Bayesian Networks
ERIC Educational Resources Information Center
Crawford, Aaron
2014-01-01
This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex…
Analyzing the cost of screening selectee and non-selectee baggage.
Virta, Julie L; Jacobson, Sheldon H; Kobza, John E
2003-10-01
Determining how to effectively operate security devices is as important to overall system performance as developing more sensitive security devices. In light of recent federal mandates for 100% screening of all checked baggage, this research studies the trade-offs between screening only selectee checked baggage and screening both selectee and non-selectee checked baggage for a single baggage screening security device deployed at an airport. This trade-off is represented using a cost model that incorporates the cost of the baggage screening security device, the volume of checked baggage processed through the device, and the outcomes that occur when the device is used. The cost model captures the cost of deploying, maintaining, and operating a single baggage screening security device over a one-year period. The study concludes that as excess baggage screening capacity is used to screen non-selectee checked bags, the expected annual cost increases, the expected annual cost per checked bag screened decreases, and the expected annual cost per expected number of threats detected in the checked bags screened increases. These results indicate that the marginal increase in security per dollar spent is significantly lower when non-selectee checked bags are screened than when only selectee checked bags are screened.
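The qualitative trade-off the study reports can be reproduced with a toy annualized cost model. Every number below (threat probabilities, detection rate, costs) is hypothetical and chosen only to make the directional effect visible:

```python
def screening_cost(n_selectee, n_nonselectee, p_threat_selectee=1e-5,
                   p_threat_nonselectee=1e-7, detection_rate=0.9,
                   fixed_cost=1_000_000.0, cost_per_bag=2.0):
    """Toy annual cost model for one screening device.  Returns
    (total cost, cost per bag screened, cost per expected threat detected)."""
    bags = n_selectee + n_nonselectee
    total = fixed_cost + cost_per_bag * bags
    expected_detections = detection_rate * (
        p_threat_selectee * n_selectee + p_threat_nonselectee * n_nonselectee)
    return total, total / bags, total / expected_detections

# Screening selectee bags only vs. also screening non-selectee bags:
sel_only = screening_cost(100_000, 0)
sel_plus = screening_cost(100_000, 900_000)
```

With these hypothetical inputs, adding non-selectee bags raises the total cost, lowers the cost per bag screened, and raises the cost per expected detected threat, the same directional result the study reports.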
40 CFR 86.327-79 - Quench checks; NOX analyzer.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new analyzer that has an ambient pressure or “soft vacuum” reaction chamber prior to initial use. Additionally...
40 CFR 86.327-79 - Quench checks; NOX analyzer.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new analyzer that has an ambient pressure or “soft vacuum” reaction chamber prior to initial use. Additionally...
Reilly, Jamie; Peelle, Jonathan E; Garcia, Amanda; Crutch, Sebastian J
2016-01-01
Biological plausibility is an essential constraint for any viable model of semantic memory. Yet, we have only the most rudimentary understanding of how the human brain conducts abstract symbolic transformations that underlie word and object meaning. Neuroscience has evolved a sophisticated arsenal of techniques for elucidating the architecture of conceptual representation. Nevertheless, theoretical convergence remains elusive. Here we describe several contrastive approaches to the organization of semantic knowledge, and in turn we offer our own perspective on two recurring questions in semantic memory research: 1) to what extent are conceptual representations mediated by sensorimotor knowledge (i.e., to what degree is semantic memory embodied)? 2) How might an embodied semantic system represent abstract concepts such as modularity, symbol, or proposition? To address these questions, we review the merits of sensorimotor (i.e., embodied) and amodal (i.e., disembodied) semantic theories and address the neurobiological constraints underlying each. We conclude that the shortcomings of both perspectives in their extreme forms necessitate a hybrid middle ground. We accordingly propose the Dynamic Multilevel Reactivation Framework, an integrative model premised upon flexible interplay between sensorimotor and amodal symbolic representations mediated by multiple cortical hubs. We discuss applications of the Dynamic Multilevel Reactivation Framework to abstract and concrete concept representation and describe how a multidimensional conceptual topography based on emotion, sensation, and magnitude can successfully frame a semantic space containing meanings for both abstract and concrete words. The consideration of ‘abstract conceptual features’ does not diminish the role of logical and/or executive processing in activating, manipulating and using information stored in conceptual representations. 
Rather, it proposes that the material on which these processes operate necessarily combines pure sensorimotor information and higher-order cognitive dimensions involved in symbolic representation. PMID:27294419
The Model-Building Process in Introductory College Geography: An Illustrative Example
ERIC Educational Resources Information Center
Cadwallader, Martin
1978-01-01
Illustrates the five elements of conceptual models by developing a model of consumer behavior in choosing among alternative supermarkets. The elements are: identifying the problem, constructing a conceptual model, translating it into a symbolic model, operationalizing the model, and testing. (Author/AV)
Sirois, Fuschia M; Salamonsen, Anita; Kristoffersen, Agnete E
2016-02-24
Research on continued CAM use has been largely atheoretical and has not considered the broader range of psychological and behavioral factors that may be involved. The purpose of this study was to test a new conceptual model of commitment to CAM use that implicates utilitarian (trust in CAM) and symbolic (perceived fit with CAM) values in psychological and behavioral dimensions of CAM commitment. A student sample of CAM consumers (N = 159) completed a survey about their CAM use, CAM-related values, intentions for future CAM use, CAM word-of-mouth behavior, and perceptions of being an ongoing CAM consumer. Analysis revealed that the utilitarian, symbolic, and CAM commitment variables were significantly related, with r's ranging from .54 to .73. A series of hierarchical regression analyses controlling for relevant demographic variables found that the utilitarian and symbolic values uniquely accounted for a significant and substantial proportion of the variance in each of the three CAM commitment indicators (R(2) from .37 to .57). The findings provide preliminary support for the new model, which posits that CAM commitment is a multi-dimensional psychological state with behavioral indicators. Further research with large-scale samples and longitudinal designs is warranted to understand the potential value of the new model.
Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking
NASA Technical Reports Server (NTRS)
Turgeon, Gregory; Price, Petra
2010-01-01
A feasibility study was performed on a representative aerospace system to determine the following: (1) the benefits and limitations of using SCADE, a commercially available tool for model checking, in comparison to using a proprietary tool that was studied previously [1] and (2) metrics for performing the model checking and for assessing the findings. This study was performed independently of the development task by a group unfamiliar with the system, providing a fresh, external perspective free from development bias.
[Series: Utilization of Differential Equations and Methods for Solving Them in Medical Physics (2)].
Murase, Kenya
2015-01-01
In this issue, symbolic methods for solving differential equations are introduced first. Of these, the Laplace transform method is presented together with some examples in which it is applied to the differential equations derived from a two-compartment kinetic model and from an equivalent circuit model for membrane potential. Second, series expansion methods for solving differential equations are introduced together with some examples, in which these methods are used to solve Bessel's and Legendre's differential equations. In the next issue, simultaneous differential equations and various methods for solving them will be introduced together with some examples in medical physics.
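As a concrete instance of the first technique, a standard two-compartment kinetic model has a closed-form solution obtainable by Laplace transform. The sketch below (rate constants chosen arbitrarily, not taken from the article) checks that closed form against a brute-force Euler integration:

```python
import math

def two_compartment(t, a0=1.0, k1=0.7, k2=0.3):
    """Closed-form solution (obtained via Laplace transform) of
        dA/dt = -k1*A,   dB/dt = k1*A - k2*B,   A(0) = a0, B(0) = 0.
    Assumes k1 != k2."""
    a = a0 * math.exp(-k1 * t)
    b = a0 * k1 * (math.exp(-k2 * t) - math.exp(-k1 * t)) / (k1 - k2)
    return a, b

def euler(t_end, dt=1e-4, a0=1.0, k1=0.7, k2=0.3):
    """Brute-force numerical integration of the same system, as a check."""
    a, b, t = a0, 0.0, 0.0
    while t < t_end:
        a, b = a + dt * (-k1 * a), b + dt * (k1 * a - k2 * b)
        t += dt
    return a, b

exact = two_compartment(2.0)
approx = euler(2.0)
```

The two results should agree to within the Euler step error, which is the kind of sanity check the symbolic solution makes cheap.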
Effect of digital scrambling on satellite communication links
NASA Technical Reports Server (NTRS)
Dessouky, K.
1985-01-01
Digital data scrambling has been considered for communication systems using NRZ symbol formats. The purpose is to increase the number of transitions in the data to improve the performance of the symbol synchronizer. This is accomplished without expanding the bandwidth but at the expense of increasing the data bit error rate (BER). Models for the scramblers/descramblers of practical interest are presented together with the appropriate link model. The effects of scrambling on the performance of coded and uncoded links are studied. The results are illustrated by application to the Tracking and Data Relay Satellite System (TDRSS) links. Conclusions regarding the usefulness of scrambling are also given.
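The motivation for scrambling, creating transitions in long NRZ runs while remaining exactly invertible, can be illustrated with a small additive scrambler. The LFSR polynomial and register length here are illustrative choices, not the scrambler models analyzed in the report:

```python
def lfsr_sequence(n, taps=(7, 4), state=0b1111111):
    """Pseudo-random bits from a 7-bit Fibonacci LFSR (the polynomial
    is an illustrative choice, not one mandated by TDRSS)."""
    out = []
    for _ in range(n):
        fb = ((state >> (taps[0] - 1)) ^ (state >> (taps[1] - 1))) & 1
        out.append(state & 1)
        state = (state >> 1) | (fb << 6)
    return out

def scramble(bits):
    """Additive scrambler: XOR with the LFSR stream.  Applying it twice
    with the same stream restores the original data."""
    key = lfsr_sequence(len(bits))
    return [b ^ k for b, k in zip(bits, key)]

def transitions(bits):
    """Number of NRZ symbol transitions available to the synchronizer."""
    return sum(1 for a, b in zip(bits, bits[1:]) if a != b)

data = [0] * 32 + [1] * 32        # long runs: the worst case for NRZ sync
scrambled = scramble(data)
restored = scramble(scrambled)
print(transitions(data), transitions(scrambled))
```

The transition count rises sharply after scrambling, without any bandwidth expansion; the BER penalty discussed in the abstract comes from error multiplication in self-synchronizing variants, which this additive sketch does not model.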
Unraveling halide hydration: A high dilution approach.
Migliorati, Valentina; Sessa, Francesco; Aquilanti, Giuliana; D'Angelo, Paola
2014-07-28
The hydration properties of halide aqua ions have been investigated combining classical Molecular Dynamics (MD) with Extended X-ray Absorption Fine Structure (EXAFS) spectroscopy. Three halide-water interaction potentials recently developed [M. M. Reif and P. H. Hünenberger, J. Chem. Phys. 134, 144104 (2011)], along with three plausible choices for the value of the absolute hydration free energy of the proton (ΔG⊖hyd[H+]), have been checked for their capability to properly describe the structural properties of halide aqueous solutions, by comparing the MD structural results with EXAFS experimental data. A very good agreement between theory and experiment has been obtained with one parameter set, namely LE, thus strengthening preliminary evidence for a ΔG⊖hyd[H+] value of -1100 kJ mol(-1) [M. M. Reif and P. H. Hünenberger, J. Chem. Phys. 134, 144104 (2011)]. The Cl(-), Br(-), and I(-) ions have been found to form an unstructured and disordered first hydration shell in aqueous solution, with a broad distribution of instantaneous coordination numbers. Conversely, the F(-) ion shows a more ordered and well-defined first solvation shell, with only two statistically relevant coordination geometries (six- and sevenfold complexes). Our thorough investigation on the effect of halide ions on the microscopic structure of water highlights that the perturbation induced by the Cl(-), Br(-), and I(-) ions does not extend beyond the ion first hydration shell, and the structure of water in the F(-) second shell is also substantially unaffected by the ion.
Reynvoet, Bert; Sasanguie, Delphine
2016-01-01
Recently, a lot of studies in the domain of numerical cognition have been published demonstrating a robust association between numerical symbol processing and individual differences in mathematics achievement. Because numerical symbols are so important for mathematics achievement, many researchers want to provide an answer to the ‘symbol grounding problem,’ i.e., how does a symbol acquire its numerical meaning? The most popular account, the approximate number system (ANS) mapping account, assumes that a symbol acquires its numerical meaning by being mapped onto a non-verbal ANS. Here, we critically evaluate four arguments that are supposed to support this account, i.e., (1) there is an evolutionary system for approximate number processing, (2) non-symbolic and symbolic number processing show the same behavioral effects, (3) non-symbolic and symbolic numbers activate the same brain regions, which are also involved in more advanced calculation, and (4) non-symbolic comparison is related to performance on symbolic mathematics achievement tasks. Based on this evaluation, we conclude that all of these arguments, and consequently also the mapping account, are questionable. Next, we explore a less popular alternative, in which small numerical symbols are initially mapped onto a precise representation and then, in combination with increasing knowledge of the counting list, result in an independent and exact symbolic system based on order relations between symbols. We evaluate this account by reviewing evidence on order judgment tasks along the same four arguments. Although further research is necessary, the available evidence so far suggests that this symbol–symbol association account should be considered a worthy alternative account of how symbols acquire their meaning. PMID:27790179
1990-12-01
…the system's individual components. Then one derives the overall system reliability from that information, using a simple mathematical model, to be…
NASA Technical Reports Server (NTRS)
Call, Jared A.; Kwok, John H.; Fisher, Forest W.
2013-01-01
This innovation is a tool used to verify and validate spacecraft sequences at the predicted events file (PEF) level for the GRAIL (Gravity Recovery and Interior Laboratory, see http://www.nasa.gov/mission_pages/grail/main/index.html) mission as part of the Multi-Mission Planning and Sequencing Team (MPST) operations process to reduce the possibility for errors. This tool is used to catch any sequence-related errors or issues immediately after the seqgen modeling to streamline downstream processes. This script verifies and validates the seqgen modeling for the GRAIL MPST process. A PEF is provided as input, and dozens of checks are performed on it to verify and validate the command products, including command content, command ordering, flight-rule violations, modeling boundary consistency, resource limits, and ground commanding consistency. By performing as many checks as possible, as early in the process as possible, grl_pef_check streamlines the MPST task of generating GRAIL command and modeled products on an aggressive schedule. By enumerating each check being performed, and clearly stating the criteria and assumptions made at each step, grl_pef_check can be used as a manual checklist as well as an automated tool. This helper script was written with a focus on providing the user with the information they need to evaluate a sequence quickly and efficiently, while still keeping them informed and active in the overall sequencing process. grl_pef_check verifies and validates the modeling and sequence content prior to investing any more effort into the build. There are dozens of various items in the modeling run that need to be checked, which is a time-consuming and error-prone task. Currently, no software exists that provides this functionality. Compared to a manual process, this script reduces human error and saves considerable man-hours by automating and streamlining the mission planning and sequencing task for the GRAIL mission.
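The enumerated-checklist pattern described above can be sketched generically. The event format and the two rules below are invented for illustration; the real tool performs dozens of mission-specific checks against the actual PEF format:

```python
def check_ordering(events):
    """Entries must appear in nondecreasing time order (toy rule)."""
    times = [e["time"] for e in events]
    return times == sorted(times)

def check_limits(events, max_rate=10):
    """No more than max_rate commands in any one-second window (toy rule)."""
    times = sorted(e["time"] for e in events if e["type"] == "CMD")
    return all(len([t for t in times if w <= t < w + 1]) <= max_rate
               for w in times)

def pef_check(events, checks):
    """Run every named check and enumerate results like a manual checklist."""
    report = {}
    for name, fn in checks.items():
        report[name] = "PASS" if fn(events) else "FAIL"
    return report

events = [{"time": 0.0, "type": "CMD"}, {"time": 1.5, "type": "CMD"},
          {"time": 1.0, "type": "NOTE"}]
print(pef_check(events, {"ordering": check_ordering, "rate": check_limits}))
# → {'ordering': 'FAIL', 'rate': 'PASS'}
```

Naming each check and reporting it individually is what lets the same code double as a human-readable checklist.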
Non-equilibrium dog-flea model
NASA Astrophysics Data System (ADS)
Ackerson, Bruce J.
2017-11-01
We develop the open dog-flea model to serve as a check of proposed non-equilibrium theories of statistical mechanics. The model is developed in detail. Then it is applied to four recent models for non-equilibrium statistical mechanics. Comparison of the dog-flea solution with these different models allows us to check their claims and gives a concrete example of the theoretical models.
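What makes the dog-flea model useful as a check is that it is trivial to simulate exactly. The sketch below simulates the classical closed (Ehrenfest) version; the open variant of the abstract additionally exchanges fleas with a reservoir, which this sketch does not model:

```python
import random

random.seed(0)

def dog_flea(n_fleas=50, steps=5000):
    """Ehrenfest 'dog-flea' model: at each step a flea chosen uniformly
    at random jumps to the other dog.  Returns the number of fleas on
    dog A over time, starting with all fleas on dog A."""
    on_a = n_fleas                    # start far from equilibrium
    history = [on_a]
    for _ in range(steps):
        if random.random() < on_a / n_fleas:
            on_a -= 1                 # the chosen flea was on dog A
        else:
            on_a += 1
        history.append(on_a)
    return history

h = dog_flea()
late = h[len(h) // 2:]                # discard the relaxation transient
mean_late = sum(late) / len(late)     # should hover near n_fleas / 2
```

The relaxation of the population toward n/2, and its fluctuations about it, are exactly the quantities a candidate non-equilibrium theory must reproduce.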
Numerical Polynomial Homotopy Continuation Method and String Vacua
Mehta, Dhagash
2011-01-01
Finding vacua for the four-dimensional effective theories for supergravity which descend from flux compactifications and analyzing them according to their stability is one of the central problems in string phenomenology. Except for some simple toy models, it is, however, difficult to find all the vacua analytically. Recently developed algorithmic methods based on symbolic computer algebra can be of great help in the more realistic models. However, they suffer from serious algorithmic complexities and are limited to small system sizes. In this paper, we review a numerical method called the numerical polynomial homotopy continuation (NPHC) method, first used in the areas of lattice field theories, which by construction finds all of the vacua of a given potential that is known to have only isolated solutions. The NPHC method is known to suffer from no major algorithmic complexities and is embarrassingly parallelizable, and hence its applicability goes way beyond the existing symbolic methods. We first solve a simple toy model as a warm-up example to demonstrate the NPHC method at work. We then show that all the vacua of a more complicated model of a compactified M-theory, which has an SU(3) structure, can be obtained by using a desktop machine in just about an hour, a feat which was reported to be prohibitively difficult by the existing symbolic methods. Finally, we compare the various technicalities between the two methods.
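The core idea of homotopy continuation can be shown in a toy univariate setting: deform a start system with known roots into the target polynomial, tracking every root along the way. Real NPHC solvers handle multivariate polynomial systems with adaptive predictor-corrector tracking; this sketch uses a plain Newton corrector and is purely illustrative:

```python
import cmath

def newton_step(f, df, x):
    return x - f(x) / df(x)

def track_roots(coeffs, steps=200):
    """Track all roots of f(x) = sum(coeffs[k] * x**k) along the homotopy
        H(x, t) = (1 - t) * (x**d - 1) + t * f(x),
    starting from the d-th roots of unity at t = 0."""
    d = len(coeffs) - 1
    roots = [cmath.exp(2j * cmath.pi * k / d) for k in range(d)]
    for i in range(1, steps + 1):
        t = i / steps
        H = lambda x, t=t: (1 - t) * (x**d - 1) + t * sum(
            c * x**k for k, c in enumerate(coeffs))
        dH = lambda x, t=t: (1 - t) * d * x**(d - 1) + t * sum(
            k * c * x**(k - 1) for k, c in enumerate(coeffs) if k)
        # corrector: a couple of Newton iterations at each value of t
        roots = [newton_step(H, dH, newton_step(H, dH, r)) for r in roots]
    return roots

# f(x) = x**2 - 2 has roots ±sqrt(2); both are found from the start roots ±1.
found = sorted(r.real for r in track_roots([-2, 0, 1]))
```

Each start root traces its own path independently, which is what makes the method embarrassingly parallelizable.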
The effects of sign design features on bicycle pictorial symbols for bicycling facility signs.
Oh, Kyunghui; Rogoff, Aaron; Smith-Jackson, Tonya
2013-11-01
The inanimate bicycle symbol has long been used to indicate the animate activity of bicycling on bicycling facility signs. In contrast, either the inanimate bicycle symbol or the animate bicycle symbol has been used interchangeably for the standard pavement symbols in bike lanes. This has led to confusion among pedestrians and cyclists alike. The purpose of this study was to examine two different designs (inanimate symbol vs. animate symbol) in terms of perceived preference and glance legibility, and to investigate sign design features of bicycle pictorial symbols. Thirty-five participants compared current bicycle signs (inanimate symbols) to alternative designs (animate symbols) in a controlled laboratory setting. The results indicated that the alternative designs (animate symbols) showed better performance in both preference and glance legibility tests. Conceptual compatibility, familiarity, and perceptual affordances were found to be important factors as well. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Facchini, F
2000-12-01
The aptitude for symbolization, characteristic of man, is revealed not only in artistic representations and funerary practices. It is exhibited by every manifestation of human activity or representation of natural phenomena that assumes or refers to a meaning. We can recognize functional symbolism (tool-making, habitative or food technology), social symbolism (language and social communication) and spiritual symbolism (funerary practices and artistic expressions). On the basis of these concepts, research into symbolism in prehistoric man allows us to recognize forms of symbolism already in the manifestations of the most ancient humans, starting with Homo habilis (or rudolfensis). Toolmaking, social organization and organization of the territory are oriented toward survival and the life of the family group. They attest to symbolic behaviors and constitute symbolic systems by means of which man expresses himself, lives and transmits his symbolic world. The diverse forms of symbolism are discussed with reference to the different phases of prehistoric humanity.
[The cultural history of palliative care in primitive societies: an integrative review].
Siles González, José; Solano Ruiz, Maria Del Carmen
2012-08-01
The objective of this study is to describe the evolution of palliative care in order to reflect on the possibility of its origin in primitive cultures and their relationship with the beginnings of the cult of the dead. It describes the change in the symbolic structures and social interactions involved in palliative care during prehistory: functional unit, functional framework and functional element. The theoretical framework is based on cultural history, the dialectical structural model and symbolic interactionism. Categorization techniques, cultural history and dialectic structuralism analyses were performed. Palliative care existed in primitive societies, mostly associated with the rites of passage with a high symbolic content. The social structures - functional unit, functional framework and functional element - are the pillars that supported palliative care in prehistory societies.
Image/video understanding systems based on network-symbolic models
NASA Astrophysics Data System (ADS)
Kuvich, Gary
2004-03-01
Vision is part of a larger information system that converts visual information into knowledge structures. These structures drive the vision process, resolve ambiguity and uncertainty via feedback projections, and provide image understanding, that is, an interpretation of visual information in terms of such knowledge models. Computer simulation models are built on the basis of graphs/networks, and the human brain has the ability to emulate similar graph/network models. Symbols, predicates and grammars naturally emerge in such networks, and logic is simply a way of restructuring such models. The brain analyzes an image as a graph-type relational structure created via multilevel hierarchical compression of visual information. Primary areas provide active fusion of image features on a spatial grid-like structure, where nodes are cortical columns. Spatial logic and topology are naturally present in such structures. Mid-level vision processes, like perceptual grouping and separation of figure from ground, are special kinds of network transformations. They convert the primary image structure into a set of more abstract ones, which represent objects and the visual scene, making them easier to analyze with higher-level knowledge structures. Higher-level vision phenomena are the results of such analysis. Composition of network-symbolic models combines learning, classification, and analogy together with higher-level model-based reasoning into a single framework, and it works similarly to frames and agents. Computational intelligence methods transform images into model-based knowledge representations. Based on such principles, an image/video understanding system can convert images into knowledge models and resolve uncertainty and ambiguity. This allows the creation of intelligent computer vision systems for design and manufacturing.
Online medical symbol recognition using a Tablet PC
NASA Astrophysics Data System (ADS)
Kundu, Amlan; Hu, Qian; Boykin, Stanley; Clark, Cheryl; Fish, Randy; Jones, Stephen; Moore, Stephen
2011-01-01
In this paper we describe a scheme to enhance the usability of a Tablet PC's handwriting recognition system by including medical symbols that are not a part of the Tablet PC's symbol library. The goal of this work is to make handwriting recognition more useful for medical professionals accustomed to using medical symbols in medical records. To demonstrate that this new symbol recognition module is robust and expandable, we report results on both a medical symbol set and an expanded symbol test set which includes selected mathematical symbols.
A Fully Integrated Sensor SoC with Digital Calibration Hardware and Wireless Transceiver at 2.4 GHz
Kim, Dong-Sun; Jang, Sung-Joon; Hwang, Tae-Ho
2013-01-01
A single-chip sensor system-on-a-chip (SoC) that implements radio for 2.4 GHz, complete digital baseband physical layer (PHY), 10-bit sigma-delta analog-to-digital converter and dedicated sensor calibration hardware for industrial sensing systems has been proposed and integrated in a 0.18-μm CMOS technology. The transceiver's building blocks include a low-noise amplifier, mixer, channel filter, receiver signal-strength indicator, frequency synthesizer, voltage-controlled oscillator, and power amplifier. In addition, the digital building block consists of offset quadrature phase-shift keying (OQPSK) modulation, demodulation, carrier frequency offset compensation, auto-gain control, digital MAC function, sensor calibration hardware and embedded 8-bit microcontroller. The digital MAC function supports cyclic redundancy check (CRC), inter-symbol timing check, MAC frame control, and automatic retransmission. The embedded sensor signal processing block consists of a calibration coefficient calculator, sensing data calibration mapper and sigma-delta analog-to-digital converter with digital decimation filter. The sensitivity of the overall receiver and the error vector magnitude (EVM) of the overall transmitter are −99 dBm and 18.14%, respectively. The proposed calibration scheme reduces errors by about 45.4% compared with the improved progressive polynomial calibration (PPC) method, and the maximum current consumption of the SoC is 16 mA. PMID:23698271
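As one concrete example of the digital MAC functions listed above, a cyclic redundancy check can be sketched in software. The CRC-16/CCITT parameters used here are a common textbook choice, assumed for illustration, not necessarily the polynomial this SoC implements in hardware:

```python
def crc16_ccitt(data: bytes, poly=0x1021, init=0xFFFF):
    """Bitwise CRC-16/CCITT-FALSE: the style of frame check sequence
    a MAC layer appends to each frame before transmission."""
    crc = init
    for byte in data:
        crc ^= byte << 8              # fold the next byte into the register
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ poly) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

frame = b"sensor-frame"
tag = crc16_ccitt(frame)              # transmitter appends this tag
assert crc16_ccitt(frame) == tag      # receiver recomputes and compares
```

In hardware the same computation is a shift register with a few XOR gates, which is why CRC is the standard frame check for low-power MAC layers.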
Sum of the Magnitude for Hard Decision Decoding Algorithm Based on Loop Update Detection
Meng, Jiahui; Zhao, Danfeng; Tian, Hai; Zhang, Liang
2018-01-01
In order to improve the performance of the hard-decision decoding algorithm for non-binary low-density parity-check (LDPC) codes and to reduce the complexity of decoding, a sum-of-the-magnitude hard-decision decoding algorithm based on loop update detection is proposed. This will also help ensure the reliability, stability and high transmission rate of 5G mobile communication. The algorithm is based on the hard-decision decoding algorithm (HDA) and uses the soft information from the channel to calculate the reliability, while the sum of the variable nodes' (VN) magnitudes is excluded when computing the reliability of the parity checks. At the same time, the reliability information of the variable node is considered and a loop update detection algorithm is introduced. The bits of the erroneous code word are flipped multiple times, searched in the order of most likely error probability, to finally find the correct code word. Simulation results show that the performance of one of the improved schemes is better than the weighted symbol flipping (WSF) algorithm under different hexadecimal numbers by about 2.2 dB and 2.35 dB at a bit error rate (BER) of 10(-5) over an additive white Gaussian noise (AWGN) channel, respectively. Furthermore, the average number of decoding iterations is significantly reduced. PMID:29342963
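The flavor of flipping decoders can be conveyed with a minimal binary example. The paper's algorithm is non-binary and adds magnitude sums and loop update detection; the sketch below is plain Gallager-style bit flipping on a (7,4) Hamming code, shown only to illustrate the flip-until-checks-pass idea:

```python
H = [  # parity-check matrix of the (7,4) Hamming code
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(H, word):
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

def bit_flip_decode(H, word, max_iters=10):
    """Hard-decision bit flipping: repeatedly flip the bit involved in
    the most unsatisfied parity checks until all checks pass."""
    word = list(word)
    for _ in range(max_iters):
        s = syndrome(H, word)
        if not any(s):
            return word                       # all checks satisfied
        # count the unsatisfied checks touching each bit
        votes = [sum(s[i] for i, row in enumerate(H) if row[j])
                 for j in range(len(word))]
        word[votes.index(max(votes))] ^= 1
    return word

codeword = [1, 0, 1, 1, 0, 1, 0]
received = codeword[:]
received[4] ^= 1                              # inject a single bit error
decoded = bit_flip_decode(H, received)
```

A single injected error makes two checks fail; the erroneous bit is the only one touched by both, so it collects the most votes and is flipped first.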
Abstraction and Assume-Guarantee Reasoning for Automated Software Verification
NASA Technical Reports Server (NTRS)
Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.
2004-01-01
Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software, and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT outperforms several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
Practical Capillary Electrophoresis, 2nd Edition by Robert Weinberger
NASA Astrophysics Data System (ADS)
Tilstra, Luanne
2001-01-01
Parts of the text are unnecessarily confusing. The confusion arises because the author assumes the reader knows some aspect of CE in detail, although the detail is not presented until a later chapter. The "Master Symbol List" in the early pages is complete for individuals who are fluent in the language of CE, but not necessarily for folks new to the field. When faced with a new abbreviation, I found it necessary to check for its presence in the master list; if the abbreviation wasn't there, I added it. For example, every CE practitioner knows that BGE stands for "background electrolyte". Although the abbreviation was clearly defined in Chapter 1, I had forgotten it by the time I got to Chapter 3. Nevertheless, the wealth of detailed information, the abundance of useful references, and the well-organized tables outweigh the few editorial problems I found.
In the laboratory of the Ghost-Baron: parapsychology in Germany in the early 20th century.
Wolffram, Heather
2009-12-01
During the early twentieth century the Munich-based psychiatrist Albert von Schrenck-Notzing constructed a parapsychological laboratory in his Karolinenplatz home. Furnished with a range of apparatus derived from the physical and behavioural sciences, the Baron's intention was to mimic both the outward form and disciplinary trajectory of contemporary experimental psychology, thereby legitimating the nascent field of parapsychology. Experimentation with mediums, those labile subjects who produced ectoplasm, materialisation and telekinesis, however, necessitated not only the inclusion of a range of spiritualist props, but the lackadaisical application of those checks and controls intended to prevent simulation and fraud. Thus Schrenck-Notzing's parapsychological laboratory with its stereoscopic cameras, galvanometers and medium cabinets was a strange coalescence of both the séance room and the lab, a hybrid space that was symbolic of the irresolvable epistemological and methodological problems at the heart of this aspiring science.
LDPC product coding scheme with extrinsic information for bit patterned media recording
NASA Astrophysics Data System (ADS)
Jeong, Seongkwon; Lee, Jaejin
2017-05-01
Since the density limit of the current perpendicular magnetic storage system will soon be reached, bit patterned media recording (BPMR) is a promising candidate for the next-generation storage system to achieve an areal density beyond 1 Tb/in². Each recording bit is stored in a fabricated magnetic island, and the space between the magnetic islands is nonmagnetic in BPMR. To approach recording densities of 1 Tb/in², the spacing of the magnetic islands must be less than 25 nm. Consequently, severe inter-symbol interference (ISI) and inter-track interference (ITI) occur and degrade the performance of BPMR. In this paper, we propose a low-density parity check (LDPC) product coding scheme that exploits extrinsic information for BPMR. This scheme shows improved bit error rate performance compared to that in which a single LDPC code is used.
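How ISI and ITI arise in such a bit-patterned medium can be sketched with a toy readback model: the sample at each island mixes in contributions from neighbouring islands. The 3x3 interference coefficients below are invented for illustration, not taken from the paper.

```python
# Toy model of ISI/ITI in bit-patterned media: the readback at each island is
# a 2D weighted sum of its neighbours. The interference coefficients below
# are illustrative, not from the paper.

def readback(bits, mask):
    """bits: 2D grid of 0/1 islands; mask: 3x3 interference coefficients."""
    rows, cols = len(bits), len(bits[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            for di in (-1, 0, 1):      # ITI: adjacent tracks
                for dj in (-1, 0, 1):  # ISI: adjacent symbols along a track
                    ni, nj = i + di, j + dj
                    if 0 <= ni < rows and 0 <= nj < cols:
                        out[i][j] += mask[di + 1][dj + 1] * bits[ni][nj]
    return out

mask = [[0.05, 0.10, 0.05],   # hypothetical coefficients: centre tap 1.0,
        [0.15, 1.00, 0.15],   # along-track ISI 0.15, cross-track ITI 0.10
        [0.05, 0.10, 0.05]]
grid = [[0, 1, 0],
        [1, 1, 0],
        [0, 0, 1]]
samples = readback(grid, mask)  # each sample blends neighbouring islands
```

The decoder's job, which the proposed LDPC product code assists, is to recover `grid` from the blended `samples`.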
Representation of visual symbols in the visual word processing network.
Muayqil, Taim; Davies-Thompson, Jodie; Barton, Jason J S
2015-03-01
Previous studies have shown that word processing involves a predominantly left-sided occipitotemporal network. Words are a form of symbolic representation, in that they are arbitrary perceptual stimuli that represent other objects, actions or concepts. Lesions of parts of the visual word processing network can cause alexia, which can be associated with difficulty processing other types of symbols such as musical notation or road signs. We investigated whether components of the visual word processing network were also activated by other types of symbols. In 16 music-literate subjects, we defined the visual word network using fMRI and examined responses to four symbolic categories: visual words, musical notation, instructive symbols (e.g. traffic signs), and flags and logos. For each category we compared responses not only to scrambled stimuli, but also to similar stimuli that lacked symbolic meaning. The left visual word form area and a homologous right fusiform region responded similarly to all four categories, but equally to both symbolic and non-symbolic equivalents. Greater response to symbolic than non-symbolic stimuli occurred only in the left inferior frontal and middle temporal gyri, but only for words, and in the case of the left inferior frontal gyri, also for musical notation. A whole-brain analysis comparing symbolic versus non-symbolic stimuli revealed a distributed network of inferior temporooccipital and parietal regions that differed for different symbols. The fusiform gyri are involved in processing the form of many symbolic stimuli, but not specifically for stimuli with symbolic content. Selectivity for stimuli with symbolic content only emerges in the visual word network at the level of the middle temporal and inferior frontal gyri, but is specific for words and musical notation. Copyright © 2015 Elsevier Ltd. All rights reserved.
2016-01-01
The numerical cognition literature offers two views to explain numerical and arithmetical development. The unique-representation view considers the approximate number system (ANS) to represent the magnitude of both symbolic and non-symbolic numbers and to be the basis of numerical learning. In contrast, the dual-representation view suggests that symbolic and non-symbolic skills rely on different magnitude representations and that it is the ability to build an exact representation of symbolic numbers that underlies math learning. Support for these hypotheses has come mainly from correlative studies with inconsistent results. In this study, we developed two training programs aiming at enhancing the magnitude processing of either non-symbolic numbers or symbolic numbers and compared their effects on arithmetic skills. Fifty-six preschoolers were randomly assigned to one of three 10-session training conditions: (1) non-symbolic training, (2) symbolic training, and (3) control training working on story understanding. Both numerical training conditions were significantly more efficient than the control condition in improving magnitude processing. Moreover, symbolic training led to a significantly larger improvement in arithmetic than did non-symbolic training and the control condition. These results support the dual-representation view. PMID:27875540
Incremental checking of Master Data Management model based on contextual graphs
NASA Astrophysics Data System (ADS)
Lamolle, Myriam; Menet, Ludovic; Le Duc, Chan
2015-10-01
The validation of models is a crucial step in distributed heterogeneous systems. In this paper, an incremental validation method is proposed in the scope of a Model Driven Engineering (MDE) approach, which is used to develop a Master Data Management (MDM) field represented by XML Schema models. The MDE approach presented in this paper is based on the definition of an abstraction layer using UML class diagrams. The validation method aims to minimise model errors and to optimise the process of model checking. Therefore, the notion of validation contexts is introduced, allowing the verification of data model views. Description logics specify constraints that the models have to check. An experimentation of the approach is presented through an application developed in the ArgoUML IDE.
Thompson, Clarissa A; Ratcliff, Roger; McKoon, Gail
2016-10-01
How do speed and accuracy trade off, and what components of information processing develop as children and adults make simple numeric comparisons? Data from symbolic and non-symbolic number tasks were collected from 19 first graders (Mage=7.12 years), 26 second/third graders (Mage=8.20 years), 27 fourth/fifth graders (Mage=10.46 years), and 19 seventh/eighth graders (Mage=13.22 years). The non-symbolic task asked children to decide whether an array of asterisks had a larger or smaller number than 50, and the symbolic task asked whether a two-digit number was greater than or less than 50. We used a diffusion model analysis to estimate components of processing in tasks from accuracy, correct and error response times, and response time (RT) distributions. Participants who were accurate on one task were accurate on the other task, and participants who made fast decisions on one task made fast decisions on the other task. Older participants extracted a higher quality of information from the stimulus arrays, were more willing to make a decision, and were faster at encoding, transforming the stimulus representation, and executing their responses. Individual participants' accuracy and RTs were uncorrelated. Drift rate and boundary settings were significantly related across tasks, but they were unrelated to each other. Accuracy was mainly determined by drift rate, and RT was mainly determined by boundary separation. We concluded that RT and accuracy operate largely independently. Copyright © 2016 Elsevier Inc. All rights reserved.
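The diffusion-model components estimated above (drift rate, boundary separation, non-decision time) can be sketched with a minimal trial simulator. The parameter values below are illustrative defaults in the usual scaling convention, not the paper's fitted estimates.

```python
import random

# Minimal simulation of the drift-diffusion decision process: evidence
# accumulates with drift `v` and noise until it hits boundary `a` (correct)
# or 0 (error); non-decision time `ter` covers encoding and response
# execution. Parameter values are illustrative, not the paper's fits.

def simulate_trial(v=0.2, a=0.1, z=0.5, ter=0.3, s=0.1, dt=0.001, rng=random):
    x, t = z * a, 0.0          # z is the starting point as a fraction of a
    while 0.0 < x < a:
        x += v * dt + s * rng.gauss(0.0, 1.0) * dt ** 0.5
        t += dt
    return (x >= a), t + ter   # (correct?, response time in seconds)

rng = random.Random(1)
trials = [simulate_trial(rng=rng) for _ in range(2000)]
accuracy = sum(c for c, _ in trials) / len(trials)
mean_rt = sum(rt for _, rt in trials) / len(trials)
```

With these settings, a higher drift rate `v` mainly raises accuracy, while a wider boundary `a` mainly lengthens response times, mirroring the dissociation the abstract reports.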
Model Checking - My 27-Year Quest to Overcome the State Explosion Problem
NASA Technical Reports Server (NTRS)
Clarke, Ed
2009-01-01
Model Checking is an automatic verification technique for state-transition systems that are finite-state or that have finite-state abstractions. In the early 1980s, in a series of joint papers with my graduate students E.A. Emerson and A.P. Sistla, we proposed that Model Checking could be used for verifying concurrent systems and gave algorithms for this purpose. At roughly the same time, Joseph Sifakis and his student J.P. Queille at the University of Grenoble independently developed a similar technique. Model Checking has been used successfully to reason about computer hardware and communication protocols and is beginning to be used for verifying computer software. Specifications are written in temporal logic, which is particularly valuable for expressing concurrency properties. An intelligent, exhaustive search is used to determine if the specification is true or not. If the specification is not true, the Model Checker will produce a counterexample execution trace that shows why the specification does not hold. This feature is extremely useful for finding obscure errors in complex systems. The main disadvantage of Model Checking is the state-explosion problem, which can occur if the system under verification has many processes or complex data structures. Although the state-explosion problem is inevitable in the worst case, over the past 27 years considerable progress has been made on the problem for certain classes of state-transition systems that occur often in practice. In this talk, I will describe what Model Checking is, how it works, and the main techniques that have been developed for combating the state explosion problem.
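The exhaustive search with counterexample generation described above can be sketched as an explicit-state breadth-first reachability check; the two-process "mutual exclusion without a lock" system below is an invented toy example, not one from the talk.

```python
from collections import deque

# Tiny explicit-state safety checker: breadth-first search over a
# state-transition system; if a bad state is reachable, return the
# counterexample trace that real model checkers would report.

def check_safety(init, next_states, is_bad):
    parent, frontier = {init: None}, deque([init])
    while frontier:
        s = frontier.popleft()
        if is_bad(s):
            trace = []
            while s is not None:       # walk parents back to the initial state
                trace.append(s)
                s = parent[s]
            return list(reversed(trace))
        for t in next_states(s):
            if t not in parent:        # visited set: each state explored once
                parent[t] = s
                frontier.append(t)
    return None  # property holds: no bad state is reachable

# Two processes, each toggling 'idle' <-> 'critical' with no lock,
# so mutual exclusion is violated and a counterexample exists.
def next_states(state):
    for i in (0, 1):
        flipped = 'critical' if state[i] == 'idle' else 'idle'
        yield state[:i] + (flipped,) + state[i + 1:]

trace = check_safety(('idle', 'idle'), next_states,
                     lambda s: s == ('critical', 'critical'))
print(trace)  # counterexample: both processes reach the critical section
```

The state-explosion problem is visible even here: with n such processes the reachable state space has 2^n states, which is why the symbolic and abstraction techniques the talk surveys are needed.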
Efficient Implementation of a Symbol Timing Estimator for Broadband PLC.
Nombela, Francisco; García, Enrique; Mateos, Raúl; Hernández, Álvaro
2015-08-21
Broadband Power Line Communications (PLC) have taken advantage of the research advances in multi-carrier modulations to mitigate frequency selective fading, and their adoption opens up a myriad of applications in the field of sensory and automation systems, multimedia connectivity and smart spaces. Nonetheless, the use of these multi-carrier modulations, such as Wavelet-OFDM, requires a highly accurate symbol timing estimation for reliable recovery of transmitted data. Furthermore, the PLC channel presents some particularities that prevent the direct use of previous synchronization algorithms proposed for wireless communication systems. Therefore, more research effort should be devoted to the design and implementation of novel and robust synchronization algorithms for PLC, thus enabling real-time synchronization. This paper proposes a symbol timing estimator for broadband PLC based on cross-correlation with multilevel complementary sequences or Zadoff-Chu sequences, and its efficient implementation in an FPGA; the obtained results show a 90% success rate in symbol timing estimation for a certain PLC channel model and a reduced resource consumption for its implementation in a Xilinx Kintex FPGA.
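The cross-correlation timing technique named above can be sketched in a few lines: slide a known Zadoff-Chu preamble over the received samples and take the correlation peak as the symbol start. Sequence length, root index, and the noiseless channel are illustrative choices, not the authors' FPGA design.

```python
import cmath

# Symbol timing by cross-correlation with a Zadoff-Chu preamble: a sketch of
# the general technique, not the paper's hardware implementation.

def zadoff_chu(u, N):
    # Standard ZC definition for odd N: constant amplitude, sharp
    # autocorrelation peak, which is why it suits timing estimation.
    return [cmath.exp(-1j * cmath.pi * u * n * (n + 1) / N) for n in range(N)]

def estimate_timing(rx, ref):
    N = len(ref)
    best, best_mag = 0, -1.0
    for d in range(len(rx) - N + 1):
        # Correlate the reference against the window starting at offset d.
        c = sum(rx[d + n] * ref[n].conjugate() for n in range(N))
        if abs(c) > best_mag:
            best, best_mag = d, abs(c)
    return best  # offset of the correlation peak = symbol start estimate

zc = zadoff_chu(u=1, N=31)
rx = [0.0] * 40 + zc + [0.0] * 40   # preamble embedded at offset 40, no noise
print(estimate_timing(rx, zc))      # → 40
```

An FPGA implementation replaces the sliding sum with a pipelined correlator, but the peak-picking logic is the same.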
Basic mapping principles for visualizing cancer data using Geographic Information Systems (GIS).
Brewer, Cynthia A
2006-02-01
Maps and other data graphics may play a role in generating ideas and hypotheses at the beginning of a project. They are useful as part of analyses for evaluating model results and then at the end of a project when researchers present their results and conclusions to varied audiences, such as their local research group, decision makers, or a concerned public. Cancer researchers are gaining skill with geographic information system (GIS) mapping as one of their many tools and are broadening the symbolization approaches they use for investigating and illustrating their data. A single map is one of many possible representations of the data, so making multiple maps is often part of a complete mapping effort. Symbol types, color choices, and data classing each affect the information revealed by a map and are best tailored to the specific characteristics of data. Related data can be examined in series with coordinated classing and can also be compared using multivariate symbols that build on the basic rules of symbol design. Informative legend wording and setting suitable map projections are also basic to skilled mapmaking.
Baumert, Mathias; Brown, Rachael; Duma, Stephen; Broe, G Anthony; Kabir, Muammar M; Macefield, Vaughan G
2012-01-01
Heart rate and respiration display fluctuations that are interlinked by central regulatory mechanisms of the autonomic nervous system (ANS). Joint assessment of respiratory time series along with heart rate variability (HRV) may therefore provide information on ANS dysfunction. The aim of this study was to investigate cardio-respiratory interaction in patients with Parkinson's disease (PD), a neurodegenerative disorder that is associated with progressive ANS dysfunction. Short-term ECG and respiration were recorded in 25 PD patients and 28 healthy controls during rest. To assess ANS dysfunction we analyzed joint symbolic dynamics of heart rate and respiration, cardio-respiratory synchrograms along with heart rate variability. Neither HRV nor cardio-respiratory synchrograms were significantly altered in PD patients. Symbolic analysis, however, identified a significant reduction in cardio-respiratory interactions in PD patients compared to healthy controls (16 ± 3.6 % vs. 20 ± 6.1 %; p= 0.02). In conclusion, joint symbolic analysis of cardio-respiratory dynamics provides a powerful tool to detect early signs of autonomic nervous system dysfunction in Parkinson's disease patients at an early stage of the disease.
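Joint symbolic dynamics, as commonly defined in this literature, can be sketched as follows; the exact variant used in the study may differ, and the short series below are made up for illustration.

```python
from collections import Counter

# Sketch of joint symbolic dynamics: beat-to-beat heart-rate changes and
# breath-to-breath respiration changes are each reduced to a binary symbol
# (1 = increase, 0 = non-increase), and the joint distribution of short
# symbol words from both series quantifies their coupling.

def symbolize(series):
    return [1 if b > a else 0 for a, b in zip(series, series[1:])]

def joint_word_distribution(hr, resp, word_len=3):
    sh, sr = symbolize(hr), symbolize(resp)
    n = min(len(sh), len(sr)) - word_len + 1
    words = Counter(
        (tuple(sh[i:i + word_len]), tuple(sr[i:i + word_len]))
        for i in range(n)
    )
    total = sum(words.values())
    return {w: c / total for w, c in words.items()}

hr   = [800, 810, 805, 820, 830, 825, 840, 835]  # RR intervals (ms), made up
resp = [1.0, 1.2, 1.1, 1.3, 1.4, 1.2, 1.5, 1.3]  # respiration signal, made up
dist = joint_word_distribution(hr, resp)
```

Statistics over this joint distribution (for instance the fraction of symmetric word pairs) are the kind of coupling measure that distinguished PD patients from controls in the study.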
Pakmor, Rüdiger; Kromer, Markus; Röpke, Friedrich K; Sim, Stuart A; Ruiter, Ashley J; Hillebrandt, Wolfgang
2010-01-07
Type Ia supernovae are thought to result from thermonuclear explosions of carbon-oxygen white dwarf stars. Existing models generally explain the observed properties, with the exception of the sub-luminous 1991bg-like supernovae. It has long been suspected that the merger of two white dwarfs could give rise to a type Ia event, but hitherto simulations have failed to produce an explosion. Here we report a simulation of the merger of two equal-mass white dwarfs that leads to a sub-luminous explosion, although at the expense of requiring a single common-envelope phase, and component masses of approximately 0.9 M⊙. The light curve is too broad, but the synthesized spectra, red colour and low expansion velocities are all close to what is observed for sub-luminous 1991bg-like events. Although the mass ratios can be slightly less than one and still produce a sub-luminous event, the masses have to be in the range 0.83 M⊙ to 0.9 M⊙.
ERIC Educational Resources Information Center
Hoijtink, Herbert; Molenaar, Ivo W.
1997-01-01
This paper shows that a certain class of constrained latent class models may be interpreted as a special case of nonparametric multidimensional item response models. Parameters of this latent class model are estimated using an application of the Gibbs sampler, and model fit is investigated using posterior predictive checks. (SLD)
Statechart Analysis with Symbolic PathFinder
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.
2012-01-01
We report here on our on-going work that addresses the automated analysis and test case generation for software systems modeled using multiple Statechart formalisms. The work is motivated by large programs such as NASA Exploration, that involve multiple systems that interact via safety-critical protocols and are designed with different Statechart variants. To verify these safety-critical systems, we have developed Polyglot, a framework for modeling and analysis of model-based software written using different Statechart formalisms. Polyglot uses a common intermediate representation with customizable Statechart semantics and leverages the analysis and test generation capabilities of the Symbolic PathFinder tool. Polyglot is used as follows: First, the structure of the Statechart model (expressed in Matlab Stateflow or Rational Rhapsody) is translated into a common intermediate representation (IR). The IR is then translated into Java code that represents the structure of the model. The semantics are provided as "pluggable" modules.
Model-based object classification using unification grammars and abstract representations
NASA Astrophysics Data System (ADS)
Liburdy, Kathleen A.; Schalkoff, Robert J.
1993-04-01
The design and implementation of a high level computer vision system which performs object classification is described. General object labelling and functional analysis require models of classes which display a wide range of geometric variations. A large representational gap exists between abstract criteria such as `graspable' and current geometric image descriptions. The vision system developed and described in this work addresses this problem and implements solutions based on a fusion of semantics, unification, and formal language theory. Object models are represented using unification grammars, which provide a framework for the integration of structure and semantics. A methodology for the derivation of symbolic image descriptions capable of interacting with the grammar-based models is described and implemented. A unification-based parser developed for this system achieves object classification by determining if the symbolic image description can be unified with the abstract criteria of an object model. Future research directions are indicated.
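The unification step the parser relies on can be illustrated with a minimal feature-structure unifier: two nested attribute structures unify if their shared features unify recursively, and the result merges both. The cup/graspable example is invented for illustration.

```python
# Minimal feature-structure unification of the kind a unification-based
# parser relies on: nested dicts unify if shared features agree recursively.
# Classification succeeds when the symbolic image description unifies with
# the abstract object model.

FAIL = object()  # sentinel distinguishing failure from an empty structure

def unify(a, b):
    if isinstance(a, dict) and isinstance(b, dict):
        out = dict(a)
        for k, v in b.items():
            if k in out:
                merged = unify(out[k], v)
                if merged is FAIL:
                    return FAIL
                out[k] = merged
            else:
                out[k] = v
        return out
    return a if a == b else FAIL

model = {'class': 'cup', 'function': {'graspable': True},
         'shape': {'has_handle': True}}
image_description = {'shape': {'has_handle': True, 'round_body': True}}

result = unify(model, image_description)               # succeeds: consistent
clash = unify(model, {'shape': {'has_handle': False}}) # fails: inconsistent
```

A full system adds variables and semantic constraints, but the succeed-or-fail merge above is the core operation deciding whether an image matches a model.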
Decoding algorithm for vortex communications receiver
NASA Astrophysics Data System (ADS)
Kupferman, Judy; Arnon, Shlomi
2018-01-01
Vortex light beams can provide a tremendous alphabet for encoding information. We derive a symbol decoding algorithm for a direct detection matrix detector vortex beam receiver using Laguerre Gauss (LG) modes, and develop a mathematical model of symbol error rate (SER) for this receiver. We compare SER as a function of signal to noise ratio (SNR) for our algorithm and for the Pearson correlation algorithm. To our knowledge, this is the first comprehensive treatment of a decoding algorithm of a matrix detector for an LG receiver.
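The template-matching idea behind a matrix-detector receiver can be sketched as follows: each symbol (LG mode) has a reference intensity pattern on the detector grid, and the decoder picks the template with the highest Pearson correlation to the measurement. The 4x4 "patterns" below are stand-ins, not real LG intensities, and the paper's own decoding algorithm differs from plain Pearson correlation.

```python
# Sketch of Pearson-correlation decoding for a matrix-detector vortex
# receiver: pick the symbol whose reference intensity pattern best matches
# the measured detector image. Patterns are illustrative stand-ins.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def decode(measurement, templates):
    flat = [p for row in measurement for p in row]
    scores = {sym: pearson(flat, [p for row in t for p in row])
              for sym, t in templates.items()}
    return max(scores, key=scores.get)

templates = {  # hypothetical per-symbol detector intensity patterns
    'LG01': [[0, 1, 1, 0], [1, 0, 0, 1], [1, 0, 0, 1], [0, 1, 1, 0]],  # ring
    'LG00': [[0, 0, 0, 0], [0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0]],  # spot
}
noisy = [[0.1, 0.9, 1.0, 0.0], [0.8, 0.1, 0.2, 1.1],
         [0.9, 0.0, 0.1, 0.8], [0.2, 1.0, 0.9, 0.1]]
print(decode(noisy, templates))  # → 'LG01'
```

Symbol error rate is then estimated by decoding many noisy realizations per transmitted symbol and counting mismatches.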
Construct validity and reliability of the Single Checking Administration of Medications Scale.
O'Connell, Beverly; Hawkins, Mary; Ockerby, Cherene
2013-06-01
Research indicates that single checking of medications is as safe as double checking; however, many nurses are averse to independently checking medications. To assist with the introduction and use of single checking, a measure of nurses' attitudes, the thirteen-item Single Checking Administration of Medications Scale (SCAMS) was developed. We examined the psychometric properties of the SCAMS. Secondary analyses were conducted on data collected from 503 nurses across a large Australian health-care service. Analyses using exploratory and confirmatory factor analyses supported by structural equation modelling resulted in a valid twelve-item SCAMS containing two reliable subscales, the nine-item Attitudes towards single checking and three-item Advantages of single checking subscales. The SCAMS is recommended as a valid and reliable measure for monitoring nurses' attitudes to single checking prior to introducing single checking medications and after its implementation. © 2013 Wiley Publishing Asia Pty Ltd.
Ranking streamflow model performance based on Information theory metrics
NASA Astrophysics Data System (ADS)
Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas
2016-04-01
Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to assess whether information theory-based metrics can serve as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series: streamflow was less random and more complex than precipitation, reflecting the fact that the watershed acts as an information filter in the hydrologic conversion from precipitation to streamflow. The Nash-Sutcliffe efficiency increased as model complexity increased, but in many cases several models had efficiency values that were not statistically distinguishable from each other. In such cases, ranking models by the closeness of the information theory-based metrics of simulated and measured streamflow time series can provide an additional criterion for evaluating hydrologic model performance.
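The symbolization-and-entropy step described above can be sketched in a few lines: map the series to quantile symbols, then estimate mean information gain as the conditional entropy of the next symbol given the preceding word. The alphabet size, word length, and toy series below are illustrative choices, not the study's settings.

```python
import math
from collections import Counter

# Sketch of quantile symbolization and mean information gain for a time
# series. Alphabet size and word length are illustrative choices.

def symbolize(series, n_symbols=2):
    ranked = sorted(series)
    # Quantile thresholds split the data into equally populated symbol bins.
    cuts = [ranked[int(len(series) * k / n_symbols)]
            for k in range(1, n_symbols)]
    return [sum(v >= c for c in cuts) for v in series]

def mean_information_gain(symbols, word_len=2):
    words = Counter(tuple(symbols[i:i + word_len])
                    for i in range(len(symbols) - word_len + 1))
    exts = Counter(tuple(symbols[i:i + word_len + 1])
                   for i in range(len(symbols) - word_len))
    def entropy(counts):
        total = sum(counts.values())
        return -sum(c / total * math.log2(c / total)
                    for c in counts.values())
    # H(next symbol | preceding word) = H(L+1 blocks) - H(L blocks):
    # low values mean the next symbol is predictable from recent history.
    return entropy(exts) - entropy(words)

flow = [3, 7, 2, 9, 4, 8, 1, 6, 5, 7, 2, 8, 3, 9, 4, 6]  # toy streamflow
mig = mean_information_gain(symbolize(flow))
```

A random signal drives this measure toward log2 of the alphabet size, while a patterned signal (such as the strictly alternating toy series above) drives it toward zero, which is the sense in which streamflow is "less random" than precipitation.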
Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru
2009-04-27
Model checking approaches were applied to biological pathway validation around 2003. Recently, Fisher et al. proved the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete, state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be indispensable parts of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. The key motivation of this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the model checking approach. A novel method of modeling and simulating biological systems with model checking is proposed based on hybrid functional Petri net with extension (HFPNe), a framework dealing with both discrete and continuous events. Firstly, we construct a quantitative VPC fate model with 1761 components using HFPNe. Secondly, we apply two major biological fate determination rules - Rule I and Rule II - to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully done by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. Such hybrid lineages are hard to understand on a discrete model, because they occur when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986.
Our simulation results suggest that Rule I, which cannot be applied with qualitative model checking, is more reasonable than Rule II owing to its high coverage of predicted fate patterns (except for the genotype of lin-15ko; lin-12ko double mutants). Further insights are also suggested. The quantitative, simulation-based model checking approach is a useful means of providing valuable biological insights and better understanding of biological systems and observation data that may be hard to capture with the qualitative approach.
Design of Installing Check Dam Using RAMMS Model in Seorak National Park of South Korea
NASA Astrophysics Data System (ADS)
Jun, K.; Tak, W.; JUN, B. H.; Lee, H. J.; KIM, S. D.
2016-12-01
As more than 64% of the land in South Korea is mountainous, many regions are exposed to the danger of landslides and debris flow, so it is important to understand the behavior of debris flow in mountainous terrain; various methods and models based on mathematical concepts are being presented and developed for this purpose. The purpose of this study is to investigate regions that experienced debris flow due to the typhoon Ewiniar and to perform numerical modeling for the design and layout of a check dam to reduce the damage caused by debris flow. For the numerical modeling, on-site measurement of the research area was conducted, including topographic investigation, a survey of the bridges downstream, and precision LiDAR 3D scanning to compose the basic input data. The numerical simulation of this study was performed using the RAMMS (Rapid Mass Movements Simulation) model for the analysis of the debris flow. The model was applied to configurations with the check dam installed upstream, midstream, and downstream. Considering the reduction effect on the debris flow, the expansion of the debris flow, and the influence on the bridges downstream, the proper location of the check dam was designated.
The numerical model showed that when the check dam was installed in the downstream section, 50 m above the bridge, the reduction effect on the debris flow was higher than when the check dam was installed in other sections. Key words: Debris flow, LiDAR, Check dam, RAMMS. Acknowledgements: This research was supported by a grant [MPSS-NH-2014-74] through the Disaster and Safety Management Institute funded by the Ministry of Public Safety and Security of the Korean government.
Model building strategy for logistic regression: purposeful selection.
Zhang, Zhongheng
2016-03-01
Logistic regression is one of the most commonly used models to account for confounders in the medical literature. The article introduces how to perform the purposeful selection model building strategy with R. I stress the use of the likelihood ratio test to see whether deleting a variable will have a significant impact on model fit. A deleted variable should also be checked for whether it is an important adjustment of the remaining covariates. Interaction should be checked to disentangle complex relationships between covariates and their synergistic effect on the response variable. The model should be checked for goodness-of-fit (GOF), in other words, how well the fitted model reflects the real data. The Hosmer-Lemeshow GOF test is the most widely used for logistic regression models.
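The likelihood ratio test at the heart of purposeful selection can be illustrated with a minimal sketch: fit a logistic model with and without a candidate covariate and compare log-likelihoods. The article's workflow uses R; this pure-Python version on an invented toy dataset only illustrates the statistical idea.

```python
import math

# Likelihood ratio test for one covariate in logistic regression: fit the
# full model logit(p) = b0 + b1*x by Newton's method, the intercept-only
# reduced model in closed form, and compare. Toy data, not from the article.

def fit_logistic(x, y, iters=25):
    """Newton's method for logit(p) = b0 + b1*x; returns (b0, b1, log-lik)."""
    b0 = b1 = 0.0
    for _ in range(iters):
        p = [1 / (1 + math.exp(-(b0 + b1 * xi))) for xi in x]
        g0 = sum(yi - pi for yi, pi in zip(y, p))            # score vector
        g1 = sum(xi * (yi - pi) for xi, yi, pi in zip(x, y, p))
        w = [pi * (1 - pi) for pi in p]                      # Fisher weights
        h00 = sum(w)
        h01 = sum(wi * xi for wi, xi in zip(w, x))
        h11 = sum(wi * xi * xi for wi, xi in zip(w, x))
        det = h00 * h11 - h01 * h01                          # 2x2 Newton solve
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    p = [1 / (1 + math.exp(-(b0 + b1 * xi))) for xi in x]
    llf = sum(math.log(pi if yi else 1 - pi) for yi, pi in zip(y, p))
    return b0, b1, llf

x = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
y = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]   # outcome clearly depends on x

_, _, llf_full = fit_logistic(x, y)
p_bar = sum(y) / len(y)              # intercept-only fit in closed form
llf_reduced = sum(math.log(p_bar if yi else 1 - p_bar) for yi in y)

G = 2 * (llf_full - llf_reduced)     # LR statistic, 1 degree of freedom
p_value = math.erfc(math.sqrt(G / 2))  # chi-square(1) survival function
```

A small `p_value` (here well below 0.05) means deleting the covariate significantly worsens fit, so purposeful selection would retain it.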
Using computer algebra and SMT-solvers to analyze a mathematical model of cholera propagation
NASA Astrophysics Data System (ADS)
Trujillo Arredondo, Mariana
2014-06-01
We analyze a mathematical model for the transmission of cholera. The model is already defined and involves variables such as the pathogen agent, which in this case is the bacterium Vibrio cholerae, and the human population. The human population is divided into three classes: susceptible, infectious and removed. Using computer algebra, specifically Maple, we obtain two equilibrium states: the disease-free state and the endemic state. Using Maple it is possible to prove that the disease-free state is locally asymptotically stable if and only if R0 < 1, and that the endemic equilibrium state is locally stable when it exists, that is, when R0 > 1. Using the package Redlog of the computer algebra system Reduce and the SMT solver Z3Py it is possible to obtain numerical conditions for the model. The formula for the basic reproductive number synthesizes all the epidemic parameters in the model. It is also possible to run numerical simulations that are very illustrative of the epidemic patterns expected in real situations. We claim that these kinds of software are very useful in the analysis of epidemic models, given that symbolic computation provides algebraic formulas for the basic reproductive number, and such formulas are very useful for deriving control measures. On the other hand, computer algebra software is a powerful tool for the stability analysis of epidemic models, given that all the steps can be made automatically: finding the equilibrium points, computing the Jacobian, computing the characteristic polynomial of the Jacobian, and applying the Routh-Hurwitz theorem to it. Finally, using SMT solvers it is possible to automatically check satisfiability and validity and to perform quantifier elimination, computations that are very useful for analyzing complicated epidemic models.
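The workflow described above (equilibria, then Jacobian, then Routh-Hurwitz) can be sketched on a much simpler SIR model with demography, not the paper's cholera model: dS/dt = mu - beta*S*I - mu*S, dI/dt = beta*S*I - (gamma + mu)*I, with R0 = beta/(gamma + mu). The parameter values below are invented for illustration.

```python
# Stability of the disease-free equilibrium (DFE) for a simple SIR model
# with demography, illustrating the equilibrium -> Jacobian -> Routh-Hurwitz
# pipeline the abstract automates with Maple. Toy model and parameters.

def jacobian_at_dfe(beta, gamma, mu):
    # DFE is (S, I) = (1, 0); partial derivatives of the right-hand sides
    # evaluated there, computed by hand for this 2-dimensional system.
    return [[-mu, -beta],
            [0.0, beta - (gamma + mu)]]

def dfe_is_stable(beta, gamma, mu):
    (a, b), (c, d) = jacobian_at_dfe(beta, gamma, mu)
    # Routh-Hurwitz for a 2x2 system: both eigenvalues have negative real
    # part iff trace < 0 and determinant > 0.
    return (a + d) < 0 and (a * d - b * c) > 0

gamma, mu = 0.2, 0.02
for beta in (0.1, 0.5):
    r0 = beta / (gamma + mu)
    print(f"beta={beta}: R0={r0:.2f}, "
          f"DFE stable: {dfe_is_stable(beta, gamma, mu)}")
```

The run confirms the threshold behaviour the abstract proves symbolically: the disease-free equilibrium is stable exactly when R0 < 1.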
HiVy automated translation of stateflow designs for model checking verification
NASA Technical Reports Server (NTRS)
Pingree, Paula
2003-01-01
The HiVy tool set enables model checking of finite state machine designs. This is achieved by translating state-chart specifications into the input language of the Spin model checker. An abstract syntax of hierarchical sequential automata (HSA) is provided as an intermediate format within the tool set.
Miklavec, Krista; Pravst, Igor; Raats, Monique M; Pohar, Jure
2016-12-01
Many nutrition and/or health symbols have been introduced in different countries in recent years, and Slovenia is no exception. The objective of our study was to examine familiarity with and perception of the Protective Food symbol (PF symbol) in Slovenia, to investigate consumers' associations related to the symbol, and to assess the influence of the symbol's appearance on their preferences. The study was conducted through an online questionnaire with incorporated word-association tasks and conjoint analysis; a GfK consumer panel and social media (Facebook) were used for recruitment of Slovenian adults (n=1050; 534 men, 516 women). The majority (78%) of the participants reported they had previously seen the PF symbol, and 64% declared familiarity with it. Familiarity was verified using a word-association task in which we analysed the nature of the symbol's description, distinguishing descriptions of the symbol's visual appearance from descriptions of its meaning. In this task, 73% of the participants described the symbol's meaning with reference to health or a healthy lifestyle, confirming their familiarity with it. Women and those responsible for grocery shopping were significantly more familiar with the symbol. The impact of the symbol's appearance on consumers' preferences was investigated using conjoint analysis consisting of two attributes: three different symbols found on foods in Slovenia (PF symbol, Choices Programme symbol and Keyhole symbol), and accompanying worded claims. Although worded claims had less relative importance (29.5%) than the symbols (70.5%), we show that careful choice of the wording can affect consumers' preferences considerably. The lowest part-worth utility was observed without an accompanying claim, and the highest for the claim directly communicating health ("Protects your health").
The fact that most participants were familiar with the PF symbol indicates the symbol's potential to promote healthier food choices, which could be further improved by an accompanying worded claim that clearly describes its meaning. In addition, the use of Facebook ads is shown to be a useful alternative recruitment method for research with consumers. Copyright © 2016 Elsevier Ltd. All rights reserved.
Razzaq, Misbah; Ahmad, Jamil
2015-01-01
Internet worms are analogous to biological viruses, since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm, or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of a worm spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. This is why in this study we suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Recovered-Delayed-Quarantined (Susceptible/Recovered) (SEIDQR(S/I)), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and a Continuous Time Markov Chain. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. The Model Checking results agree well with the simulation results, which fully supports the proposed framework. PMID:26713449
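As a rough illustration of the compartmental dynamics that the abstract verifies with Petri nets and a CTMC, here is a deterministic forward-Euler SEIR sketch; the rate names beta, sigma and gamma and the initial conditions are generic placeholders, not the authors' parameters:

```python
# Minimal deterministic SEIR sketch (forward-Euler), a stand-in for the
# stochastic Petri-net/CTMC formulation described above. All parameters
# are illustrative assumptions.

def seir_step(s, e, i, r, beta, sigma, gamma, dt):
    new_inf = beta * s * i * dt      # susceptible -> exposed
    onset   = sigma * e * dt         # exposed -> infectious
    recov   = gamma * i * dt         # infectious -> recovered
    return s - new_inf, e + new_inf - onset, i + onset - recov, r + recov

def simulate(days, dt=0.01, beta=0.5, sigma=0.2, gamma=0.1):
    s, e, i, r = 0.99, 0.0, 0.01, 0.0  # normalized host population
    for _ in range(int(days / dt)):
        s, e, i, r = seir_step(s, e, i, r, beta, sigma, gamma, dt)
    return s, e, i, r

s, e, i, r = simulate(100)
assert abs((s + e + i + r) - 1.0) < 1e-9   # population is conserved
assert s < 0.99 and r > 0.0                # the worm has spread
```

A quarantine strategy such as the hybrid one proposed above would add further compartments and transition rates to the same step structure.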
Kraus, Nicole; Lindenberg, Julia; Zeeck, Almut; Kosfelder, Joachim; Vocks, Silja
2015-09-01
Cognitive-behavioural models of eating disorders state that body checking arises in response to negative emotions in order to reduce the aversive emotional state and is therefore negatively reinforced. This study empirically tests this assumption. For a seven-day period, women with eating disorders (n = 26) and healthy controls (n = 29) were provided with a handheld computer for assessing body checking strategies as they occurred, as well as negative and positive emotions. Serving as a control condition, randomized computer-emitted acoustic signals prompted reports on body checking and emotions. There was no difference in the intensity of negative emotions before body checking and in control situations across groups. However, from pre- to post-body checking, an increase in negative emotions was found. This effect was more pronounced in women with eating disorders compared with healthy controls. These results contradict the assumptions of the cognitive-behavioural model, as body checking does not seem to reduce negative emotions. Copyright © 2015 John Wiley & Sons, Ltd and Eating Disorders Association.
LATDYN - PROGRAM FOR SIMULATION OF LARGE ANGLE TRANSIENT DYNAMICS OF FLEXIBLE AND RIGID STRUCTURES
NASA Technical Reports Server (NTRS)
Housner, J. M.
1994-01-01
LATDYN is a computer code for modeling the Large Angle Transient DYNamics of flexible articulating structures and mechanisms involving joints about which members rotate through large angles. LATDYN extends and brings together some of the aspects of Finite Element Structural Analysis, Multi-Body Dynamics, and Control System Analysis; three disciplines that have been historically separate. It combines significant portions of their distinct capabilities into one single analysis tool. The finite element formulation for flexible bodies in LATDYN extends the conventional finite element formulation by using a convected coordinate system for constructing the equation of motion. LATDYN's formulation allows for large displacements and rotations of finite elements subject to the restriction that deformations within each are small. Also, the finite element approach implemented in LATDYN provides a convergent path for checking solutions simply by increasing mesh density. For rigid bodies and joints LATDYN borrows extensively from methodology used in multi-body dynamics where rigid bodies may be defined and connected together through joints (hinges, ball, universal, sliders, etc.). Joints may be modeled either by constraints or by adding joint degrees of freedom. To eliminate error brought about by the separation of structural analysis and control analysis, LATDYN provides symbolic capabilities for modeling control systems which are integrated with the structural dynamic analysis itself. Its command language contains syntactical structures which perform symbolic operations which are also interfaced directly with the finite element structural model, bypassing the modal approximation. Thus, when the dynamic equations representing the structural model are integrated, the equations representing the control system are integrated along with them as a coupled system. 
This procedure also has the side benefit of enabling a dramatic simplification of the user interface for modeling control systems. Three FORTRAN computer programs, the LATDYN Program, the Preprocessor, and the Postprocessor, make up the collective LATDYN System. The Preprocessor translates user commands into a form which can be used while the LATDYN program provides the computational core. The Postprocessor allows the user to interactively plot and manage a database of LATDYN transient analysis results. It also includes special facilities for modeling control systems and for programming changes to the model which take place during analysis sequence. The documentation includes a Demonstration Problem Manual for the evaluation and verification of results and a Postprocessor guide. Because the program should be viewed as a byproduct of research on technology development, LATDYN's scope is limited. It does not have a wide library of finite elements, and 3-D Graphics are not available. Nevertheless, it does have a measure of "user friendliness". The LATDYN program was developed over a period of several years and was implemented on a CDC NOS/VE & Convex Unix computer. It is written in FORTRAN 77 and has a virtual memory requirement of 1.46 MB. The program was validated on a DEC MICROVAX operating under VMS 5.2.
A Parallel Saturation Algorithm on Shared Memory Architectures
NASA Technical Reports Server (NTRS)
Ezekiel, Jonathan; Siminiceanu
2007-01-01
Symbolic state-space generators are notoriously hard to parallelize. However, the Saturation algorithm implemented in the SMART verification tool differs from other sequential symbolic state-space generators in that it exploits the locality of firing events in asynchronous system models. This paper explores whether event locality can be utilized to efficiently parallelize Saturation on shared-memory architectures. Conceptually, we propose to parallelize the firing of events within a decision diagram node, which is technically realized via a thread pool. We discuss the challenges involved in our parallel design and conduct experimental studies on its prototypical implementation. On a dual-processor dual-core PC, our studies show speed-ups for several example models, e.g., of up to 50% for a Kanban model, when compared to running our algorithm only on a single core.
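The thread-pool design sketched above can be caricatured as follows; fire_event is a hypothetical placeholder for applying one local event to a decision-diagram node, not SMART's actual operation:

```python
# Generic sketch of the thread-pool idea: independent (local) events attached
# to one node are fired by worker threads. This illustrates the design only;
# it is not the SMART implementation.
from concurrent.futures import ThreadPoolExecutor

def fire_event(event):
    # Hypothetical placeholder for firing one local event on a node.
    return event * event

def saturate_node(events, workers=4):
    # Fire all events of a node concurrently and collect the results in order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fire_event, events))

assert saturate_node([1, 2, 3, 4]) == [1, 4, 9, 16]
```

The real difficulty, as the paper discusses, lies in synchronizing shared decision-diagram structures across threads, which this sketch deliberately omits.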
Uncovering Sundanese Values by Analyzing Symbolic Meaning of Ménak Priangan Clothing (1800-1942)
NASA Astrophysics Data System (ADS)
Karmila, M.; Suciati; Widiaty, I.
2016-04-01
This study investigates symbolic meanings found in Sunda ethnic clothing, particularly the Menak Priangan clothing. It aims to uncover and document those symbolic meanings as an effort to develop the Sunda cultural artefacts of West Java. The study applies (visual) ethnographic and aesthetic methods. The visual method is used to uncover local (Sunda) cultural values found in the visualization of Menak Priangan clothing, including design, model, name, and representative colours, which are then related to the local Sundanese aesthetic concepts living within the Priangan community. Furthermore, the aesthetic method is used to explore the role of aesthetic values in empowering visual cultural values within a community, particularly Sunda aesthetic values. The results show that since the 19th century, Sunda ethnic clothing was limited to Priangan Sunda only, while the traditional clothing worn by Priangan people reflects their social strata, consisting of: a. Menak Gede (Menak pangluhurna: mayor), bearing the title raden; b. Menak Leutik/Santana (mayor's assistant), with the titles asep, mas, agus, ujang (Nyimas for women); c. Somah/Cacah: ordinary people/lower class. Clothing is a cultural phenomenon that reflects a society's experiences. For the Menak people, clothing and its accessories carry important meanings: they wear traditional clothing and accessories as a symbol of the power they hold within the bureaucratic structure and of the social status they bear within the traditional community structure.
Lyons, Ian M; Ansari, Daniel
2015-01-01
Numerical and mathematical skills are critical predictors of academic success. The last three decades have seen a substantial growth in our understanding of how the human mind and brain represent and process numbers. In particular, research has shown that we share with animals the ability to represent numerical magnitude (the total number of items in a set) and that preverbal infants can process numerical magnitude. Further research has shown that similar processing signatures characterize numerical magnitude processing across species and developmental time. These findings suggest that an approximate system for nonsymbolic (e.g., dot arrays) numerical magnitude representation serves as the basis for the acquisition of cultural, symbolic (e.g., Arabic numerals) representations of numerical magnitude. This chapter explores this hypothesis by reviewing studies that have examined the relation between individual differences in nonsymbolic numerical magnitude processing and symbolic math abilities (e.g., arithmetic). Furthermore, we examine the extent to which the available literature provides strong evidence for a link between symbolic and nonsymbolic representations of numerical magnitude at the behavioral and neural levels of analysis. We conclude that claims that symbolic number abilities are grounded in the approximate system for the nonsymbolic representation of numerical magnitude are not strongly supported by the available evidence. Alternative models and future research directions are discussed. © 2015 Elsevier Inc. All rights reserved.
Elisabeth Kubler-Ross and the Tradition of the Private Sphere: An Analysis of Symbols.
ERIC Educational Resources Information Center
Klass, Dennis
1981-01-01
Shows how Kubler-Ross' schema functions as a symbol system. Analyzes the symbol "acceptance." Shows how that symbol is part of a strong American tradition of symbols of the private sphere. (Author/JAC)
Spatial Stroop interference occurs in the processing of radicals of ideogrammic compounds.
Luo, Chunming; Proctor, Robert W; Weng, Xuchu; Li, Xinshan
2014-06-01
In this study, we investigated whether the meanings of radicals are involved in reading ideogrammic compounds in a spatial Stroop task. We found spatial Stroop effects of similar size for the simple characters [symbol: see text] ("up") and [symbol: see text] ("down") and for the complex characters [symbol: see text] ("nervous") and [symbol: see text] ("nervous"), which are ideogrammic compounds containing a radical [symbol: see text] or [symbol: see text], in Experiments 1 and 2. In Experiment 3, the spatial Stroop effects were also similar for the simple characters [symbol: see text] ("east") and [symbol: see text] ("west") and for the complex characters [symbol: see text] ("state") and [symbol: see text] ("spray"), which contain [symbol: see text] and [symbol: see text] as radicals. This outcome occurred regardless of whether the task was to identify the character (Exps. 1 and 3) or its location (Exp. 2). Thus, the spatial Stroop effect emerges in the processing of radicals just as it does for processing simple characters. This finding suggests that when reading ideogrammic compounds, (a) their radicals' meanings can be processed and (b) ideogrammic compounds have little or no influence on their radicals' semantic processing.
Digital scrambling for shuttle communication links: Do drawbacks outweigh advantages?
NASA Technical Reports Server (NTRS)
Dessouky, K.
1985-01-01
Digital data scrambling has been considered for communication systems using NRZ (non-return to zero) symbol formats. The purpose is to increase the number of transitions in the data to improve the performance of the symbol synchronizer. This is accomplished without expanding the bandwidth but at the expense of increasing the data bit error rate (BER). Models for the scramblers/descramblers of practical interest are presented together with the appropriate link model. The effects of scrambling on the performance of coded and uncoded links are studied. The results are illustrated by application to the Tracking and Data Relay Satellite System links. Conclusions regarding the usefulness of scrambling are also given.
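The transition-increasing mechanism described above can be sketched with a generic additive (synchronous) scrambler; the 7-bit LFSR polynomial and seed below are illustrative choices, not the actual shuttle-link parameters:

```python
# Sketch of an additive (synchronous) scrambler: the NRZ bit stream is XORed
# with an LFSR keystream, which breaks up long constant runs so the symbol
# synchronizer sees more transitions. Polynomial/seed are assumptions.

def lfsr_stream(n, state=0b1010110, taps=(6, 5)):
    # 7-bit Fibonacci LFSR: output the low bit, shift, feed back XOR of taps.
    out = []
    for _ in range(n):
        out.append(state & 1)
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << 6)
    return out

def scramble(bits):
    return [b ^ k for b, k in zip(bits, lfsr_stream(len(bits)))]

descramble = scramble  # XOR with the same keystream inverts the operation

data = [0] * 16                      # worst case for NRZ: no transitions at all
tx = scramble(data)
assert descramble(tx) == data        # round trip recovers the data
assert len(set(tx)) == 2             # scrambled stream now contains transitions
```

The BER penalty mentioned in the abstract comes from error multiplication in self-synchronizing (multiplicative) variants and from coding interactions, which this additive sketch does not model.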
Prinja, Shankar; Manchanda, Neha; Aggarwal, Arun Kumar; Kaur, Manmeet; Jeet, Gursimer; Kumar, Rajesh
2013-12-01
Various models of referral transport services have been introduced in different States in India with an aim to reduce maternal and infant mortality. Most of the research on referral transport has focussed on coverage, quality and timeliness of the service with not much information on cost and efficiency. This study was undertaken to analyze the cost of a publicly financed and managed referral transport service model in three districts of Haryana State, and to assess its cost and technical efficiency. Data on all resources spent for delivering referral transport service, during 2010, were collected from three districts of Haryana State. Costs incurred at State level were apportioned using appropriate methods. Data Envelopment Analysis (DEA) technique was used to assess the technical efficiency of ambulances. To estimate the efficient scale of operation for ambulance service, the average cost was regressed on kilometres travelled for each ambulance station using a quadratic regression equation. The cost of referral transport per year varied from [symbol: see text] 5.2 million in Narnaul to [symbol: see text] 9.8 million in Ambala. Salaries (36-50%) constituted the major cost. Referral transport was found to be operating at an average efficiency level of 76.8 per cent. Operating an ambulance with a patient load of 137 per month was found to reduce unit costs from an average [symbol: see text] 15.5 per km to [symbol: see text] 9.57 per km. Our results showed that the publicly delivered referral transport services in Haryana were operating at an efficient level. Increasing the demand for referral transport services among the target population represents an opportunity for further improving the efficiency of the underutilized ambulances.
Model Diagnostics for Bayesian Networks
ERIC Educational Resources Information Center
Sinharay, Sandip
2006-01-01
Bayesian networks are frequently used in educational assessments, primarily for learning about students' knowledge and skills. There is a lack of work on assessing the fit of Bayesian networks. This article employs the posterior predictive model checking method, a popular Bayesian model checking tool, to assess the fit of simple Bayesian networks. A…
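Posterior predictive model checking, the method the abstract applies, can be sketched in miniature: draw parameters from the posterior, simulate replicated data, and compare a test statistic with its observed value. The beta-binomial setup below is an illustrative stand-in, not the article's Bayesian networks:

```python
# Minimal posterior predictive check sketch (illustrative beta-binomial model,
# not the article's networks): a posterior predictive p-value near 0 or 1
# flags model misfit.
import random

random.seed(0)
observed = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]          # 7 successes out of 10
n, k = len(observed), sum(observed)

def ppc_pvalue(draws=2000):
    extreme = 0
    for _ in range(draws):
        theta = random.betavariate(1 + k, 1 + n - k)  # posterior draw (uniform prior)
        rep = sum(1 for _ in range(n) if random.random() < theta)
        if rep >= k:                                  # test statistic: success count
            extreme += 1
    return extreme / draws

p = ppc_pvalue()
assert 0.0 <= p <= 1.0
```

For a Bayesian network, the replicated data would be item responses simulated from posterior draws of the network's conditional probabilities, with fit statistics chosen per item or per student.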
Hiniker, Alexis
2016-01-01
Despite reports of mathematical talent in autism spectrum disorders (ASD), little is known about basic number processing abilities in affected children. We investigated number sense, the ability to rapidly assess quantity information, in 36 children with ASD and 61 typically developing controls. Numerical acuity was assessed using symbolic (Arabic numerals) as well as non-symbolic (dot array) formats. We found significant impairments in non-symbolic acuity in children with ASD, but symbolic acuity was intact. Symbolic acuity mediated the relationship between non-symbolic acuity and mathematical abilities only in children with ASD, indicating a distinctive role for symbolic number sense in the acquisition of mathematical proficiency in this group. Our findings suggest that symbolic systems may help children with ASD organize imprecise information. PMID:26659551
A Directed Acyclic Graph-Large Margin Distribution Machine Model for Music Symbol Classification
Wen, Cuihong; Zhang, Jing; Rebelo, Ana; Cheng, Fanyong
2016-01-01
Optical Music Recognition (OMR) has received increasing attention in recent years. In this paper, we propose a classifier based on a new method named Directed Acyclic Graph-Large margin Distribution Machine (DAG-LDM). The DAG-LDM is an improvement of the Large margin Distribution Machine (LDM), which is a binary classifier that optimizes the margin distribution by maximizing the margin mean and minimizing the margin variance simultaneously. We modify the LDM to the DAG-LDM to solve the multi-class music symbol classification problem. Tests are conducted on more than 10000 music symbol images, obtained from handwritten and printed images of music scores. The proposed method provides superior classification capability and achieves much higher classification accuracy than the state-of-the-art algorithms such as Support Vector Machines (SVMs) and Neural Networks (NNs). PMID:26985826
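The DAG construction that lifts a binary classifier to the multi-class setting can be sketched generically: classes are eliminated by pairwise comparisons until one remains. The pairwise decision function below is a toy stand-in for a trained LDM, not the authors' classifier:

```python
# Hedged sketch of a decision DAG (DDAG) over pairwise binary classifiers,
# the structure used above to turn a binary margin classifier into a
# multi-class music-symbol classifier. The classifiers here are toy stand-ins.

def ddag_predict(x, classes, pairwise):
    """pairwise(x, a, b) -> a or b, the winner of the binary comparison."""
    remaining = list(classes)
    while len(remaining) > 1:
        a, b = remaining[0], remaining[-1]
        winner = pairwise(x, a, b)
        # Each DAG edge taken removes the losing class from consideration.
        remaining.remove(a if winner == b else b)
    return remaining[0]

# Toy stand-in: "classify" a scalar feature by its nearest class prototype.
protos = {"note": 1.0, "rest": 5.0, "clef": 9.0}
nearest = lambda x, a, b: a if abs(x - protos[a]) <= abs(x - protos[b]) else b

assert ddag_predict(8.7, list(protos), nearest) == "clef"
assert ddag_predict(1.2, list(protos), nearest) == "note"
```

With k classes the DAG evaluates k-1 of the k(k-1)/2 trained pairwise classifiers per prediction, which is what makes the construction practical for the 10000+ symbol images reported above.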
Neural Cognition and Affective Computing on Cyber Language.
Huang, Shuang; Zhou, Xuan; Xue, Ke; Wan, Xiqiong; Yang, Zhenyi; Xu, Duo; Ivanović, Mirjana; Yu, Xueer
2015-01-01
Characterized by its customary symbol system and simple and vivid expression patterns, cyber language acts not only as a tool for convenient communication but also as a carrier of abundant emotions, and it attracts high attention in public opinion analysis, internet marketing, service feedback monitoring, and social emergency management. Based on our multidisciplinary research, this paper presents a classification of the emotional symbols in cyber language, analyzes the cognitive characteristics of different symbols, and puts forward a mechanism model to show the dominant neural activities in that process. Through a comparative study of Chinese, English, and Spanish, which are used by the largest populations in the world, this paper discusses the expressive patterns of emotions in international cyber languages and proposes an intelligent method for affective computing on cyber language in a unified PAD (Pleasure-Arousal-Dominance) emotional space.
Eagle, Dawn M.; Noschang, Cristie; d’Angelo, Laure-Sophie Camilla; Noble, Christie A.; Day, Jacob O.; Dongelmans, Marie Louise; Theobald, David E.; Mar, Adam C.; Urcelay, Gonzalo P.; Morein-Zamir, Sharon; Robbins, Trevor W.
2014-01-01
Excessive checking is a common, debilitating symptom of obsessive-compulsive disorder (OCD). In an established rodent model of OCD checking behaviour, quinpirole (dopamine D2/3-receptor agonist) increased checking in open-field tests, indicating dopaminergic modulation of checking-like behaviours. We designed a novel operant paradigm for rats (observing response task (ORT)) to further examine cognitive processes underpinning checking behaviour and clarify how and why checking develops. We investigated i) how quinpirole increases checking, ii) dependence of these effects on D2/3 receptor function (following treatment with D2/3 receptor antagonist sulpiride) and iii) effects of reward uncertainty. In the ORT, rats pressed an ‘observing’ lever for information about the location of an ‘active’ lever that provided food reinforcement. High- and low-checkers (defined from baseline observing) received quinpirole (0.5 mg/kg, 10 treatments) or vehicle. Parametric task manipulations assessed observing/checking under increasing task demands relating to reinforcement uncertainty (variable response requirement and active-lever location switching). Treatment with sulpiride further probed the pharmacological basis of long-term behavioural changes. Quinpirole selectively increased checking, both functional observing lever presses (OLPs) and non-functional extra OLPs (EOLPs). The increase in OLPs and EOLPs was long-lasting, without further quinpirole administration. Quinpirole did not affect the immediate ability to use information from checking. Vehicle and quinpirole-treated rats (VEH and QNP respectively) were selectively sensitive to different forms of uncertainty. Sulpiride reduced non-functional EOLPs in QNP rats but had no effect on functional OLPs. These data have implications for treatment of compulsive checking in OCD, particularly for serotonin-reuptake-inhibitor treatment-refractory cases, where supplementation with dopamine receptor antagonists may be beneficial. 
PMID:24406720
2017-03-01
...considered using indirect models of software execution, for example memory access patterns, to check for security intrusions. Additional research was performed to tackle the... deterioration, for example, no longer corresponds to the model used during verification time. Finally, the research looked at ways to combine hybrid systems...
Heart in anatomy history, radiology, anthropology and art.
Marinković, S; Lazić, D; Kanjuh, V; Valjarević, S; Tomić, I; Aksić, M; Starčević, A
2014-05-01
Anthropologic, artistic and medical significance of heart inspired us to undertake this multidisciplinary study. Amongst the 24 obtained echocardiograms and phonograms, 1 was used for a Photoshop processing. In addition, over 20,000 art work reproductions were examined in this study. Artistic and symbolic presentation of heart started some 15,000 years ago. First heart models were made by the Egyptian and Olmec civilisations. Ancient cultures regarded heart as the seat of the soul, spirit and intelligence. First anatomical and artistic images of heart were created by Leonardo da Vinci in the 15th century, and first wax models by the Italian anatomists in the 17th century. Mediaeval religious symbolism of heart was replaced in the Renaissance and later on mainly by its role in the romantic love. Anatomical heart art continued in the 18th and 19th centuries through the works of Sénac, Cloquet, Hirschfeld and Bourgery. Some modern artists, such as Dalí, Kahlo, Rivera, Warhol, Ivanjicki, Vital, Kober and Mastrlova, created the anatomical heart images or sculptures, whereas some others, such as Duchamp, Klee, Miró, Matisse and Dine, presented heart symbol in their artworks. New radiologic technologies produce fine images of heart, some of which are similar to the works of modern artists. Heart biology and symbolism have had a tremendous influence on our culture, including art and medical sciences. New radiologic techniques and computer technology have produced such images of heart, which substantially improved diagnosis, but also enhanced the heart aesthetics.
Thistle, Jennifer J; Wilkinson, Krista
2017-09-01
Children whose speech does not meet their communication needs often benefit from augmentative and alternative communication (AAC). The design of an AAC display may influence the child's ability to communicate effectively. The current study examined how symbol background color cues and symbol arrangement affected construction of multi-symbol messages using line-drawing symbols, by young children with typical development. Participants (N = 52) heard a spoken phrase matching a photograph and selected line drawings within a 4 × 4 array. Friedman two-way ANOVAs evaluated speed and accuracy of multi-symbol message construction under four conditions in which the background color and arrangement of symbols was manipulated. Participants demonstrated significantly faster response times when symbols were arranged by word-class category compared to no symbol arrangement. The majority of children responded faster when symbols had white backgrounds, but this effect failed to reach statistical significance. This study provides preliminary evidence suggesting the importance of symbol arrangement for young children. The findings highlight the need for caution when incorporating background color on displays for young children. Future research is needed to examine the effect of visual cues on children who use AAC and consider additional factors that could influence efficacy of symbol arrangement and background color use.
The Moon System Adapted for Musical Notation.
ERIC Educational Resources Information Center
Jackson, Michael
1987-01-01
A means is presented for using William Moon's embossed symbols to represent musical notation for blind individuals, as an alternative to braille notation. The proposed system includes pitch symbols, octave indicators, duration symbols, accidentals, key signatures, rests, stress symbols, ornaments, and other symbols. (Author/JDD)
2012-01-01
Background: The question whether Developmental Dyscalculia (DD; a deficit in the ability to process numerical information) is the result of deficiencies in the non-symbolic numerical representation system (e.g., a group of dots) or in the symbolic numerical representation system (e.g., Arabic numerals) has been debated in the scientific literature. It is accepted that the non-symbolic system is divided into two different ranges: the subitizing range (i.e., quantities from 1-4), which is processed automatically and quickly, and the counting range (i.e., quantities larger than 4), which is an attention-demanding procedure and is therefore processed serially and slowly. However, so far no study has tested the automaticity of symbolic and non-symbolic representation in DD participants separately for the subitizing and the counting ranges. Methods: DD and control participants underwent a novel version of the Stroop task, the Enumeration Stroop. They were presented with a random series of between one and nine written digits, and were asked to name either the relevant written digit (in the symbolic task) or the relevant quantity of digits (in the non-symbolic task) while ignoring the irrelevant aspect. Results: DD participants, unlike the control group, did not show any congruency effect in the subitizing range of the non-symbolic task. Conclusion: These findings suggest that DD may be impaired in the ability to process symbolic numerical information or in the ability to automatically associate the two systems (i.e., the symbolic vs. the non-symbolic). Additionally, DD participants show deficiencies in the non-symbolic counting range. PMID:23190433
Furman, Tamar; Rubinsten, Orly
2012-11-28
The question whether Developmental Dyscalculia (DD; a deficit in the ability to process numerical information) is the result of deficiencies in the non symbolic numerical representation system (e.g., a group of dots) or in the symbolic numerical representation system (e.g., Arabic numerals) has been debated in scientific literature. It is accepted that the non symbolic system is divided into two different ranges, the subitizing range (i.e., quantities from 1-4), which is processed automatically and quickly, and the counting range (i.e., quantities larger than 4), which demands attention and is therefore processed serially and slowly. However, so far no study has tested the automaticity of symbolic and non symbolic representation in DD participants separately for the subitizing and the counting ranges. DD and control participants underwent a novel version of the Stroop task, the Enumeration Stroop. They were presented with a random series of between one and nine written digits, and were asked to name either the relevant written digit (in the symbolic task) or the relevant quantity of digits (in the non symbolic task) while ignoring the irrelevant aspect. DD participants, unlike the control group, did not show any congruency effect in the subitizing range of the non symbolic task. These findings suggest that DD may be impaired in the ability to process symbolic numerical information or in the ability to automatically associate the two systems (i.e., the symbolic vs. the non symbolic). Additionally, DD participants show deficiencies in the non symbolic counting range.
Front-of-pack symbols are not a reliable indicator of products with healthier nutrient profiles.
Emrich, Teri E; Qi, Ying; Cohen, Joanna E; Lou, Wendy Y; L'Abbe, Mary L
2015-01-01
Front-of-pack (FOP) nutrition rating systems and symbols are a form of nutrition marketing used on food labels worldwide. In the absence of standardized criteria for their use, it is unclear if FOP symbols are being used to promote products more nutritious than products without symbols. To compare the amount of calories, saturated fat, sodium, and sugar in products with FOP symbols, and different FOP symbol types, to products without symbols. The median calorie, saturated fat, sodium, and sugar content per reference amount of products with FOP symbols were compared to products without FOP symbols using data from the Food Label Information Program, a database of 10,487 Canadian packaged food labels. Ten food categories and 60 subcategories were analyzed. Nutrient content differences were compared using Wilcoxon rank-sum test; differences greater than 25% were deemed nutritionally relevant. Products with FOP symbols were not uniformly lower in calories, saturated fat, sodium, and sugar per reference amount than products without these symbols in any food category and the majority of subcategories (59/60). None of the different FOP types examined were used to market products with overall better nutritional profiles than products without this type of marketing. FOP symbols are being used to market foods that are no more nutritious than foods without this type of marketing. Because FOP symbols may influence consumer perceptions of products and their purchases, it may be a useful public health strategy to set minimum nutritional standards for products using FOP symbol marketing. Copyright © 2014 Elsevier Ltd. All rights reserved.
Ordinality and the nature of symbolic numbers.
Lyons, Ian M; Beilock, Sian L
2013-10-23
The view that representations of symbolic and nonsymbolic numbers are closely tied to one another is widespread. However, the link between symbolic and nonsymbolic numbers is almost always inferred from cardinal processing tasks. In the current work, we show that considering ordinality instead points to striking differences between symbolic and nonsymbolic numbers. Human behavioral and neural data show that ordinal processing of symbolic numbers (Are three Indo-Arabic numerals in numerical order?) is distinct from symbolic cardinal processing (Which of two numerals represents the greater quantity?) and nonsymbolic number processing (ordinal and cardinal judgments of dot-arrays). Behaviorally, distance-effects were reversed when assessing ordinality in symbolic numbers, but canonical distance-effects were observed for cardinal judgments of symbolic numbers and all nonsymbolic judgments. At the neural level, symbolic number-ordering was the only numerical task that did not show number-specific activity (greater than control) in the intraparietal sulcus. Only activity in left premotor cortex was specifically associated with symbolic number-ordering. For nonsymbolic numbers, activation in cognitive-control areas during ordinal processing and a high degree of overlap between ordinal and cardinal processing networks indicate that nonsymbolic ordinality is assessed via iterative cardinality judgments. This contrasts with a striking lack of neural overlap between ordinal and cardinal judgments anywhere in the brain for symbolic numbers, suggesting that symbolic number processing varies substantially with computational context. Ordinal processing sheds light on key differences between symbolic and nonsymbolic number processing both behaviorally and in the brain. Ordinality may prove important for understanding the power of representing numbers symbolically.
PROTO-PLASM: parallel language for adaptive and scalable modelling of biosystems.
Bajaj, Chandrajit; DiCarlo, Antonio; Paoluzzi, Alberto
2008-09-13
This paper discusses the design goals and the first developments of PROTO-PLASM, a novel computational environment to produce libraries of executable, combinable and customizable computer models of natural and synthetic biosystems, aiming to provide a supporting framework for predictive understanding of structure and behaviour through multiscale geometric modelling and multiphysics simulations. Admittedly, the PROTO-PLASM platform is still in its infancy. Its computational framework--language, model library, integrated development environment and parallel engine--intends to provide patient-specific computational modelling and simulation of organs and biosystems, exploiting novel functionalities resulting from the symbolic combination of parametrized models of parts at various scales. PROTO-PLASM may define the model equations, but it is currently focused on the symbolic description of model geometry and on the parallel support of simulations. Conversely, CellML and SBML could be viewed as defining the behavioural functions (the model equations) to be used within a PROTO-PLASM program. Here we exemplify the basic functionalities of PROTO-PLASM, by constructing a schematic heart model. We also discuss multiscale issues with reference to the geometric and physical modelling of neuromuscular junctions.
Proto-Plasm: parallel language for adaptive and scalable modelling of biosystems
Bajaj, Chandrajit; DiCarlo, Antonio; Paoluzzi, Alberto
2008-01-01
This paper discusses the design goals and the first developments of Proto-Plasm, a novel computational environment to produce libraries of executable, combinable and customizable computer models of natural and synthetic biosystems, aiming to provide a supporting framework for predictive understanding of structure and behaviour through multiscale geometric modelling and multiphysics simulations. Admittedly, the Proto-Plasm platform is still in its infancy. Its computational framework—language, model library, integrated development environment and parallel engine—intends to provide patient-specific computational modelling and simulation of organs and biosystems, exploiting novel functionalities resulting from the symbolic combination of parametrized models of parts at various scales. Proto-Plasm may define the model equations, but it is currently focused on the symbolic description of model geometry and on the parallel support of simulations. Conversely, CellML and SBML could be viewed as defining the behavioural functions (the model equations) to be used within a Proto-Plasm program. Here we exemplify the basic functionalities of Proto-Plasm, by constructing a schematic heart model. We also discuss multiscale issues with reference to the geometric and physical modelling of neuromuscular junctions. PMID:18559320
A model for indexing medical documents combining statistical and symbolic knowledge.
Avillach, Paul; Joubert, Michel; Fieschi, Marius
2007-10-11
To develop and evaluate an information processing method based on terminologies, in order to index medical documents in any given documentary context. We designed a model using both symbolic general knowledge extracted from the Unified Medical Language System (UMLS) and statistical knowledge extracted from a domain of application. Using statistical knowledge allowed us to contextualize the general knowledge for every particular situation. For each document studied, the extracted terms are ranked to highlight the most significant ones. The model was tested on a set of 17,079 French standardized discharge summaries (SDSs). The most important ICD-10 term of each SDS was ranked 1st or 2nd by the method in nearly 90% of the cases. The use of several terminologies leads to more precise indexing. The improvement in the model's performance achieved by using semantic relationships is encouraging.
Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study
ERIC Educational Resources Information Center
Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa
2012-01-01
This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are flash and NetLogo environments to make simultaneously available three domains in chemistry: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…
NASA Astrophysics Data System (ADS)
Ravelo-García, A. G.; Saavedra-Santana, P.; Juliá-Serdá, G.; Navarro-Mesa, J. L.; Navarro-Esteva, J.; Álvarez-López, X.; Gapelyuk, A.; Penzel, T.; Wessel, N.
2014-06-01
Many sleep centres try to perform a reduced portable test in order to decrease the number of overnight polysomnographies, which are expensive, time-consuming, and disturbing. With some limitations, heart rate variability (HRV) has been useful in this task. The aim of this investigation was to evaluate whether adding symbolic dynamics variables to a logistic regression model integrating clinical and physical variables can improve the detection of subjects for further polysomnographies. To our knowledge, this is the first contribution to innovate in that strategy. A group of 133 patients was referred to the sleep centre for suspected sleep apnea. Clinical assessment of the patients consisted of a sleep-related questionnaire and a physical examination. The clinical variables related to apnea and selected in the statistical model were age (p < 10^-3), neck circumference (p < 10^-3), score on a questionnaire scale intended to quantify daytime sleepiness (p < 10^-3), and intensity of snoring (p < 10^-3). The validation of this model demonstrated an increase in classification performance when a variable based on non-linear dynamics of HRV (p < 0.01) was used in addition to the other variables. For the diagnostic rule based only on clinical and physical variables, the corresponding area under the receiver operating characteristic (ROC) curve was 0.907 (95% confidence interval (CI) = 0.848, 0.967) (sensitivity 87.10% and specificity 80%). For the model including the average of a symbolic dynamics variable, the area under the ROC curve increased to 0.941 (95% CI = 0.897, 0.985) (sensitivity 88.71% and specificity 82.86%). In conclusion, symbolic dynamics, coupled with significant clinical and physical variables, can help to prioritize polysomnographies in patients with a high probability of apnea. In addition, the processing of the HRV is a well-established, low-cost and robust technique.
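As an illustration of the kind of symbolic-dynamics feature such a model could add, the sketch below quantizes an RR-interval series into symbols and computes the normalized Shannon entropy of short symbol words. The binning scheme, word length, and normalization are illustrative assumptions, not the specific variables selected in the study:

```python
import numpy as np

def symbolic_words(rr, nbins=4, wordlen=3):
    # quantize RR intervals into nbins uniform-width symbol classes,
    # then slide a window to form overlapping words of length wordlen
    rr = np.asarray(rr, dtype=float)
    if rr.max() == rr.min():
        sym = np.zeros(rr.size, dtype=int)   # degenerate constant series
    else:
        edges = np.linspace(rr.min(), rr.max(), nbins + 1)[1:-1]
        sym = np.digitize(rr, edges)
    return [tuple(sym[i:i + wordlen]) for i in range(len(sym) - wordlen + 1)]

def word_entropy(words):
    # Shannon entropy of the word distribution, normalized here by the
    # entropy of a uniform distribution over the observed words
    _, counts = np.unique(np.asarray(words), axis=0, return_counts=True)
    p = counts / counts.sum()
    if len(p) < 2:
        return 0.0
    return float(-(p * np.log2(p)).sum() / np.log2(len(p)))
```

A perfectly regular series yields entropy 0, while an alternating (maximally unpredictable at the word level) series approaches 1; a real HRV feature would be computed per patient and entered into the regression alongside the clinical variables.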
An Analytical Model for University Identity and Reputation Strategy Work
ERIC Educational Resources Information Center
Steiner, Lars; Sundstrom, Agneta C.; Sammalisto, Kaisu
2013-01-01
Universities face increasing global competition, pressuring them to restructure and find new identities. A multidimensional model: identity, image and reputation of strategic university identity and reputation work is developed. The model includes: organizational identity; employee and student attitudes; symbolic identity; influence from…
Grounding language in action and perception: From cognitive agents to humanoid robots
NASA Astrophysics Data System (ADS)
Cangelosi, Angelo
2010-06-01
In this review we concentrate on a grounded approach to the modeling of cognition through the methodologies of cognitive agents and developmental robotics. This work will focus on the modeling of the evolutionary and developmental acquisition of linguistic capabilities based on the principles of symbol grounding. We review cognitive agent and developmental robotics models of the grounding of language to demonstrate their consistency with the empirical and theoretical evidence on language grounding and embodiment, and to reveal the benefits of such an approach in the design of linguistic capabilities in cognitive robotic agents. In particular, three different models will be discussed, where the complexity of the agent's sensorimotor and cognitive system gradually increases: from a multi-agent simulation of language evolution, to a simulated robotic agent model for symbol grounding transfer, to a model of language comprehension in the humanoid robot iCub. The review also discusses the benefits of using a humanoid robotic platform, and specifically the open-source iCub platform, for the study of embodied cognition.
Naturalistic Experience and the Early Use of Symbolic Artifacts
ERIC Educational Resources Information Center
Troseth, Georgene L.; Casey, Amy M.; Lawver, Kelly A.; Walker, Joan M. T.; Cole, David A.
2007-01-01
Experience with a variety of symbolic artifacts has been proposed as a mechanism underlying symbolic development. In this study, the parents of 120 2-year-old children who participated in symbolic object retrieval tasks completed a questionnaire regarding their children's naturalistic experience with symbolic artifacts and activities. In separate…
Metasearch Accuracy for Letters and Symbols: Do Our Intuitions Match Empirical Reality?
ERIC Educational Resources Information Center
Green, Sean R.; Redford, Joshua
2016-01-01
The "familiarity effect" (Shen and Reingold, "Perception & Psychophysics" 63(3):464-475, 2001) is a phenomenon in which unfamiliar symbols perceptually "pop-out" when placed among familiar symbols (e.g., letters). In contrast, searching for familiar symbols among unfamiliar symbols is more challenging. Failure to…
Fazio, Lisa K; Bailey, Drew H; Thompson, Clarissa A; Siegler, Robert S
2014-07-01
We examined relations between symbolic and non-symbolic numerical magnitude representations, between whole number and fraction representations, and between these representations and overall mathematics achievement in fifth graders. Fraction and whole number symbolic and non-symbolic numerical magnitude understandings were measured using both magnitude comparison and number line estimation tasks. After controlling for non-mathematical cognitive proficiency, both symbolic and non-symbolic numerical magnitude understandings were uniquely related to mathematics achievement, but the relation was much stronger for symbolic numbers. A meta-analysis of 19 published studies indicated that relations between non-symbolic numerical magnitude knowledge and mathematics achievement are present but tend to be weak, especially beyond 6 years of age. Copyright © 2014 Elsevier Inc. All rights reserved.
Levy, Ruggero
2012-08-01
This article examines pathologies in the creation of symbols and those pathologies' ensuing consequences. It relies mainly on the vertices provided by Bion and Meltzer. It studies the different forms in which these lapses occur in symbolic processes, where they may create vacuums in symbolic networks or give rise to 'lies', and even destroy or de-symbolize established symbols. Based on Bion's concept of the minus container-contained, I propose that when a symbol is attacked, a particular mental structure with its own peculiar characteristics comes about. This structure not only creates a vacuum in that mental zone, it ends up damaging the entire symbolization process. This contribution aims to describe that structure from the metapsychological point of view. I end by synthesizing the possible widening of what could be a Bionian negative grid. Copyright © 2012 Institute of Psychoanalysis.
Method for coding low entropy data
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu (Inventor)
1995-01-01
A method of lossless data compression for efficient coding of an electronic signal of information sources of very low information rate is disclosed. In this method, S represents a non-negative source symbol set, (s(sub 0), s(sub 1), s(sub 2), ..., s(sub N-1)) of N symbols with s(sub i) = i. The difference between binary digital data is mapped into symbol set S. Consecutive symbols in symbol set S are then paired into a new symbol set Gamma which defines a non-negative symbol set containing the symbols (gamma(sub m)) obtained as the extension of the original symbol set S. These pairs are then mapped into a comma code which is defined as a coding scheme in which every codeword is terminated with the same comma pattern, such as a 1. This allows a direct coding and decoding of the n-bit positive integer digital data differences without the use of codebooks.
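The pipeline described above can be sketched in a few lines. The mapping of signed differences to non-negative symbols and the exact pairing rule are illustrative assumptions (the abstract does not fix them); the comma code, in which every codeword terminates with the same pattern (here a single '1'), follows the description:

```python
def to_nonneg(diffs):
    # zigzag map: signed differences -> non-negative symbols s_i = i
    # (0, -1, 1, -2, ... -> 0, 1, 2, 3, ...); an assumed mapping
    return [2 * d if d >= 0 else -2 * d - 1 for d in diffs]

def pair_symbols(symbols, n):
    # pair consecutive symbols of the N-symbol set S into the extended
    # alphabet Gamma: the pair (a, b) becomes gamma = a*N + b
    if len(symbols) % 2:
        symbols = symbols + [0]          # pad odd-length input
    return [symbols[i] * n + symbols[i + 1]
            for i in range(0, len(symbols), 2)]

def comma_encode(m):
    # comma code: every codeword ends with the same comma pattern, a '1'
    return '0' * m + '1'

def comma_decode(bits):
    # scan for the terminating '1' of each codeword; no codebook needed
    out, run = [], 0
    for b in bits:
        if b == '1':
            out.append(run)
            run = 0
        else:
            run += 1
    return out
```

Encoding the extended-symbol stream [2, 4] yields '001' + '00001' = '00100001', which the decoder recovers directly, illustrating why no codebook is required.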
Effects of environmental sounds on the guessability of animated graphic symbols.
Harmon, Ashley C; Schlosser, Ralf W; Gygi, Brian; Shane, Howard C; Kong, Ying-Yee; Book, Lorraine; Macduff, Kelly; Hearn, Emilia
2014-12-01
Graphic symbols are a necessity for pre-literate children who use aided augmentative and alternative communication (AAC) systems (including non-electronic communication boards and speech generating devices), as well as for mobile technologies using AAC applications. Recently, developers of the Autism Language Program (ALP) Animated Graphics Set have added environmental sounds to animated symbols representing verbs in an attempt to enhance their iconicity. The purpose of this study was to examine the effects of environmental sounds (added to animated graphic symbols representing verbs) in terms of naming. Participants included 46 children with typical development between the ages of 3;0 to 3;11 (years;months). The participants were randomly allocated to a condition of symbols with environmental sounds or a condition without environmental sounds. Results indicated that environmental sounds significantly enhanced the naming accuracy of animated symbols for verbs. Implications in terms of symbol selection, symbol refinement, and future symbol development will be discussed.
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution
Imai, Mutsumi; Kita, Sotaro
2014-01-01
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. PMID:25092666
Norman, Laura M.; Niraula, Rewati
2016-01-01
The objective of this study was to evaluate the effect of check dam infrastructure on soil and water conservation at the catchment scale using the Soil and Water Assessment Tool (SWAT). This paired-watershed study includes a watershed treated with over 2000 check dams and a Control watershed which has none, in the West Turkey Creek watershed, Southeast Arizona, USA. SWAT was calibrated for streamflow using discharge documented during the summer of 2013 at the Control site. Model results demonstrate the necessity of eliminating lateral flow from SWAT models of aridland environments, the urgency of standardizing geospatial soils data, and the care with which modelers must document parameter changes when presenting findings. Performance was assessed using the percent bias (PBIAS), with values of ±2.34%. The calibrated model was then used to examine the impacts of check dams at the Treated watershed. Approximately 630 tons of sediment is estimated to be stored behind check dams in the Treated watershed over the 3-year simulation, improving water quality for fish habitat. A minimum precipitation event of 15 mm was necessary to instigate the detachment of soil, sediments, or rock from the study area, which occurred 2% of the time. The resulting watershed model is useful as a predictive framework and decision-support tool to consider long-term impacts of restoration and potential for future restoration.
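The PBIAS metric used to assess performance can be computed in a few lines. This sketch follows the common hydrology convention (sum of observed minus simulated, as a percentage of the observed sum); note that sign conventions vary between references:

```python
def pbias(observed, simulated):
    # Percent bias: average tendency of simulated values to differ from
    # observations; 0 is optimal, and under this sign convention a
    # positive value indicates underestimation by the model
    num = sum(o - s for o, s in zip(observed, simulated))
    return 100.0 * num / sum(observed)
```

For example, observed flows [10, 20, 30] against simulated [11, 19, 31] give a small negative bias of about -1.67%, well within the ±2.34% reported above.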
Joint Carrier-Phase Synchronization and LDPC Decoding
NASA Technical Reports Server (NTRS)
Simon, Marvin; Valles, Esteban
2009-01-01
A method has been proposed to increase the degree of synchronization of a radio receiver with the phase of a suppressed carrier signal modulated with a binary-phase-shift-keying (BPSK) or quaternary-phase-shift-keying (QPSK) signal representing a low-density parity-check (LDPC) code. This method is an extended version of the method described in Using LDPC Code Constraints to Aid Recovery of Symbol Timing (NPO-43112), NASA Tech Briefs, Vol. 32, No. 10 (October 2008), page 54. Both methods and the receiver architectures in which they would be implemented belong to a class of timing-recovery methods and corresponding receiver architectures characterized as pilotless in that they do not require transmission and reception of pilot signals. The proposed method calls for the use of what is known in the art as soft decision feedback to remove the modulation from a replica of the incoming signal prior to feeding this replica to a phase-locked loop (PLL) or other carrier-tracking stage in the receiver. Soft decision feedback refers to suitably processed versions of intermediate results of iterative computations involved in the LDPC decoding process. Unlike a related prior method in which hard decision feedback (the final sequence of decoded symbols) is used to remove the modulation, the proposed method does not require estimation of the decoder error probability. In a basic digital implementation of the proposed method, the incoming signal (having carrier phase theta(sub c)) plus noise would first be converted to in-phase (I) and quadrature (Q) baseband signals by mixing it with I and Q signals at the carrier frequency [omega(sub c)/(2 pi)] generated by a local oscillator. The resulting demodulated signals would be processed through one-symbol-period integrate-and-dump filters, the outputs of which would be sampled and held, then multiplied by a soft-decision version of the baseband modulated signal.
The resulting I and Q products consist of terms proportional to the cosine and sine of the carrier phase theta(sub c) as well as correlated noise components. These products would be fed as inputs to a digital PLL that would include a number-controlled oscillator (NCO), which provides an estimate of the carrier phase, theta(sub c).
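A heavily simplified sketch of this soft-decision carrier-tracking idea is shown below for BPSK. Here a tanh of the in-phase component stands in for the soft symbol estimate that, in the proposed receiver, would come from the LDPC decoder's intermediate results; the loop gains, the tanh nonlinearity, and the second-order loop structure are all illustrative assumptions, not the NASA design:

```python
import numpy as np

def soft_decision_pll(samples, sps, kp=0.05, ki=0.002):
    """Track the carrier phase of baseband BPSK via soft-decision feedback."""
    phase, freq = 0.0, 0.0
    history = []
    for k in range(len(samples) // sps):
        blk = samples[k * sps:(k + 1) * sps]
        # derotate by the NCO phase estimate, then integrate-and-dump
        # over one symbol period
        z = np.mean(blk) * np.exp(-1j * phase)
        # soft-decision feedback: tanh(I) stands in for the modulation,
        # making the error term proportional to sin(phase error)
        err = z.imag * np.tanh(z.real)
        freq += ki * err               # second-order loop: frequency integrator
        phase += kp * err + freq       # NCO phase update
        history.append(phase)
    return np.array(history)
```

With a fixed 0.3 rad carrier-phase offset and no noise, this loop converges to the true phase within a few hundred symbols; removing the modulation via the soft decision is what lets the PLL track a suppressed carrier.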
NASA Astrophysics Data System (ADS)
Witt, Annette; Ehlers, Frithjof; Luther, Stefan
2017-09-01
We have analyzed symbol sequences of heart beat annotations obtained from 24-h electrocardiogram recordings of 184 post-infarction patients (from the Cardiac Arrhythmia Suppression Trial database, CAST). In the symbol sequences, each heart beat was coded as an arrhythmic or as a normal beat. The symbol sequences were analyzed with a model-based approach which relies on a two-parametric peaks-over-threshold (POT) model, interpreting each premature ventricular contraction (PVC) as an extreme event. For the POT model, we explored (i) the Shannon entropy, which was estimated in terms of the Lempel-Ziv complexity, (ii) the shape parameter of the Weibull distribution that best fits the PVC return times, and (iii) the strength of long-range correlations quantified by detrended fluctuation analysis (DFA) for the two-dimensional parameter space. We have found that in the frame of our model the Lempel-Ziv complexity is functionally related to the shape parameter of the Weibull distribution. Thus, two complementary measures (entropy and strength of long-range correlations) are sufficient to characterize realizations of the two-parametric model. For the CAST data, we have found evidence for an intermediate strength of long-range correlations in the PVC timings, which correlates with the age of the patient: younger post-infarction patients have a higher strength of long-range correlations than older patients. The normalized Shannon entropy has values in the range 0.5
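The Lempel-Ziv complexity used here to estimate the Shannon entropy can be computed with the classic LZ76 exhaustive-history parsing. A common entropy-rate estimate then normalizes the phrase count c(n) as h ≈ c(n) · log2(n) / n; the exact normalization used in the paper is not given in the abstract, so this sketch only shows the parsing itself:

```python
def lempel_ziv_complexity(s):
    # LZ76 parsing: count phrases, where each new phrase is the shortest
    # prefix of the remaining sequence that has not yet occurred in the
    # portion of the sequence scanned so far
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c
```

The textbook example '0001101001000101' parses into 6 phrases (0 | 001 | 10 | 100 | 1000 | 101), while a strictly periodic sequence such as '0101…' stays at a small constant count, which is why the phrase count tracks the entropy of the beat-annotation sequence.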
New Results in Software Model Checking and Analysis
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.
2010-01-01
This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.
Preventing SQL Code Injection by Combining Static and Runtime Analysis
2008-05-01
attacker changes the developer’s intended structure of an SQL command by inserting new SQL keywords or operators. (Su and Wassermann provide a ... FROM books WHERE author = ' ' GROUP BY rating ... We use symbol as a placeholder for the indeterminate part of the command (in this ... dialects of SQL.) In our model, we mark transitions that correspond to externally defined strings with the symbol . To illustrate, Figure 2 shows the SQL
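The structural view of injection sketched in this excerpt can be illustrated with a toy example. The paper's actual model works over a finite-state representation of the SQL grammar with marked transitions; the regex tokenizer and `sql_skeleton` helper below are hypothetical simplifications of that idea:

```python
import re

def sql_skeleton(cmd):
    # reduce a command to its keyword/operator structure, collapsing
    # every quoted string literal to a single LITERAL token
    toks = re.findall(r"'[^']*'|\w+|[=;()<>]", cmd)
    return ["LITERAL" if t.startswith("'") else t.upper() for t in toks]

template = "SELECT title FROM books WHERE author = '{}'"
benign = sql_skeleton(template.format("Tolkien"))
attack = sql_skeleton(template.format("' OR '1'='1"))
# a benign input only fills the literal slot; the injected input adds
# new keywords and operators, changing the command's structure
assert benign != attack
```

The benign skeleton ends in `= LITERAL`, whereas the injected input introduces an extra `OR` and `=`: exactly the change of intended command structure that the static/runtime analysis is designed to detect.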
An engineering approach to automatic programming
NASA Technical Reports Server (NTRS)
Rubin, Stuart H.
1990-01-01
An exploratory study of the automatic generation and optimization of symbolic programs using DECOM - a prototypical requirement specification model implemented in pure LISP - was undertaken. It was concluded, on the basis of this study, that symbolic processing languages such as LISP can support a style of programming based upon formal transformation and dependent upon the expression of constraints in an object-oriented environment. Such languages can represent all aspects of the software generation process (including heuristic algorithms for effecting parallel search) as dynamic processes, since data and program are represented in a uniform format.
Learning fuzzy information in a hybrid connectionist, symbolic model
NASA Technical Reports Server (NTRS)
Romaniuk, Steve G.; Hall, Lawrence O.
1993-01-01
An instance-based learning system is presented. SC-net is a fuzzy hybrid connectionist, symbolic learning system. It remembers some examples and makes groups of examples into exemplars. All real-valued attributes are represented as fuzzy sets. The network representation and learning method is described. To illustrate this approach to learning in fuzzy domains, an example of segmenting magnetic resonance images of the brain is discussed. Clearly, the boundaries between human tissues are ill-defined or fuzzy. Example fuzzy rules for recognition are generated. Segmentations are presented that provide results that radiologists find useful.
Giannini, A J
1993-12-01
Visual art was used to teach the biopsychiatric model of addiction to audiences in the Caribbean, Europe and Mideast. Art slides were tangentially linked to slides of pharmacological data. Stylistically dense art was processed by the intuitive right brain while spare notational pharmacological data was processed by the intellectual (rationalistic) left brain. Simultaneous presentation of these data enhanced attention and retention. This teaching paradigm was based on the nonliterate methods developed by Medieval architects and refined by Italian Renaissance philosopher, Marsilio Ficino.
NASA Astrophysics Data System (ADS)
Vostokov, S. V.
1982-04-01
The theory of a continuous Steinberg symbol in a local field is generalized to formal commutative groups. For Lubin-Tate groups, a universal symbol is constructed in explicit form, and it is shown that the module of values of an arbitrary symbol imbeds into the group of points of the formal group. By means of this theory of symbols a new approach is given to obtaining an explicit form for the Hilbert norm residue symbol on Lubin-Tate formal groups. Bibliography: 10 titles.
Wilkinson, Krista M.; Snell, Julie
2012-01-01
Purpose Communication about feelings is a core element of human interaction. Aided augmentative and alternative communication systems must therefore include symbols representing these concepts. The symbols must be readily distinguishable in order for users to communicate effectively. However, emotions are represented within most systems by schematic faces in which subtle distinctions are difficult to represent. We examined whether background color cuing and spatial arrangement might help children identify symbols for different emotions. Methods Thirty nondisabled children searched for symbols representing emotions within an 8-choice array. On some trials, a color cue signaled the valence of the emotion (positive vs. negative). Additionally, symbols were either organized with the negatively-valenced symbols at the top and the positive symbols on the bottom of the display, or the symbols were distributed randomly throughout. Dependent variables were accuracy and speed of responses. Results The speed with which children could locate a target was significantly faster for displays in which symbols were clustered by valence, but only when the symbols had white backgrounds. Addition of a background color cue did not facilitate responses. Conclusions Rapid search was facilitated by a spatial organization cue, but not by the addition of background color. Further examination of the situations in which color cues may be useful is warranted. PMID:21813821
[Origin of three symbols in medicine and surgery].
de la Garza-Villaseñor, Lorenzo
2010-01-01
Humans use many ways to communicate with fellow humans. Symbols have been one of these ways. Shamans probably used these in the beginning and adopted other distinctive symbols as they were introduced. The origin, the reason and use of three symbols in medicine and surgery are discussed. Some symbols currently remain the same and others have been modified or have disappeared. The oldest of these three symbols is the staff of Aesculapius, related to the Greek god of medicine and health. Since the 19th century, in some countries the symbol of the medical profession has become the caduceus, but the staff is the natural symbol. The second symbol is the barber pole that was created at the beginning of the Middle Ages. This was the means to locate the office and shop of a barber/surgeon in towns, cities and battlefields. On the other hand, the surgeon made use of the emblem of the union, trade or fraternity to which he belonged, accompanied by the bowl for bloodletting. The third symbol is the wearing of long and short robes that distinguished graduate surgeons from a medical school and the so-called barber/surgeons. Symbols facilitate the manner in which to identify the origin or trade of many working people. Some symbols currently remain and others have either been modified or are obsolete, losing their relationship with surgery and medicine.
Symbolic play and language development.
Orr, Edna; Geva, Ronny
2015-02-01
Symbolic play and language are known to be highly interrelated, but the developmental process involved in this relationship is not clear. Three hypothetical paths were postulated to explore how play and language drive each other: (1) direct paths, whereby initiation of basic forms in symbolic action or babbling will be directly related to all later-emerging language and motor outputs; (2) an indirect interactive path, whereby basic forms in symbolic action will be associated with more complex forms in symbolic play, as well as with babbling, and babbling mediates the relationship between symbolic play and speech; and (3) a dual path, whereby basic forms in symbolic play will be associated with basic forms of language, and complex forms of symbolic play will be associated with complex forms of language. We micro-coded 288 symbolic vignettes gathered during a yearlong prospective bi-weekly examination (N=14; from 6 to 18 months of age). Results showed that the age of initiation of single-object symbolic play correlates strongly with the age of initiation of later-emerging symbolic and vocal outputs; its frequency at initiation is correlated with frequency at initiation of babbling, later-emerging speech, and multi-object play. Results support the notion that single-object play relates to the development of other symbolic forms via a direct relationship and an indirect relationship, rather than the dual-path hypothesis. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Desy Fatmaryanti, Siska; Suparmi; Sarwanto; Ashadi
2017-11-01
This study describes the attainment of students' conceptions of the magnetic field, as revealed through direct observation and symbolic language ability. The method used is descriptive quantitative research. The subjects were 86 students from 3 senior high schools in Purworejo. The learning process followed a guided inquiry model. During the learning, students were required to actively investigate the concept of the magnetic field around a straight current-carrying wire. Data were collected using a reasoned multiple-choice test and observation during the learning process. Four indicators of direct observation ability and four indicators of symbolic language ability were used to categorize students' conceptions. The average scores showed that, in terms of symbolic language, students' conception of the magnitude of the magnetic field was better than their conception of its direction. From the observations, we found that students could draw magnetic field lines not from a textbook but from their own direct observation results, and they used various strategies to achieve good accuracy in their observations. Explicit recommendations are presented in the discussion section at the end of this paper.
Information entropy of humpback whale songs.
Suzuki, Ryuji; Buck, John R; Tyack, Peter L
2006-03-01
The structure of humpback whale (Megaptera novaeangliae) songs was examined using information theory techniques. The song is an ordered sequence of individual sound elements separated by gaps of silence. Song samples were converted into sequences of discrete symbols by both human and automated classifiers. This paper analyzes the song structure in these symbol sequences using information entropy estimators and autocorrelation estimators. Both parametric and nonparametric entropy estimators are applied to the symbol sequences representing the songs. The results provide quantitative evidence consistent with the hierarchical structure proposed for these songs by Payne and McVay [Science 173, 587-597 (1971)]. Specifically, this analysis demonstrates that: (1) There is a strong structural constraint, or syntax, in the generation of the songs, and (2) the structural constraints exhibit periodicities with periods of 6-8 and 180-400 units. This implies that no empirical Markov model is capable of representing the songs' structure. The results are robust to the choice of either human or automated song-to-symbol classifiers. In addition, the entropy estimates indicate that the maximum amount of information that could be communicated by the sequence of sounds made is less than 1 bit per second.
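The plug-in entropy estimators that underlie such an analysis are straightforward to sketch. The following Python fragment (our illustration; function names are ours, and the paper's own classifiers and estimators are more sophisticated) computes a block-entropy and entropy-rate estimate for a discrete symbol sequence:

```python
from collections import Counter
from math import log2

def block_entropy(seq, L):
    """Plug-in (naive) Shannon entropy, in bits, of length-L words."""
    words = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(words.values())
    return -sum((c / n) * log2(c / n) for c in words.values())

def entropy_rate_estimate(seq, L):
    """h(L) = H(L) - H(L-1): entropy of the next symbol given the
    preceding L-1 symbols; as L grows it approaches the entropy rate."""
    return block_entropy(seq, L) - block_entropy(seq, L - 1)

# A strongly constrained (here, periodic) symbol stream has an entropy
# rate near zero, far below the 1 bit/symbol of a fair-coin stream.
print(entropy_rate_estimate("ABAB" * 100, 2))  # ≈ 0
```

A per-second information rate, like the paper's figure of less than 1 bit per second, then follows by multiplying the per-symbol entropy rate by the rate at which sound units are produced.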
Diagrams benefit symbolic problem-solving.
Chu, Junyi; Rittle-Johnson, Bethany; Fyfe, Emily R
2017-06-01
The format of a mathematics problem often influences students' problem-solving performance. For example, providing diagrams in conjunction with story problems can benefit students' understanding, choice of strategy, and accuracy on story problems. However, it remains unclear whether providing diagrams in conjunction with symbolic equations can benefit problem-solving performance as well. We tested the impact of diagram presence on students' performance on algebra equation problems to determine whether diagrams increase problem-solving success. We also examined the influence of item- and student-level factors to test the robustness of the diagram effect. We worked with 61 seventh-grade students who had received 2 months of pre-algebra instruction. Students participated in an experimenter-led classroom session. Using a within-subjects design, students solved algebra problems in two matched formats (equation and equation-with-diagram). The presence of diagrams increased equation-solving accuracy and the use of informal strategies. This diagram benefit was independent of student ability and item complexity. The benefits of diagrams found previously for story problems generalized to symbolic problems. The findings are consistent with cognitive models of problem-solving and suggest that diagrams may be a useful additional representation of symbolic problems. © 2017 The British Psychological Society.
The Cognitive Content of the World of Symbols in a Language
ERIC Educational Resources Information Center
Zhirenov, Sayan A.; Satemirova, Darikha A.; Ibraeva, Aizat D.; Tanzharikova, Alua V.
2016-01-01
The purpose of this study is to analyze the meaning of symbols, the symbolic world, in linguistics. Using the methods of observation, analysis, synthesis and interpretation, the authors determine the category of symbols in linguistic-cognitive research. The study delineates the connection between the linguistic image of the universe and symbolic categories…
Two-year-olds' understanding of self-symbols.
Herold, Katherine; Akhtar, Nameera
2014-09-01
This study investigated 48 2.5-year-olds' ability to map from their own body to a two-dimensional self-representation and also examined relations between parents' talk about body representations and their children's understanding of self-symbols. Children participated in two dual-representation tasks in which they were asked to match body parts between a symbol and its referent. In one task, they used a self-symbol and in the other they used a symbol for a doll. Participants were also read a book about body parts by a parent. As a group, children found the self-symbol task more difficult than the doll-task; however, those whose parents explicitly pointed out the relation between their children's bodies and the symbols in the book performed better on the self-symbol task. The findings demonstrate that 2-year-old children have difficulty comprehending a self-symbol, even when it is two-dimensional and approximately the same size as them, and suggest that parents' talk about self-symbols may facilitate their understanding. © 2014 The British Psychological Society.
Schlosser, Ralf W; Shane, Howard; Sorce, James; Koul, Rajinder; Bloomfield, Emma; Debrowski, Lisa; DeLuca, Tim; Miller, Stephanie; Schneider, Danielle; Neff, Allison
2012-04-01
The effects of animation on transparency, name agreement, and identification of graphic symbols for verbs and prepositions were evaluated in preschoolers of 3 age groups. Methods A mixed-group design was used; in each age group, half of the children were randomly allocated to 1 of 2 orders of symbol formats. The 52 children were asked to guess the meaning of symbols and to identify a target symbol among foils given the spoken label. Animated symbols were more transparent than static symbols, although this was more pronounced for verbs. Animated verbs were named more accurately than static verbs, but there was no difference between animated and static prepositions. Verbs were identified more accurately compared with prepositions, but there was no difference between symbol formats. Older children guessed, named, and identified symbols more effectively than younger children. Animation enhances transparency and name agreement, especially for verbs, which reduces the instructional burden that comes with nontransparent symbols. Animation does not enhance identification accuracy. Verbs are easier to identify than prepositions. A developmental effect was observed for each measure. Limitations and implications for future research are discussed.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1989-01-01
Two aspects of the work for NASA are examined: the construction of multi-dimensional phase modulation trellis codes and a performance analysis of these codes. A complete list of the best trellis codes for use with phase modulation is given. LxMPSK signal constellations are included for M = 4, 8, and 16 and L = 1, 2, 3, and 4. Spectral efficiencies range from 1 bit/channel symbol (equivalent to rate 1/2 coded QPSK) to 3.75 bits/channel symbol (equivalent to 15/16 coded 16-PSK). The parity check polynomials, rotational invariance properties, free distance, path multiplicities, and coding gains are given for all codes. These codes are considered to be the best candidates for implementation of a high speed decoder for satellite transmission. The design of a hardware decoder for one of these codes, viz., the 16-state 3x8-PSK code with free distance 4.0 and coding gain 3.75 dB, is discussed. An exhaustive simulation study of the multi-dimensional phase modulation trellis codes is included. This study was motivated by the fact that coding gains quoted for almost all codes found in the literature are in fact only asymptotic coding gains, i.e., the coding gain at very high signal-to-noise ratios (SNRs) or very low bit error rates (BERs). These asymptotic coding gains can be obtained directly from a knowledge of the free distance of the code. On the other hand, real coding gains at BERs in the range of 10(exp -2) to 10(exp -6), where these codes are most likely to operate in a concatenated system, must be determined by simulation.
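As a quick sanity check on the spectral-efficiency figures quoted above: for rate-r coded M-PSK, the efficiency is r·log2(M) information bits per channel symbol. A two-line sketch (ours, not from the report):

```python
from math import log2

def spectral_efficiency(code_rate, M):
    """Information bits carried per channel symbol by rate-code_rate coded M-PSK."""
    return code_rate * log2(M)

print(spectral_efficiency(1 / 2, 4))     # 1.0  (rate-1/2 coded QPSK)
print(spectral_efficiency(15 / 16, 16))  # 3.75 (rate-15/16 coded 16-PSK)
```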
Flexible Automatic Discretization for Finite Differences: Eliminating the Human Factor
NASA Astrophysics Data System (ADS)
Pranger, Casper
2017-04-01
In the geophysical numerical modelling community, finite differences are (in part due to their small footprint) a popular spatial discretization method for PDEs in the regular-shaped continuum that is the earth. However, they rapidly become prone to programming mistakes as the physics increases in complexity. To eliminate opportunities for human error, we have designed an automatic discretization algorithm using Wolfram Mathematica, in which the user supplies symbolic PDEs, the number of spatial dimensions, and a choice of symbolic boundary conditions, and the script transforms this information into matrix and right-hand-side rules ready for use in a C++ code that will accept them. The symbolic PDEs are further used to automatically develop and perform manufactured-solution benchmarks, ensuring physical fidelity at all stages while providing pragmatic targets for numerical accuracy. We find that this procedure greatly accelerates code development and provides a great deal of flexibility in one's choice of physics.
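The core of such an automatic discretization step, turning symbolic stencil points into finite-difference coefficients, can be illustrated without Mathematica. The sketch below is our own illustration (the function name `fd_weights` is ours, not the authors'): it solves the Vandermonde moment system for the stencil weights exactly, using rational arithmetic:

```python
from fractions import Fraction

def fd_weights(points, m):
    """Finite-difference weights for the m-th derivative at x=0 on the given
    stencil points, from the moment system sum_i w_i * x_i**k = k! * delta(k, m)
    for k = 0 .. len(points)-1, solved with exact rational arithmetic."""
    n = len(points)
    fact, rhs = 1, []
    for k in range(n):
        if k > 0:
            fact *= k
        rhs.append(Fraction(fact) if k == m else Fraction(0))
    A = [[Fraction(x) ** k for x in points] for k in range(n)]
    # Gauss-Jordan elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(n):
            if r != col and A[r][col] != 0:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
                rhs[r] -= f * rhs[col]
    return [rhs[i] / A[i][i] for i in range(n)]

# Centred 3-point stencil for the second derivative (unit spacing):
print(fd_weights([-1, 0, 1], 2))  # the classic weights 1, -2, 1
```

The same moment system, written with symbolic grid spacing inside a computer-algebra system, is what lets a script emit discretization rules for arbitrary stencils automatically.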
A pattern jitter free AFC scheme for mobile satellite systems
NASA Technical Reports Server (NTRS)
Yoshida, Shousei
1993-01-01
This paper describes a scheme for pattern-jitter-free automatic frequency control (AFC) with a wide frequency acquisition range. In this scheme, equalizing signals fed to the frequency discriminator allow pattern-jitter-free performance to be achieved for all roll-off factors. In order to define the acquisition range, frequency discrimination characteristics are analyzed on a newly derived frequency domain model. As a result, it is shown that a sufficiently wide acquisition range over a given system symbol rate can be achieved independent of symbol timing errors. Additionally, computer simulation demonstrates that frequency jitter performance improves in proportion to E(sub b)/N(sub 0) because pattern-dependent jitter is suppressed in the discriminator output. These results show significant promise for application to mobile satellite systems, which feature relatively low symbol rate transmission with an approximately 0.4-0.7 roll-off factor.
Decomposition of conditional probability for high-order symbolic Markov chains.
Melnik, S S; Usatenko, O V
2017-07-01
The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.
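In the simplest fully empirical setting, the conditional-probability estimation and iterative sequence construction the abstract describes look like the sketch below (ours; the paper's memory-function decomposition generalizes this to high orders and weak-correlation expansions):

```python
import random
from collections import Counter, defaultdict

def fit_markov(seq, order):
    """Empirical conditional probabilities P(next | preceding `order` symbols)."""
    counts = defaultdict(Counter)
    for i in range(order, len(seq)):
        counts[tuple(seq[i - order:i])][seq[i]] += 1
    return {ctx: {a: c / sum(ctr.values()) for a, c in ctr.items()}
            for ctx, ctr in counts.items()}

def generate(model, seed, n, rng):
    """Grow an artificial sequence by sampling from the fitted conditionals."""
    out, order = list(seed), len(seed)
    for _ in range(n):
        probs = model[tuple(out[-order:])]
        out.append(rng.choices(list(probs), weights=list(probs.values()))[0])
    return out

model = fit_markov(list("AB" * 100), 1)
print(model[("A",)])                                          # {'B': 1.0}
print("".join(generate(model, ["A"], 5, random.Random(0))))   # ABABAB
```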
Symbolic Dynamics of Reanalysis Data
NASA Astrophysics Data System (ADS)
Larson, J. W.; Dickens, P. M.
2003-12-01
Symbolic dynamics [1] is the study of sequences of symbols belonging to a discrete set of elements, the most common example being a sequence of ones and zeroes. Often the set of symbols is derived from a timeseries of a continuous variable through the introduction of a partition function--a process called symbolization. Symbolic dynamics has been used widely in the physical sciences; a geophysical example being the application of C1 and C2 complexity [2] to hourly precipitation station data [3]. The C1 and C2 complexities are computed by examining subsequences--or words--of fixed length L in the limit of large values of L. Recent advances in information theory have led to techniques focused on the growth rate of the Shannon entropy and its asymptotic behavior in the limit of long words--levels of entropy convergence [4]. The result is a set of measures one can use to quantify the amount of memory stored in the sequence, whether or not an observer is able to synchronize to the sequence, and with what confidence it may be predicted. These techniques may also be used to uncover periodic behavior in the sequence. We are currently applying complexity theory and levels of entropy convergence to gridpoint timeseries from the NCAR/NCEP 50-year reanalysis [5]. Topics to be discussed include: a brief introduction to symbolic dynamics; a description of the partition function/symbolization strategy; a discussion of C1 and C2 complexity and entropy convergence rates and their utility; and example applications of these techniques to NCAR/NCEP 50-year reanalysis gridpoint timeseries, resulting in maps of C1 and C2 complexities and entropy convergence rates. Finally, we will discuss how these results may be used to validate climate models. [1] Hao, Bai-Lin, Elementary Symbolic Dynamics and Chaos in Dissipative Systems, World Scientific, Singapore (1989). [2] d'Alessandro, G. and Politi, A., Phys. Rev. Lett., 64, 1609-1612 (1990). [3] Elsner, J. and Tsonis, A., J. Atmos. Sci., 50, 400-405 (1993). [4] Crutchfield, J. and Feldman, D., Chaos, 13, 25-54 (2003). [5] Kalnay, E., Kanamitsu, M., Kistler, R., Collins, W., Deaven, D., Gandin, L., Iredell, M., Saha, S., White, G., Woolen, J., Zhu, Y., Chelliah, M., Ebisuzaki, W., Higgins, W., Janowiak, J., Mo, K. C., Ropelewski, C., Wang, J., Leetmaa, A., Reynolds, R., Jenne, R., and Joseph, D., Bull. Amer. Met. Soc., 77, 437-471 (1996).
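A minimal illustration of the partition-function/symbolization step and the word-entropy computation it feeds (our sketch; the reanalysis work itself uses C1/C2 complexity and entropy-convergence estimators):

```python
import math
from collections import Counter

def symbolize(series, threshold):
    """Binary partition function: map each continuous sample to 0 or 1."""
    return [1 if x >= threshold else 0 for x in series]

def word_entropy(seq, L):
    """Plug-in Shannon entropy (bits) of the length-L words in seq."""
    words = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(words.values())
    return -sum((c / n) * math.log2(c / n) for c in words.values())

# Symbolizing a sine wave about zero yields a highly constrained stream:
# its word entropy grows far more slowly with L than the L bits per word
# of a random binary sequence.
series = [math.sin(2 * math.pi * t / 20) for t in range(2000)]
s = symbolize(series, 0.0)
for L in (1, 2, 4, 8):
    print(L, round(word_entropy(s, L), 3))
```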
ERIC Educational Resources Information Center
Edelson, Edward
1980-01-01
Described are the historical uses of and research involving the discipline of artificial intelligence. Topics discussed include: symbol manipulation; knowledge engineering; cognitive modeling; and language, vision and robotics. (Author/DS)
Reducing False Positives in Runtime Analysis of Deadlocks
NASA Technical Reports Server (NTRS)
Bensalem, Saddek; Havelund, Klaus; Clancy, Daniel (Technical Monitor)
2002-01-01
This paper presents an improvement of a standard algorithm for detecting deadlock potentials in multi-threaded programs, in that it reduces the number of false positives. The standard algorithm works as follows. The multi-threaded program under observation is executed while lock and unlock events are observed. A graph of locks is built, with edges between locks symbolizing locking orders. Any cycle in the graph signifies a potential for a deadlock. The standard example is a group of dining philosophers sharing forks. The algorithm is interesting because it can catch deadlock potentials even though no deadlocks occur in the examined trace, and at the same time it scales very well in contrast to more formal approaches to deadlock detection. The algorithm, however, can yield false positives (as well as false negatives). The extension of the algorithm described in this paper reduces the number of false positives in three particular cases: when a gate lock protects a cycle, when a single thread introduces a cycle, and when the code segments in different threads that cause the cycle cannot actually execute in parallel. The paper formalizes a theory for dynamic deadlock detection and compares it to model checking and static analysis techniques. It furthermore describes an implementation for analyzing Java programs and its application to two case studies: a planetary rover and a spacecraft attitude control system.
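The standard algorithm that the paper improves on can be sketched compactly: record a lock-order edge a → b whenever a thread acquires b while still holding a, then look for cycles in the resulting graph. This is a simplified reconstruction in Python, not the authors' Java implementation:

```python
from collections import defaultdict

def lock_order_edges(trace):
    """From a trace of (thread, op, lock) events, add an edge (a, b) each
    time a thread acquires lock b while still holding lock a."""
    held, edges = defaultdict(list), set()
    for thread, op, lock in trace:
        if op == "lock":
            edges.update((h, lock) for h in held[thread])
            held[thread].append(lock)
        else:  # "unlock"
            held[thread].remove(lock)
    return edges

def has_cycle(edges):
    """A cycle in the lock graph flags a deadlock *potential*, even when
    no deadlock occurred in the observed trace."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
    def reaches(src, dst, seen):
        if src == dst:
            return True
        seen.add(src)
        return any(reaches(n, dst, seen) for n in graph[src] if n not in seen)
    return any(reaches(b, a, set()) for a, b in edges)

# Two philosophers picking up their forks in opposite orders: this trace
# is itself deadlock-free, but the lock graph f1 -> f2 -> f1 has a cycle.
trace = [("T1", "lock", "f1"), ("T1", "lock", "f2"),
         ("T1", "unlock", "f2"), ("T1", "unlock", "f1"),
         ("T2", "lock", "f2"), ("T2", "lock", "f1"),
         ("T2", "unlock", "f1"), ("T2", "unlock", "f2")]
print(has_cycle(lock_order_edges(trace)))  # True
```

The paper's refinements prune such reported cycles when, for example, a common gate lock already serializes the two threads.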
NASA Astrophysics Data System (ADS)
Moissinac, Henri; Maitre, Henri; Bloch, Isabelle
1995-11-01
An image interpretation method is presented for the automatic processing of aerial pictures of an urban landscape. In order to improve the picture analysis, some a priori knowledge extracted from a geographic map is introduced. A coherent graph-based model of the city is built, starting with the road network. A global uncertainty management scheme has been designed in order to evaluate the confidence we can have in the final results. This model and the uncertainty management tend to reflect the hierarchy of the available data and the interpretation levels. The symbolic relationships linking the different kinds of elements are taken into account while propagating and combining the confidence measures along the interpretation process.
Learning Extended Finite State Machines
NASA Technical Reports Server (NTRS)
Cassel, Sofia; Howar, Falk; Jonsson, Bengt; Steffen, Bernhard
2014-01-01
We present an active learning algorithm for inferring extended finite state machines (EFSMs), combining data flow and control behavior. Key to our learning technique is a novel learning model based on so-called tree queries. The learning algorithm uses the tree queries to infer symbolic data constraints on parameters, e.g., sequence numbers, time stamps, identifiers, or even simple arithmetic. We describe sufficient conditions that the symbolic constraints provided by a tree query must in general satisfy to be usable in our learning model. We have evaluated our algorithm in a black-box scenario, where tree queries are realized through (black-box) testing. Our case studies include connection establishment in TCP and a priority queue from the Java Class Library.
NASA Astrophysics Data System (ADS)
Song, Tianyu; Kam, Pooi-Yuen
2016-02-01
Since atmospheric turbulence and pointing errors cause signal intensity fluctuations, and the background radiation surrounding the free-space optical (FSO) receiver contributes an undesired noisy component, the receiver requires accurate channel state information (CSI) and background information to adjust the detection threshold. In most previous studies, pilot symbols were employed for CSI acquisition, which reduces spectral and energy efficiency, and the background radiation component was impractically assumed to be perfectly known. In this paper, we develop an efficient and robust sequence receiver, which acquires the CSI and the background information implicitly and requires no knowledge of the channel model. It is robust since it can automatically estimate the CSI and background component and detect the data sequence accordingly. Its decision metric has a simple form, involves no integrals, and thus can be easily evaluated. A Viterbi-type trellis-search algorithm is adopted to improve the search efficiency, and a selective-store strategy is adopted to overcome a potential error floor problem as well as to increase the memory efficiency. To further simplify the receiver, a decision-feedback symbol-by-symbol receiver is proposed as an approximation of the sequence receiver. By simulations and theoretical analysis, we show that the performance of both the sequence receiver and the symbol-by-symbol receiver approaches that of detection with perfect knowledge of the CSI and background radiation as the length of the window for forming the decision metric increases.
A modular architecture for transparent computation in recurrent neural networks.
Carmantini, Giovanni S; Beim Graben, Peter; Desroches, Mathieu; Rodrigues, Serafim
2017-01-01
Computation is classically studied in terms of automata, formal languages and algorithms; yet, the relation between neural dynamics and symbolic representations and operations is still unclear in traditional eliminative connectionism. We therefore suggest a new perspective on this central issue, which we refer to as transparent connectionism, by proposing accounts of how symbolic computation can be implemented in neural substrates. In this study we first introduce a new model of dynamics on a symbolic space, the versatile shift, showing that it supports the real-time simulation of a range of automata. We then show that the Gödelization of versatile shifts defines nonlinear dynamical automata, dynamical systems evolving on a vectorial space. Finally, we present a mapping between nonlinear dynamical automata and recurrent artificial neural networks. The mapping defines an architecture characterized by its granular modularity, where data, symbolic operations and their control are not only distinguishable in activation space, but also spatially localizable in the network itself, while maintaining a distributed encoding of symbolic representations. The resulting networks simulate automata in real-time and are programmed directly, in the absence of network training. To discuss the unique characteristics of the architecture and their consequences, we present two examples: (i) the design of a Central Pattern Generator from a finite-state locomotive controller, and (ii) the creation of a network simulating a system of interactive automata that supports the parsing of garden-path sentences as investigated in psycholinguistics experiments. Copyright © 2016 Elsevier Ltd. All rights reserved.
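The Gödelization step, mapping a symbol sequence to a point in a vectorial space so that the symbolic shift becomes a piecewise-linear map on that space, can be illustrated in one dimension (our sketch, not the paper's full construction):

```python
def godelize(symbols, base):
    """Goedel encoding: read a one-sided symbol sequence as the base-`base`
    expansion of a point in the unit interval [0, 1)."""
    return sum(s * base ** -(i + 1) for i, s in enumerate(symbols))

def shift(x, base):
    """Under the encoding, the symbolic left shift becomes the
    piecewise-linear map x -> base * x mod 1 on the interval."""
    return (base * x) % 1.0

x = godelize([1, 0, 1, 1], 2)   # binary 0.1011
print(x)                        # 0.6875
print(shift(x, 2))              # 0.375, i.e. godelize([0, 1, 1], 2)
```

Higher-dimensional versions of this encoding (one coordinate for the left half of a bi-infinite sequence, one for the right) are what turn shift dynamics into the nonlinear dynamical automata the paper then maps onto recurrent networks.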
Abstraction Techniques for Parameterized Verification
2006-11-01
One approach for applying model checking to unbounded systems is to extract finite-state models from them using conservative abstraction techniques. Applying model checking to complex pieces of code like device drivers depends on the use of abstraction methods; an abstraction method extracts a small finite-state model from the system.
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution.
Imai, Mutsumi; Kita, Sotaro
2014-09-19
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
A Survey of New Trends in Symbolic Execution for Software Testing and Analysis
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Visser, Willem
2009-01-01
Symbolic execution is a well-known program analysis technique which represents values of program inputs with symbolic values instead of concrete (initialized) data and executes the program by manipulating program expressions involving the symbolic values. Symbolic execution was proposed over three decades ago, but it has recently found renewed interest in the research community, due in part to the progress in decision procedures, availability of powerful computers and new algorithmic developments. We provide a survey of some of the new research trends in symbolic execution, with particular emphasis on applications to test generation and program analysis. We first describe an approach that handles complex programming constructs such as input data structures, arrays, as well as multi-threading. We follow with a discussion of abstraction techniques that can be used to limit the (possibly infinite) number of symbolic configurations that need to be analyzed for the symbolic execution of looping programs. Furthermore, we describe recent hybrid techniques that combine concrete and symbolic execution to overcome some of the inherent limitations of symbolic execution, such as handling native code or availability of decision procedures for the application domain. Finally, we give a short survey of interesting new applications, such as predictive testing, invariant inference, program repair, analysis of parallel numerical programs and differential symbolic execution.
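In miniature, symbolic execution enumerates program paths under accumulated path conditions and prunes the infeasible ones with a decision procedure. The sketch below is ours, not from the survey; it substitutes a brute-force search over a small integer domain for the decision procedure:

```python
from itertools import product

def explore(branches, domain=range(-10, 11)):
    """Enumerate every combination of branch outcomes, keeping only those
    whose accumulated path condition is satisfiable over `domain` (the
    brute-force search plays the role of a decision procedure)."""
    feasible = []
    for choices in product([True, False], repeat=len(branches)):
        conds = [b if take else (lambda x, b=b: not b(x))
                 for b, take in zip(branches, choices)]
        if any(all(c(x) for c in conds) for x in domain):
            feasible.append(choices)
    return feasible

# Program under test:  if x > 0: ...   followed by   if x > 5: ...
branches = [lambda x: x > 0, lambda x: x > 5]
print(explore(branches))
# [(True, True), (True, False), (False, False)] -- the fourth path,
# x <= 0 and x > 5, is pruned as infeasible.
```

Real symbolic executors replace the brute-force check with SMT solving over unbounded domains, which is exactly where the progress in decision procedures mentioned above pays off.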
Agrillo, Christian; Piffer, Laura; Adriano, Andrea
2013-07-01
A significant debate surrounds the nature of the cognitive mechanisms involved in non-symbolic number estimation. Several studies have suggested the existence of the same cognitive system for estimation of time, space, and number, called "a theory of magnitude" (ATOM). In addition, researchers have proposed the theory that non-symbolic number abilities might support our mathematical skills. Despite the large number of studies carried out, no firm conclusions can be drawn on either topic. In the present study, we correlated the performance of adults on non-symbolic magnitude estimations and symbolic numerical tasks. Non-symbolic magnitude abilities were assessed by asking participants to estimate which auditory tone lasted longer (time), which line was longer (space), and which group of dots was more numerous (number). To assess symbolic numerical abilities, participants were required to perform mental calculations and mathematical reasoning. We found a positive correlation between non-symbolic and symbolic numerical abilities. On the other hand, no correlation was found among non-symbolic estimations of time, space, and number. Our study supports the idea that mathematical abilities rely on rudimentary numerical skills that predate verbal language. By contrast, the lack of correlation among non-symbolic estimations of time, space, and number is incompatible with the idea that these magnitudes are entirely processed by the same cognitive system.
The Automation of Nowcast Model Assessment Processes
2016-09-01
This effort automates real-time WRE-N model simulations, collecting and quality-control checking weather observations for assimilation and verification over domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide the observations. The automated processes: 1. Collect observations and perform quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-12
... Change Relating to Rebates and Fees for Adding and Removing Liquidity in Select Symbols April 6, 2012...'s Pricing Schedule entitled ``Rebates and Fees for Adding and Removing Liquidity in Select Symbols,'' specifically to remove various Select Symbols.\\3\\ \\3\\ The term ``Select Symbols'' refers to the symbols which...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-01
... Change Amending the Rebates and Fees for Adding and Removing Liquidity in Select Symbols January 26, 2012...'s Fee Schedule titled ``Rebates and Fees for Adding and Removing Liquidity in Select Symbols,'' specifically to remove various Select Symbols.\\3\\ \\3\\ The term ``Select Symbols'' refers to the symbols which...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-09
... Change Relating to the Rebates and Fees for Adding and Removing Liquidity in Select Symbols January 3...'s Fee Schedule titled ``Rebates and Fees for Adding and Removing Liquidity in Select Symbols,'' specifically to amend the Select Symbols.\\3\\ \\3\\ The term ``Select Symbols'' refers to the symbols which are...
Temporal Precedence Checking for Switched Models and its Application to a Parallel Landing Protocol
NASA Technical Reports Server (NTRS)
Duggirala, Parasara Sridhar; Wang, Le; Mitra, Sayan; Viswanathan, Mahesh; Munoz, Cesar A.
2014-01-01
This paper presents an algorithm for checking temporal precedence properties of nonlinear switched systems. This class of properties subsumes bounded safety and captures requirements about visiting a sequence of predicates within given time intervals. The algorithm handles nonlinear predicates that arise from dynamics-based predictions used in alerting protocols for state-of-the-art transportation systems. It is sound and complete for nonlinear switched systems that robustly satisfy the given property. The algorithm is implemented in the Compare Execute Check Engine (C2E2) using validated simulations. As a case study, a simplified model of an alerting system for closely spaced parallel runways is considered. The proposed approach is applied to this model to check safety properties of the alerting logic for different operating conditions such as initial velocities, bank angles, aircraft longitudinal separation, and runway separation.
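As a rough illustration of what a temporal precedence check over simulated traces looks like, here is a minimal sketch. The trace, predicates, and time window below are invented for illustration; this is not the C2E2 algorithm, which additionally accounts for reachability over-approximation of the nonlinear dynamics.

```python
# Hypothetical sketch: checking a temporal precedence property over a
# single sampled trace. A real checker (such as C2E2) works on validated
# reach sets, not a single simulation.

def holds_precedence(trace, pred_a, pred_b, window):
    """Check that every state satisfying pred_b is preceded by a state
    satisfying pred_a within `window` time units."""
    a_times = [t for t, x in trace if pred_a(x)]
    for t, x in trace:
        if pred_b(x):
            if not any(t - window <= ta <= t for ta in a_times):
                return False
    return True

# Toy trace of (time, state) pairs: the state grows linearly, so the
# "alert" predicate fires well before the "event" predicate.
trace = [(i * 0.1, i * 0.2) for i in range(100)]
alert = lambda x: x >= 5.0   # "alert issued"   (hypothetical predicate)
event = lambda x: x >= 9.0   # "event occurs"   (hypothetical predicate)
print(holds_precedence(trace, alert, event, window=25.0))
```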
Perrin, Maxine; Robillard, Manon; Roy-Charland, Annie
2017-12-01
This study examined eye movements during a visual search task as well as cognitive abilities within three age groups. The aim was to explore scanning patterns across symbol grids and to better understand the impact of symbol location in AAC displays on speed and accuracy of symbol selection. For the study, 60 students were asked to locate a series of symbols on 16 cell grids. The EyeLink 1000 was used to measure eye movements, accuracy, and response time. Accuracy was high across all cells. Participants had faster response times, longer fixations, and more frequent fixations on symbols located in the middle of the grid. Group comparisons revealed significant differences for accuracy and reaction times. The Leiter-R was used to evaluate cognitive abilities. Sustained attention and cognitive flexibility scores predicted the participants' reaction time and accuracy in symbol selection. Findings suggest that symbol location within AAC devices and individuals' cognitive abilities influence the speed and accuracy of retrieving symbols.
Automated Non-Alphanumeric Symbol Resolution in Clinical Texts
Moon, SungRim; Pakhomov, Serguei; Ryan, James; Melton, Genevieve B.
2011-01-01
Although clinical texts contain many symbols, relatively little attention has been given to symbol resolution by medical natural language processing (NLP) researchers. Interpreting the meaning of symbols may be viewed as a special case of Word Sense Disambiguation (WSD). One thousand instances of four common non-alphanumeric symbols (‘+’, ‘–’, ‘/’, and ‘#’) were randomly extracted from a clinical document repository and annotated by experts. The symbols and their surrounding context, in addition to bag-of-Words (BoW), and heuristic rules were evaluated as features for the following classifiers: Naïve Bayes, Support Vector Machine, and Decision Tree, using 10-fold cross-validation. Accuracies for ‘+’, ‘–’, ‘/’, and ‘#’ were 80.11%, 80.22%, 90.44%, and 95.00% respectively, with Naïve Bayes. While symbol context contributed the most, BoW was also helpful for disambiguation of some symbols. Symbol disambiguation with supervised techniques can be implemented with reasonable accuracy as a module for medical NLP systems. PMID:22195157
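The classification setup described above can be illustrated with a minimal bag-of-words Naive Bayes sketch. The tiny training set, context words, and the two senses of '+' below are invented for illustration; the study used expert-annotated clinical text, richer features, and 10-fold cross-validation.

```python
import math
from collections import Counter, defaultdict

def train(examples):
    """examples: list of (context_words, sense_label) pairs."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for words, label in examples:
        class_counts[label] += 1
        word_counts[label].update(words)
        vocab.update(words)
    return class_counts, word_counts, vocab

def classify(words, class_counts, word_counts, vocab):
    total = sum(class_counts.values())
    best, best_lp = None, float("-inf")
    for label, n in class_counts.items():
        lp = math.log(n / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in words:
            # Laplace smoothing so unseen words do not zero the likelihood
            lp += math.log((word_counts[label][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Hypothetical senses of '+': "positive finding" vs. "arithmetic plus".
data = [
    (["culture", "result"], "positive"),
    (["test", "result"], "positive"),
    (["dose", "mg"], "plus"),
    (["mg", "daily"], "plus"),
]
model = train(data)
print(classify(["culture", "test"], *model))   # → positive
```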
The effects of age on symbol comprehension in central rail hubs in Taiwan.
Liu, Yung-Ching; Ho, Chin-Heng
2012-11-01
The purpose of this study was to investigate the effects of age and symbol design features on passengers' comprehension of symbols and the performance of these symbols with regard to route guidance. In the first experiment, 30 young participants and 30 elderly participants interpreted the meanings and rated the features of 39 symbols. Researchers collected data on each subject's comprehension time, comprehension score, and feature ratings for each symbol. In the second experiment, this study used a series of photos to simulate scenarios in which passengers follow symbols to arrive at their destinations. The length of time each participant required to follow his/her route and his/her errors were recorded. Older adults experienced greater difficulty in understanding particular symbols as compared to younger adults. Familiarity was the feature most highly correlated with comprehension of symbols and accuracy of semantic depiction was the best predictor of behavior in following routes. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Symbolic Constraint Maintenance Grid
NASA Technical Reports Server (NTRS)
James, Mark
2006-01-01
Version 3.1 of Symbolic Constraint Maintenance Grid (SCMG) is a software system that provides a general conceptual framework for utilizing pre-existing programming techniques to perform symbolic transformations of data. SCMG also provides a language (and an associated communication method and protocol) for representing constraints on the original non-symbolic data. SCMG provides a facility for exchanging information between numeric and symbolic components without knowing the details of the components themselves. In essence, it integrates symbolic software tools (for diagnosis, prognosis, and planning) with non-artificial-intelligence software. SCMG executes a process of symbolic summarization and monitoring of continuous time series data that are being abstractly represented as symbolic templates of information exchange. This summarization process enables such symbolic-reasoning computing systems as artificial-intelligence planning systems to evaluate the significance and effects of channels of data more efficiently than would otherwise be possible. As a result of the increased efficiency in representation, reasoning software can monitor more channels and is thus able to perform monitoring and control functions more effectively.
Fuzzy Intervals for Designing Structural Signature: An Application to Graphic Symbol Recognition
NASA Astrophysics Data System (ADS)
Luqman, Muhammad Muzzamil; Delalandre, Mathieu; Brouard, Thierry; Ramel, Jean-Yves; Lladós, Josep
The motivation behind our work is to present a new methodology for symbol recognition. The proposed method employs a structural approach for representing visual associations in symbols and a statistical classifier for recognition. We vectorize a graphic symbol, encode its topological and geometrical information by an attributed relational graph, and compute a signature from this structural graph. We have addressed the sensitivity of structural representations to noise by using data-adapted fuzzy intervals. The joint probability distribution of signatures is encoded by a Bayesian network, which serves as a mechanism for pruning irrelevant features and choosing a subset of interesting features from the structural signatures of the underlying symbol set. The Bayesian network is deployed in a supervised learning scenario for recognizing query symbols. The method has been evaluated for robustness against degradations and deformations on pre-segmented 2D linear architectural and electronic symbols from the GREC databases, and for its recognition abilities on symbols with context noise, i.e., cropped symbols.
The method of a joint intraday security check system based on cloud computing
NASA Astrophysics Data System (ADS)
Dong, Wei; Feng, Changyou; Zhou, Caiqi; Cai, Zhi; Dan, Xu; Dai, Sai; Zhang, Chuancheng
2017-01-01
The intraday security check is the core application in the dispatching control system. The existing security check calculation uses only the dispatch center's local model and data as the functional margin. This paper introduces the design of an all-grid intraday joint security check system based on cloud computing and its implementation. To reduce the effect of subarea bad data on the all-grid security check, a new power flow algorithm based on comparison and adjustment against the inter-provincial tie-line plan is presented. A numerical example illustrates the effectiveness and feasibility of the proposed method.
Experiment of Enzyme Kinetics Using Guided Inquiry Model for Enhancing Generic Science Skills
NASA Astrophysics Data System (ADS)
Amida, N.; Supriyanti, F. M. T.; Liliasari
2017-02-01
This study aims to enhance the generic science skills of students using a guided inquiry model through enzyme kinetics experiments. The study used quasi-experimental methods with a pretest-posttest nonequivalent control group design. The subjects were chemistry students enrolled in a biochemistry lab course, consisting of 18 students in the experimental class and 19 students in the control class. The instruments were an essay test covering 5 indicators of generic science skills (i.e., direct observation, causality, symbolic language, mathematical modeling, and concept formation) and student worksheets. The results showed that the enzyme kinetics experiments using the guided inquiry model enhanced generic science skills in the high category with a value of
76 FR 35344 - Airworthiness Directives; Costruzioni Aeronautiche Tecnam srl Model P2006T Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-17
... retraction/extension ground checks performed on the P2006T, a loose Seeger ring was found on the nose landing... specified products. The mandatory continuing airworthiness information (MCAI) states: During Landing Gear retraction/extension ground checks performed on the...
Estimation of chaotic coupled map lattices using symbolic vector dynamics
NASA Astrophysics Data System (ADS)
Wang, Kai; Pei, Wenjiang; Cheung, Yiu-ming; Shen, Yi; He, Zhenya
2010-01-01
In [K. Wang, W.J. Pei, Z.Y. He, Y.M. Cheung, Phys. Lett. A 367 (2007) 316], an original symbolic-vector-dynamics-based method was proposed for initial condition estimation in an additive white Gaussian noise environment. The estimation precision of this method is determined by symbolic errors in the symbolic vector sequence obtained by symbolizing the received signal. This Letter further develops the symbolic vector dynamical estimation method. We correct symbolic errors using the backward vector and the values estimated with different symbols, and thus the estimation precision can be improved. Both theoretical and experimental results show that this algorithm enables us to recover the initial condition of a coupled map lattice exactly in both noisy and noise-free cases. We therefore provide novel analytical techniques for understanding turbulence in coupled map lattices.
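The core idea of recovering an initial condition from a symbol sequence can be illustrated on a single tent map, a deliberate simplification of the paper's coupled-map-lattice, noisy setting (this sketch is not the authors' algorithm). Each inverse branch of the map contracts by 1/2, so iterating backwards along the symbol sequence pins down the initial condition to within 2^-n.

```python
# Illustrative sketch of symbolic-dynamics initial condition estimation,
# using the tent map (noise-free case only).

def tent(x):
    return 2 * x if x < 0.5 else 2 * (1 - x)

def symbolize(x0, n):
    """Binary symbol sequence of an orbit of length n."""
    seq, x = [], x0
    for _ in range(n):
        seq.append(0 if x < 0.5 else 1)
        x = tent(x)
    return seq

def estimate_x0(symbols, seed=0.5):
    """Recover the initial condition by applying the contracting inverse
    branches backwards along the symbol sequence."""
    y = seed
    for s in reversed(symbols):
        y = y / 2 if s == 0 else 1 - y / 2
    return y

x0 = 0.3141592653589793
est = estimate_x0(symbolize(x0, 24))
print(abs(est - x0))   # error shrinks roughly like 2**-n
```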
Developmental dyscalculia and low numeracy in Chinese children.
Chan, Winnie Wai Lan; Au, Terry K; Tang, Joey
2013-05-01
Children struggle with mathematics for different reasons. Developmental dyscalculia and low numeracy - two kinds of mathematical difficulties - may have their roots, respectively, in poor understanding of exact non-symbolic numerosities and of symbolic numerals. This study was the first to explore whether Chinese children, despite cultural and linguistic factors supporting their mathematical learning, also showed such mathematical difficulties and whether such difficulties have measurable impact on children's early school mathematical performance. First-graders, classified as dyscalculia, low numeracy, or normal achievement, were compared for their performance in various school mathematical tasks requiring a grasp of non-symbolic numerosities (i.e., non-symbolic tasks) or an understanding of symbolic numerals (i.e., symbolic tasks). Children with dyscalculia showed poorer performance than their peers in non-symbolic tasks but not symbolic ones, whereas those with low numeracy showed poorer performance in symbolic tasks but not non-symbolic ones. As hypothesized, these findings suggested that dyscalculia and low numeracy were distinct deficits and caused by deficits in non-symbolic and symbolic processing, respectively. These findings went beyond prior research that only documented generally low mathematical achievements for these two groups of children. Moreover, these deficits appeared to be persistent and could not be remedied simply through day-to-day school mathematical learning. The present findings highlighted the importance of tailoring early learning support for children with these distinct deficits, and pointed to future directions for the screening of such mathematical difficulties among Chinese children. Copyright © 2013 Elsevier Ltd. All rights reserved.
A comparison of full-spectrum and complex-symbol combining techniques for the Galileo S-band mission
NASA Technical Reports Server (NTRS)
Million, S.; Shah, B.; Hinedi, S.
1994-01-01
Full-spectrum combining (FSC) and complex-symbol combining (CSC) are two antenna-arraying techniques being considered for the Galileo spacecraft's upcoming encounter with Jupiter. This article describes the performance of these techniques in terms of symbol signal-to-noise ratio (SNR) degradation and symbol SNR loss. It is shown that both degradation and loss are approximately equal at low values of symbol SNR but diverge at high SNR values. For the Galileo S-band (2.2 to 2.3 GHz) mission, degradation provides a good estimate of performance as the symbol SNR is typically below -5 dB. For the following arrays - two 70-m antennas, one 70-m and one 34-m antenna, one 70-m and two 34-m antennas, and one 70-m and three 34-m antennas - it is shown that FSC has less degradation than CSC when the subcarrier and symbol window-loop bandwidth products are above 3.0, 10.0, 8.5, and 8.2 mHz at the symbol rate of 200 sym/sec, and above 1.2, 4.5, 4.0, and 3.5 mHz at a symbol rate of 400 sym/sec, respectively. Moreover, for an array of four 34-m antennas, FSC has less degradation than CSC when the subcarrier and symbol window-loop bandwidth products are above 0.32 mHz at the symbol rate of 50 sym/sec and above 0.8 mHz at the symbol rate of 25 sym/sec.
NASA Astrophysics Data System (ADS)
Sagita, R.; Azra, F.; Azhar, M.
2018-04-01
The research developed a module on the mole concept based on structured inquiry with interconnection of macro, submicro, and symbolic representations, and determined the validity and practicality of the module. The research type was Research and Development (R&D). The development model was the 4-D model, which consists of four steps: define, design, develop, and disseminate. The research was limited to the develop step. The research instrument was a questionnaire consisting of validity and practicality sheets. The module was validated by 5 validators. The practicality of the module was tested by 2 chemistry teachers and 28 students of grade XI MIA 5 at SMAN 4 Padang. Validity and practicality data were analysed using the Cohen's kappa formula. The average moment kappa of the 5 validators was 0.95, in the highest validity category. The average moment kappa for teachers and students was 0.89 and 0.91 respectively, in the high practicality category. The results showed that the module on the mole concept based on structured inquiry with interconnection of macro, submicro, and symbolic representations was valid and practical for use in learning chemistry.
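For reference, the standard Cohen's kappa agreement statistic underlying scores like those reported above can be sketched in a few lines. The two rating sequences below are invented for illustration, and the study's exact "moment kappa" computation may differ from this two-rater form.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n   # observed agreement
    c1, c2 = Counter(rater1), Counter(rater2)
    labels = set(c1) | set(c2)
    p_e = sum(c1[l] * c2[l] for l in labels) / n ** 2        # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical validity ratings from two judges.
r1 = ["valid", "valid", "valid", "invalid", "valid"]
r2 = ["valid", "valid", "invalid", "invalid", "valid"]
print(round(cohens_kappa(r1, r2), 3))   # → 0.545
```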
Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images
NASA Technical Reports Server (NTRS)
Fischer, Bernd
2004-01-01
Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible.
This is a major advantage over other statistical data analysis systems, which use numerical approximations even in cases where closed-form solutions exist. AutoBayes is implemented in Prolog and comprises approximately 75,000 lines of code. In this paper, we take one typical scientific data analysis problem, analyzing planetary nebulae images taken by the Hubble Space Telescope, and show how AutoBayes can be used to automate the implementation of the necessary analysis programs. We initially follow the analysis described by Knuth and Hajian [KH02] and use AutoBayes to derive code for the published models. We show the details of the code derivation process, including the symbolic computations and automatic integration of library procedures, and compare the results of the automatically generated and manually implemented code. We then go beyond the original analysis and use AutoBayes to derive code for a simple image segmentation procedure based on a mixture model, which can be used to automate a manual preprocessing step. Finally, we combine the original approach with the simple segmentation, which yields a more detailed analysis. This also demonstrates that AutoBayes makes it easy to combine different aspects of data analysis.
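The mixture-model segmentation mentioned above rests on the EM algorithm, the kind of closed-form update schema AutoBayes instantiates. Below is a hand-written sketch of EM for a two-component 1D Gaussian mixture; the data and initialisation are toy values invented for illustration, not anything generated by AutoBayes.

```python
import math
import random

def em_gmm(xs, iters=50):
    """EM for a two-component 1D Gaussian mixture (illustrative sketch)."""
    # crude initialisation from the data range
    mu = [min(xs), max(xs)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in xs:
            w = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: closed-form re-estimation of weights, means, variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, xs)) / nk
            var[k] = max(var[k], 1e-6)   # guard against variance collapse
    return pi, mu, var

# Two well-separated toy clusters, standing in for pixel intensities.
random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(200)] + \
     [random.gauss(8.0, 1.0) for _ in range(200)]
pi, mu, var = em_gmm(xs)
print(sorted(round(m, 1) for m in mu))   # means land near 0 and 8
```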
Symbolic and non-symbolic number magnitude processing in children with developmental dyscalculia.
Castro Cañizares, Danilka; Reigosa Crespo, Vivian; González Alemañy, Eduardo
2012-11-01
The aim of this study was to evaluate if children with Developmental Dyscalculia (DD) exhibit a general deficit in magnitude representations or a specific deficit in the connection of symbolic representations with the corresponding analogous magnitudes. DD was diagnosed using a timed arithmetic task. The experimental magnitude comparison tasks were presented in non-symbolic and symbolic formats. DD and typically developing (TD) children showed similar numerical distance and size congruity effects. However, DD children performed significantly slower in the symbolic task. These results are consistent with the access deficit hypothesis, according to which DD children's deficits are caused by difficulties accessing magnitude information from numerical symbols rather than in processing numerosities per se.
Adolescent Contraceptive Use: Models, Research, and Directions.
ERIC Educational Resources Information Center
Whitley, Bernard E., Jr.; Schofield, Janet Ward
Both the career model and the decision model have been proposed to explain patterns of contraceptive use in teenagers. The career model views contraceptive use as a symbol of a woman's sexuality and implies a clear decision to be sexually active. The decision model is based on the subjective expected utility (SEU) theory which holds that people…
ERIC Educational Resources Information Center
Cangelosi, Angelo; Riga, Thomas
2006-01-01
The grounding of symbols in computational models of linguistic abilities is one of the fundamental properties of psychologically plausible cognitive models. In this article, we present an embodied model for the grounding of language in action based on epigenetic robots. Epigenetic robotics is one of the new cognitive modeling approaches to…
Effects of Participant Modeling on Information Acquisition and Skill Utilization.
ERIC Educational Resources Information Center
Klingman, Avigdor; And Others
1984-01-01
Assessed the contribution of active participant modeling in coping skills training in children (N=38) highly fearful of dentists. Results provided evidence for the greater efficacy of active practice relative to symbolic modeling for the learning and utilization of coping strategies to reduce stress during aversive procedures. (LLL)
Exploring Solid-State Structure and Physical Properties: A Molecular and Crystal Model Exercise
ERIC Educational Resources Information Center
Bindel, Thomas H.
2008-01-01
A crystal model laboratory exercise is presented that allows students to examine relations among the microscopic-macroscopic-symbolic levels, using crystalline mineral samples and corresponding crystal models. Students explore the relationship between solid-state structure and crystal form. Other structure-property relationships are explored. The…
Sediment trapping efficiency of adjustable check dam in laboratory and field experiment
NASA Astrophysics Data System (ADS)
Wang, Chiang; Chen, Su-Chin; Lu, Sheng-Jui
2014-05-01
Check dams are constructed in mountain areas to block debris flows, but they fill up after several events and lose their trapping function. For this reason, the main facility in our research is an adjustable steel slit check dam, which has the advantages of fast construction and of being easy to remove or adjust: transverse beams can be removed to drain sediment off and keep the channel continuous. We constructed an adjustable steel slit check dam on the Landow torrent, Huisun Experimental Forest Station, as the prototype to compare with a model in the laboratory. In the laboratory experiments, Froude number similarity was used to design the dam model. The main comparisons focused on the types of sediment trapping and removal, sediment discharge, and the trapping rate of the slit check dam. Different ways of removing the transverse beams produced different kinds of sediment removal and differences in removal rate and particle size distribution. The sediment discharge of the check dam with beams is about 40%~80% of that of the check dam without beams. Furthermore, the spacing of the beams is a considerable factor in the sediment discharge. In the field experiment, this research used time-lapse photography to record the adjustable steel slit check dam on the Landow torrent. Typhoon Soulik delivered rainfall of 600 mm in eight hours and induced a debris flow in the Landow torrent. The time-lapse images showed that, after several sediment transport events, the adjustable steel slit check dam was buried by debris flow. The results of the laboratory and field experiments are: (1) the adjustable check dam can trap boulders, stop woody debris flows, and flush out fine sediment to supply the needs of the downstream river; (2) the sediment-trapping efficiency of the adjustable check dam with transverse beams was significantly improved; (3) the check dam without transverse beams can remove sediment and maintain ecosystem continuity.
Eagle, Dawn M; Noschang, Cristie; d'Angelo, Laure-Sophie Camilla; Noble, Christie A; Day, Jacob O; Dongelmans, Marie Louise; Theobald, David E; Mar, Adam C; Urcelay, Gonzalo P; Morein-Zamir, Sharon; Robbins, Trevor W
2014-05-01
Excessive checking is a common, debilitating symptom of obsessive-compulsive disorder (OCD). In an established rodent model of OCD checking behaviour, quinpirole (a dopamine D2/3-receptor agonist) increased checking in open-field tests, indicating dopaminergic modulation of checking-like behaviours. We designed a novel operant paradigm for rats, the observing response task (ORT), to further examine the cognitive processes underpinning checking behaviour and clarify how and why checking develops. We investigated i) how quinpirole increases checking, ii) the dependence of these effects on D2/3 receptor function (following treatment with the D2/3 receptor antagonist sulpiride) and iii) the effects of reward uncertainty. In the ORT, rats pressed an 'observing' lever for information about the location of an 'active' lever that provided food reinforcement. High- and low-checkers (defined from baseline observing) received quinpirole (0.5 mg/kg, 10 treatments) or vehicle. Parametric task manipulations assessed observing/checking under increasing task demands relating to reinforcement uncertainty (variable response requirement and active-lever location switching). Treatment with sulpiride further probed the pharmacological basis of long-term behavioural changes. Quinpirole selectively increased checking, both functional observing lever presses (OLPs) and non-functional extra OLPs (EOLPs). The increase in OLPs and EOLPs was long-lasting, without further quinpirole administration. Quinpirole did not affect the immediate ability to use information from checking. Vehicle- and quinpirole-treated rats (VEH and QNP, respectively) were selectively sensitive to different forms of uncertainty. Sulpiride reduced non-functional EOLPs in QNP rats but had no effect on functional OLPs. These data have implications for the treatment of compulsive checking in OCD, particularly for serotonin-reuptake-inhibitor treatment-refractory cases, where supplementation with dopamine receptor antagonists may be beneficial.
Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
Lionello-DeNolf, Karen M.; Farber, Rachel; Jones, B. Max; Dube, William V.
2014-01-01
Matching-to-sample (MTS) is often used to teach symbolic relationships between spoken or printed words and their referents to children with intellectual and developmental disabilities. However, many children have difficulty learning symbolic matching, even though they may demonstrate generalized identity matching. The current study investigated whether training on symbolic MTS tasks in which the stimuli are physically dissimilar but members of familiar categories (i.e., thematic matching) can remediate an individual’s difficulty learning symbolic MTS tasks involving non-representative stimuli. Three adolescent males diagnosed with autism spectrum disorder were first trained on symbolic MTS tasks with unfamiliar, non-representative form stimuli. Thematic matching was introduced after the participants failed to learn 0, 2 or 4 symbolic MTS tasks and before additional symbolic MTS tasks were introduced. After exposure to thematic matching, accuracy on symbolic MTS tasks with novel stimuli increased to above chance for all participants. For two participants, high accuracy (> 90%) was achieved on a majority of these sessions. Thus, thematic matching may be an effective intervention for students with limited verbal repertoires and who have difficulty learning symbolic MTS tasks. Possible explanations for the facilitative effect of thematic matching are considered and warrant further investigation. PMID:24634695
Incidental recall on WAIS-R digit symbol discriminates Alzheimer's and Parkinson's diseases.
Demakis, G J; Sawyer, T P; Fritz, D; Sweet, J J
2001-03-01
The purpose of this study was to examine how Alzheimer's (n = 37) and Parkinson's (n = 21) patients perform on the incidental recall adaptation of the Digit Symbol subtest of the Wechsler Adult Intelligence Scale-Revised (WAIS-R) and how such performance is related to established cognitive efficiency and memory measures. This adaptation requires the examinee to complete the entire subtest and then, without warning, to immediately recall the symbols associated with each number. Groups did not differ significantly on the standard Digit Symbol administration (90 seconds), but on recall Parkinson's patients recalled significantly more symbols and symbol-number pairs than Alzheimer's patients. Using only the number of symbols recalled, discriminant function analysis correctly classified 76% of these patients. Correlations between age-corrected scaled scores, symbols incidentally recalled, and established measures of cognitive efficiency and memory provided evidence of convergent and divergent validity. Age-corrected scaled scores were more consistently and strongly related to cognitive efficiency, whereas symbols recalled were more consistently and strongly related to memory measures. These findings suggest that the Digit Symbol recall adaptation actually assesses memory and that it can be another useful way to detect memory impairment. Copyright 2001 John Wiley & Sons, Inc.
Lesch, Mary F; Powell, W Ryan; Horrey, William J; Wogalter, Michael S
2013-01-01
This study teased apart the effects of comprehensibility and complexity on older adults' comprehension of warning symbols by manipulating the relevance of additional information in further refining the meaning of the symbol. Symbols were systematically altered such that increased visual complexity (in the form of contextual cues) resulted in increased comprehensibility. One hundred older adults, aged 50-71 years, were tested on their comprehension of these symbols before and after training. High comprehensibility-complexity symbols were found to be better understood than low- or medium-comprehensibility-complexity symbols and the effectiveness of the contextual cues varied as a function of training. Therefore, the nature of additional detail determines whether increased complexity is detrimental or beneficial to older adults' comprehension - if the additional details provide 'cues to knowledge', older adults' comprehension improves as a result of the increased complexity. However, some cues may require training in order to be effective. Research suggests that older adults have greater difficulty in understanding more complex symbols. However, we found that when the complexity of symbols was increased through the addition of contextual cues, older adults' comprehension actually improved. Contextual cues aid older adults in making the connection between the symbol and its referent.
Schlosser, Ralf W; Shane, Howard; Sorce, James; Koul, Rajinder; Bloomfield, Emma
2011-09-01
The purpose of this study was to identify graphic symbols for verbs and prepositions that were performing and underperforming in static and animated formats in a recent experiment on the effects of animation on transparency, name agreement, and identification of graphic symbols. Variable-specific criteria were developed in order to define when a symbol is considered to be performing in terms of its transparency, name agreement, and identification accuracy. Additionally, across-variable heuristic criteria were developed that allowed classification of symbols into four categories: (a) performing exceptionally, (b) performing effectively, (c) performing adequately, and (d) performing inadequately. These criteria were applied to 24 symbols for verbs and 8 symbols for prepositions in both animated and static formats. Results indicated that the vast majority of the symbols performed adequately or better while a few did not. Potential reasons as to why some of the symbols may have underperformed are discussed. Where appropriate, implications for modifying existing symbols and future research are drawn. Although the fact that the heuristic criteria were developed post-hoc is discussed as a limitation, the benefits of the proposed categories bode well for future applications.
Hermans, Veerle; Monzote, Lianet; Van den Sande, Björn; Mukadi, Pierre; Sopheak, Thai; Gillet, Philippe; Jacobs, Jan
2011-11-02
Graphical symbols on in vitro diagnostics (IVD symbols) replace the need for text in different languages and are used on malaria rapid diagnostic tests (RDTs) marketed worldwide. The present study assessed the comprehension of IVD symbols labelled on malaria RDT kits among laboratory staff in four different countries. Participants (n = 293) in Belgium (n = 96), the Democratic Republic of the Congo (DRC, n = 87), Cambodia (n = 59) and Cuba (n = 51) were presented with an anonymous questionnaire with IVD symbols extracted from ISO 15223 and EN 980 presented as stand-alone symbols (n = 18) and in context (affixed on RDT packages, n = 16). Responses were open-ended and scored for correctness by local professionals. Presented as stand-alone, three and five IVD symbols were correctly scored for comprehension by 67% and 50% of participants, respectively; when contextually presented, five and seven symbols reached the 67% and 50% correct scores, respectively. 'Batch code' scored best (correctly scored by 71.3% of participants when presented as stand-alone); 'Authorized representative in the European Community' scored worst (1.4% correct). Another six IVD symbols were scored correctly by less than 10% of participants: 'Do not reuse', 'In vitro diagnostic medical device', 'Sufficient for', 'Date of manufacture', 'Authorised representative in EC', and 'Do not use if package is damaged'. Participants in Belgium and Cuba both scored six symbols above the 67% criterion; participants from DRC and Cambodia scored only two and one symbols above this criterion, respectively. Low correct scores were observed for safety-related IVD symbols, such as 'Biological Risk' (42.7%) and 'Do not reuse' (10.9%). Comprehension of IVD symbols on RDTs among laboratory staff in four international settings was unsatisfactory. Administrative and outreach efforts should be undertaken to familiarize end-users with these symbols.
36 CFR 264.11 - Use of symbol.
Code of Federal Regulations, 2010 CFR
2010-07-01
... MANAGEMENT Mount St. Helens National Volcanic Monument Symbol § 264.11 Use of symbol. Except as provided in § 264.12, use of the Mount St. Helens National Volcanic Monument official symbol, including a facsimile...
Wilkinson, Krista M; Snell, Julie
2011-11-01
Communication about feelings is a core element of human interaction. Aided augmentative and alternative communication systems must therefore include symbols representing these concepts. The symbols must be readily distinguishable in order for users to communicate effectively. However, emotions are represented within most systems by schematic faces in which subtle distinctions are difficult to represent. We examined whether background color cuing and spatial arrangement might help children identify symbols for different emotions. Thirty nondisabled children searched for symbols representing emotions within an 8-choice array. On some trials, a color cue signaled the valence of the emotion (positive vs. negative). Additionally, the symbols were either (a) organized with the negatively valenced symbols at the top and the positive symbols on the bottom of the display or (b) distributed randomly throughout. Dependent variables were accuracy and speed of responses. The speed with which children could locate a target was significantly faster for displays in which symbols were clustered by valence, but only when the symbols had white backgrounds. Addition of a background color cue did not facilitate responses. Rapid search was facilitated by a spatial organization cue, but not by the addition of background color. Further examination of the situations in which color cues may be useful is warranted.
Perception and multimeaning analysis of graphic symbols for Thai picture-based communication system.
Chompoobutr, Sarinya; Potibal, Puttachart; Boriboon, Monthika; Phantachat, Wantanee
2013-03-01
Graphic symbols are a vital part of most augmentative and alternative communication systems. The communication fluency of graphic symbol users depends on how well the relationships between symbols and their referents are learned. The first aim of this study was to survey the perception of the selected graphic symbols across seven age groups of participants with different educational backgrounds. Sixty-five individuals who identified themselves as Thai and ranged in age from 10 to 50 years participated in the investigation, which used 64 graphic symbols. The second aim of this study was to demonstrate the analysis of multimeaning graphic symbols, which will be used in the Thai picture-based communication system. Twenty graphic symbols with 9-14 meanings each were analyzed in both syntactic and semantic aspects. The meanings were divided into five categories: noun, verb/adjective, size, color and shape. With respect to the first aim, the results suggest that the participants under investigation, across sexes, age groups, and educational levels, perceive the features or inherent characteristics of such graphic symbols similarly. The results of the multimeaning analysis indicate that the foundation of Minspeak, the polysemy and redundancy of words, illustrates the inherent meanings of real-life objects; they also convey that the Thai graphic symbols are influenced by numerous factors in the Thai context, such as ability, motivation, experience, worldview and culture.
The evaluation of display symbology - A chronometric study of visual search [on cathode ray tubes]
NASA Technical Reports Server (NTRS)
Remington, R.; Williams, D.
1984-01-01
Three single-target visual search tasks were used to evaluate a set of CRT symbols for a helicopter traffic display. The search tasks were representative of the kinds of information extraction required in practice, and reaction time was used to measure the efficiency with which symbols could be located and identified. The results show that familiar numeric symbols were responded to more quickly than graphic symbols. The addition of modifier symbols such as a nearby flashing dot or surrounding square had a greater disruptive effect on the graphic symbols than the alphanumeric characters. The results suggest that a symbol set is like a list that must be learned. Factors that affect the time to respond to items in a list, such as familiarity and visual discriminability, and the division of list items into categories, also affect the time to identify symbols.
Creating illusions of past encounter through brief exposure.
Brown, Alan S; Marsh, Elizabeth J
2009-05-01
Titchener (1928) suggested that briefly glancing at a scene could make it appear strangely familiar when it was fully processed moments later. The closest laboratory demonstration used words as stimuli, and showed that briefly glancing at a to-be-judged word increased the subject's belief that it had been presented in an earlier study list (Jacoby & Whitehouse, 1989). We evaluated whether a hasty glance could elicit a false belief in a prior encounter, from a time and place outside of the experiment. This goal precluded using word stimuli, so we had subjects evaluate unfamiliar symbols. Each symbol was preceded by a brief exposure to an identical symbol, a different symbol, or no symbol. A brief glance at an identical symbol increased attributions to preexperimental experience, relative to a glance at a different symbol or no symbol, providing a possible mechanism for common illusions of false recognition.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Du, Qiang, E-mail: qd2125@columbia.edu; Yang, Jiang, E-mail: jyanghkbu@gmail.com
This work is concerned with the Fourier spectral approximation of various integral differential equations associated with some linear nonlocal diffusion and peridynamic operators under periodic boundary conditions. For radially symmetric kernels, the nonlocal operators under consideration are diagonalizable in the Fourier space so that the main computational challenge is on the accurate and fast evaluation of their eigenvalues or Fourier symbols consisting of possibly singular and highly oscillatory integrals. For a large class of fractional power-like kernels, we propose a new approach based on reformulating the Fourier symbols both as coefficients of a series expansion and solutions of some simple ODE models. We then propose a hybrid algorithm that utilizes both truncated series expansions and high order Runge–Kutta ODE solvers to provide fast evaluation of Fourier symbols in both one and higher dimensional spaces. It is shown that this hybrid algorithm is robust, efficient and accurate. As applications, we combine this hybrid spectral discretization in the spatial variables and the fourth-order exponential time differencing Runge–Kutta for temporal discretization to offer high order approximations of some nonlocal gradient dynamics including nonlocal Allen–Cahn equations, nonlocal Cahn–Hilliard equations, and nonlocal phase-field crystal models. Numerical results show the accuracy and effectiveness of the fully discrete scheme and illustrate some interesting phenomena associated with the nonlocal models.
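The central object in the abstract above, the Fourier symbol of a nonlocal diffusion operator, can be illustrated with a direct-quadrature sketch. The kernel, scaling, and function name below are illustrative assumptions, not the paper's code; for the singular kernels the paper targets, direct quadrature loses accuracy, which is precisely what motivates its hybrid series/ODE approach.

```python
import math

def fourier_symbol(k, delta=1.0, n=2000):
    """Fourier symbol lambda(k) of the 1D nonlocal diffusion operator
    Lu(x) = int_{-delta}^{delta} gamma(|s|) * (u(x+s) - u(x)) ds
    under periodic boundary conditions, evaluated by midpoint quadrature.
    Illustrative smooth kernel gamma = 3/delta**3, scaled so that
    lambda(k) -> -k**2 as delta -> 0 (the local diffusion limit)."""
    gamma = 3.0 / delta**3
    h = 2.0 * delta / n
    total = 0.0
    for i in range(n):
        s = -delta + (i + 0.5) * h          # midpoint of each subinterval
        total += gamma * (math.cos(k * s) - 1.0) * h
    return total
```

For this smooth kernel the symbol can also be written in closed form, so the quadrature is easy to sanity-check; the paper's contribution is handling kernels for which no such closed form or stable quadrature exists.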
Chew, Cindy S; Forte, Jason D; Reeve, Robert A
2016-12-01
Early math abilities are claimed to be linked to magnitude representation ability. Some claim that nonsymbolic magnitude abilities scaffold the acquisition of symbolic (Arabic number) magnitude abilities and influence math ability. Others claim that symbolic magnitude abilities, and ipso facto math abilities, are independent of nonsymbolic abilities and instead depend on the ability to process number symbols (e.g., 2, 7). Currently, the issue of whether symbolic abilities are or are not related to nonsymbolic abilities, and the cognitive factors associated with nonsymbolic-symbolic relationships, remains unresolved. We suggest that different nonsymbolic-symbolic relationships reside within the general magnitude ability distribution and that different cognitive abilities are likely associated with these different relationships. We further suggest that the different nonsymbolic-symbolic relationships and cognitive abilities in combination differentially predict math abilities. To test these claims, we used latent profile analysis to identify nonsymbolic-symbolic judgment patterns of 124 children aged 5 to 7 years. We also assessed four cognitive factors (visuospatial working memory [VSWM], naming numbers, nonverbal IQ, and basic reaction time [RT]) and two math abilities (number transcoding and single-digit addition abilities). Four nonsymbolic-symbolic ability profiles were identified. Naming numbers, VSWM, and basic RT abilities were differentially associated with the different ability profiles and in combination differentially predicted math abilities. Findings show that different patterns of nonsymbolic-symbolic magnitude abilities can be identified and suggest that an adequate account of math development should specify the inter-relationship between cognitive factors and nonsymbolic-symbolic ability patterns. Copyright © 2016 Elsevier Inc. All rights reserved.
Magnitude processing of symbolic and non-symbolic proportions: an fMRI study.
Mock, Julia; Huber, Stefan; Bloechle, Johannes; Dietrich, Julia F; Bahnmueller, Julia; Rennig, Johannes; Klein, Elise; Moeller, Korbinian
2018-05-10
Recent research indicates that processing proportion magnitude is associated with activation in the intraparietal sulcus. Thus, brain areas associated with the processing of numbers (i.e., absolute magnitude) were activated during processing symbolic fractions as well as non-symbolic proportions. Here, we investigated systematically the cognitive processing of symbolic (e.g., fractions and decimals) and non-symbolic proportions (e.g., dot patterns and pie charts) in a two-stage procedure. First, we investigated relative magnitude-related activations of proportion processing. Second, we evaluated whether symbolic and non-symbolic proportions share common neural substrates. We conducted an fMRI study using magnitude comparison tasks with symbolic and non-symbolic proportions, respectively. As an indicator for magnitude-related processing of proportions, the distance effect was evaluated. A conjunction analysis indicated joint activation of specific occipito-parietal areas including right intraparietal sulcus (IPS) during proportion magnitude processing. More specifically, results indicate that the IPS, which is commonly associated with absolute magnitude processing, is involved in processing relative magnitude information as well, irrespective of symbolic or non-symbolic presentation format. However, we also found distinct activation patterns for the magnitude processing of the different presentation formats. Our findings suggest that processing for the separate presentation formats is not only associated with magnitude manipulations in the IPS, but also increasing demands on executive functions and strategy use associated with frontal brain regions as well as visual attention and encoding in occipital regions. Thus, the magnitude processing of proportions may not exclusively reflect processing of number magnitude information but also rather domain-general processes.
Khanum, Saeeda; Hanif, Rubina; Spelke, Elizabeth S; Berteletti, Ilaria; Hyde, Daniel C
2016-01-01
Current theories of numerical cognition posit that uniquely human symbolic number abilities connect to an early developing cognitive system for representing approximate numerical magnitudes, the approximate number system (ANS). In support of this proposal, recent laboratory-based training experiments with U.S. children show enhanced performance on symbolic addition after brief practice comparing or adding arrays of dots without counting: tasks that engage the ANS. Here we explore the nature and generality of this effect through two brief training experiments. In Experiment 1, elementary school children in Pakistan practiced either a non-symbolic numerical addition task or a line-length addition task with no numerical content, and then were tested on symbolic addition. After training, children in the numerical training group completed the symbolic addition test faster than children in the line length training group, suggesting a causal role of brief, non-symbolic numerical training on exact, symbolic addition. These findings replicate and extend the core findings of a recent U.S. laboratory-based study to non-Western children tested in a school setting, attesting to the robustness and generalizability of the observed training effects. Experiment 2 tested whether ANS training would also enhance the consistency of performance on a symbolic number line task. Over several analyses of the data there was some evidence that approximate number training enhanced symbolic number line placements relative to control conditions. Together, the findings suggest that engagement of the ANS through brief training procedures enhances children's immediate attention to number and engagement with symbolic number tasks.
A social skills analysis in childhood and adolescence using symbolic interactionism.
Russell, A
1984-02-01
Support is obtained from the literature for the need for advances in the conceptualization of "social skills." There is agreement that much is known about how to improve social skills, but less attention has been given to what to change or improve. The present article outlines a model of social skills in childhood and adolescence using the concepts and literature on symbolic interactionism in an attempt to provide a possible conceptual framework for social skills. The proposed model is organized around the concepts of role-taking, role-making, definition of the situation, and self. Each concept is taken in turn, and how it could contribute to the analysis or understanding of social skills in childhood and adolescence is shown. The article concludes with a discussion of ways in which the proposed scheme might be used in one area of social skills - friendship making. Some possible difficulties and limitations in the model are noted.
Besalú, Emili
2016-01-01
The Superposing Significant Interaction Rules (SSIR) method is described. It is a general combinatorial and symbolic procedure able to rank compounds belonging to combinatorial analogue series. The procedure generates structure-activity relationship (SAR) models and also serves as an inverse SAR tool. The method is fast and can deal with large databases. SSIR operates from statistical significances calculated from the available library of compounds and according to the previously attached molecular labels of interest or non-interest. The required symbolic codification allows dealing with almost any combinatorial data set, even in a confidential manner, if desired. The application example categorizes molecules as binding or non-binding, and consensus ranking SAR models are generated from training and two distinct cross-validation methods: leave-one-out and balanced leave-two-out (BL2O), the latter being suited for the treatment of binary properties. PMID:27240346
A Model for Indexing Medical Documents Combining Statistical and Symbolic Knowledge.
Avillach, Paul; Joubert, Michel; Fieschi, Marius
2007-01-01
OBJECTIVES: To develop and evaluate an information processing method based on terminologies, in order to index medical documents in any given documentary context. METHODS: We designed a model using both symbolic general knowledge extracted from the Unified Medical Language System (UMLS) and statistical knowledge extracted from a domain of application. Using statistical knowledge allowed us to contextualize the general knowledge for every particular situation. For each document studied, the extracted terms are ranked to highlight the most significant ones. The model was tested on a set of 17,079 French standardized discharge summaries (SDSs). RESULTS: The most important ICD-10 term of each SDS was ranked 1st or 2nd by the method in nearly 90% of the cases. CONCLUSIONS: The use of several terminologies leads to more precise indexing. The improvement in the model's performance achieved by using semantic relationships is encouraging. PMID:18693792
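The combination described above, symbolic knowledge (a controlled vocabulary) contextualized by statistical knowledge (corpus frequencies), can be sketched with a generic tf-idf ranking of terminology matches. This is an illustrative sketch, not the paper's model; all names and data below are hypothetical.

```python
import math
from collections import Counter

def rank_terms(doc_tokens, vocabulary, doc_freq, n_docs):
    """Rank the terminology terms found in a document.
    Symbolic knowledge: `vocabulary`, a controlled set of terms
    (standing in for, e.g., UMLS concepts). Statistical knowledge:
    `doc_freq`, per-term document frequencies from the application
    domain. Terms are scored by tf-idf and returned best-first."""
    tf = Counter(t for t in doc_tokens if t in vocabulary)
    scores = {t: count * math.log(n_docs / (1 + doc_freq.get(t, 0)))
              for t, count in tf.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

A frequent term in the domain corpus ("fever" in a hospital's summaries, say) is down-weighted relative to a rarer, more discriminative one, which is one way statistics can contextualize a general terminology.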
An experimental method to verify soil conservation by check dams on the Loess Plateau, China.
Xu, X Z; Zhang, H W; Wang, G Q; Chen, S C; Dang, W Q
2009-12-01
A successful experiment with a physical model requires the necessary similarity conditions. This study presents an experimental method with a semi-scale physical model, used to monitor and verify soil conservation by check dams in a small watershed on the Loess Plateau of China. During experiments, the model-prototype ratio of geomorphic variables was kept constant under each rainfall event. Consequently, the experimental data can be used to verify soil erosion processes in the field and to predict soil loss in a model watershed with check dams. The study also proposes four similarity criteria: watershed geometry, grain size and bare land, Froude number (Fr) for rainfall events, and soil erosion in downscaled models. The efficacy of the proposed method was confirmed using these criteria in two different downscaled model experiments. The B-Model, a large-scale model, simulates the watershed prototype. The two small-scale models, D(a) and D(b), are the same size but have different erosion rates; they simulate the hydraulic processes of the B-Model. Experimental results show that when soil loss in the small-scale models was converted by multiplying by the soil-loss scale factor, it was very close to that of the B-Model. Evidently, with a semi-scale physical model, experiments can verify and predict soil loss in a small watershed with a check-dam system on the Loess Plateau, China.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-17
... BD- 100 Time Limits/Maintenance Checks. The actions described in this service information are... Challenger 300 BD-100 Time Limits/Maintenance Checks. (1) For the new tasks identified in Bombardier TR 5-2... Requirements,'' in Part 2 of Chapter 5 of Bombardier Challenger 300 BD-100 Time Limits/ Maintenance Checks...
75 FR 66655 - Airworthiness Directives; PILATUS Aircraft Ltd. Model PC-7 Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-29
... December 3, 2010 (the effective date of this AD), check the airplane maintenance records to determine if... of the airplane. Do this check following paragraph 3.A. of Pilatus Aircraft Ltd. PC-7 Service... maintenance records check required in paragraph (f)(1) of this AD or it is unclear whether or not the left and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-05
... Bombardier Challenger 300 BD-100 Time Limits/Maintenance Checks Manual. For this task, the initial compliance..., of Part 2, of the Bombardier Challenger 300 BD-100 Time Limits/Maintenance Checks Manual, the general.../Maintenance Checks Manual, provided that the relevant information in the general revision is identical to that...
Lesch, M F; Horrey, W J; Wogalter, M S; Powell, W R
2011-10-01
Age-related changes in selective attention, inhibitory efficiency, and the ability to form new associations suggest that older adults may have greater difficulty with more complex and less comprehensible symbols. We examined comprehension of symbols varying in terms of ratings of familiarity, complexity, and comprehensibility, by younger (aged 18-35) and older (aged 55-70) adults. It was found that older adults have greater difficulty than younger adults in comprehending warning symbols and that accident scenario training improves comprehension. Regression analyses indicated that familiarity and comprehensibility were important in determining performance on the pre-training comprehension test by both younger and older adults. However, training eliminated the effects of stimulus characteristics for younger adults, while older adults' comprehension continued to be significantly influenced by comprehensibility. We suggest that symbol design incorporate cues to knowledge to facilitate the linkage between new knowledge (i.e. the warning symbol) and relevant knowledge in long-term memory. Statement of Relevance: Symbol characteristics play an important role in age-related differences in warning symbol comprehension. To optimise comprehension by older adults, symbols should have a clear relationship with a real-world referent. Alternatively, symbol design could incorporate cues to knowledge to facilitate the linkage between new knowledge and relevant knowledge in long-term memory.
Lesch, Mary F.; Powell, W. Ryan; Horrey, William J.; Wogalter, Michael S.
2013-01-01
This study teased apart the effects of comprehensibility and complexity on older adults' comprehension of warning symbols by manipulating the relevance of additional information in further refining the meaning of the symbol. Symbols were systematically altered such that increased visual complexity (in the form of contextual cues) resulted in increased comprehensibility. One hundred older adults, aged 50–71 years, were tested on their comprehension of these symbols before and after training. High comprehensibility–complexity symbols were found to be better understood than low- or medium-comprehensibility–complexity symbols and the effectiveness of the contextual cues varied as a function of training. Therefore, the nature of additional detail determines whether increased complexity is detrimental or beneficial to older adults' comprehension – if the additional details provide ‘cues to knowledge’, older adults' comprehension improves as a result of the increased complexity. However, some cues may require training in order to be effective. Practitioner Summary: Research suggests that older adults have greater difficulty in understanding more complex symbols. However, we found that when the complexity of symbols was increased through the addition of contextual cues, older adults' comprehension actually improved. Contextual cues aid older adults in making the connection between the symbol and its referent. PMID:23767856
Medical Symbols in Practice: Myths vs Reality.
Shetty, Anil; Shetty, Shraddha; Dsouza, Oliver
2014-08-01
The caduceus is the popular symbol of medicine. However, premier health organizations and regulatory bodies such as the World Health Organization and the Medical Council of India use a different symbol, the rod of Asclepius, in their logos. There is an increasing awareness and recognition that the caduceus is a false symbol and has no historical substantiation as an emblem of medicine. Many academic and health institutions in the western hemisphere have changed their logos as a consequence. There are other symbols of medicine which are similarly misunderstood. The purpose of the study is to assess the knowledge of common medical symbols among doctors and medical students. Three hundred doctors and medical students were assessed on their knowledge of the Rx symbol, the Red Cross emblem and the true representative emblem of medicine. Logos and emblems of elite medical colleges and medical associations were also studied. Only 6% of doctors were aware that the rod of Asclepius is the true symbol of healing. Knowledge of the significance of the Rx symbol and the origin of the Red Cross emblem was 55% and 39%, respectively. There is very little awareness of the rod of Asclepius, and most institutions have adopted a logo based on the caduceus. Awareness of the true origins and the symbolism of these emblems is lacking in the medical fraternity.
"I share, therefore I am": personality traits, life satisfaction, and Facebook check-ins.
Wang, Shaojung Sharon
2013-12-01
This study explored whether agreeableness, extraversion, and openness function to influence self-disclosure behavior, which in turn impacts the intensity of checking in on Facebook. A complete path from extraversion to Facebook check-in through self-disclosure and sharing was found. The indirect effect from sharing to check-in intensity through life satisfaction was particularly salient. The central component of check-in is for users to disclose a specific location selectively that has implications on demonstrating their social lives, lifestyles, and tastes, enabling a selective and optimized self-image. Implications on the hyperpersonal model and warranting principle are discussed.
A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process
NASA Technical Reports Server (NTRS)
Wang, Yi; Tamai, Tetsuo
2009-01-01
Since the complexity of software systems continues to grow, most engineers face two serious problems: the state space explosion problem and the problem of how to debug systems. In this paper, we propose a game-theoretic approach to full branching time model checking on three-valued semantics. The three-valued models and logics provide successful abstraction that overcomes the state space explosion problem. The game style model checking that generates counter-examples can guide refinement or identify validated formulas, which solves the system debugging problem. Furthermore, output of our game style method will give significant information to engineers in detecting where errors have occurred and what the causes of the errors are.
Efficient Translation of LTL Formulae into Büchi Automata
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Lerda, Flavio
2001-01-01
Model checking is a fully automated technique for checking that a system satisfies a set of required properties. With explicit-state model checkers, properties are typically defined in linear-time temporal logic (LTL), and are translated into Büchi automata in order to be checked. This report presents how we have combined and improved existing techniques to obtain an efficient LTL to Büchi automata translator. In particular, we optimize the core of existing tableau-based approaches to generate significantly smaller automata. Our approach has been implemented and is being released as part of the Java PathFinder software (JPF), an explicit state model checker under development at the NASA Ames Research Center.
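Downstream of such a translator, the Büchi automaton is composed with the system and checked for emptiness: the property is violated iff the product has a reachable accepting cycle. A minimal sketch of the classic nested depth-first-search emptiness check used by explicit-state checkers such as SPIN follows; the function and variable names are my own, not JPF's API.

```python
def buechi_nonempty(initial, successors, accepting):
    """Nested depth-first search: returns True iff the Buchi automaton
    (initial state, successor function, set of accepting states) has a
    reachable accepting cycle, i.e. accepts some infinite word."""
    visited, flagged = set(), set()

    def inner(s, seed):
        # Second search: look for a cycle back to the accepting seed state.
        for t in successors(s):
            if t == seed:
                return True
            if t not in flagged:
                flagged.add(t)
                if inner(t, seed):
                    return True
        return False

    def outer(s):
        visited.add(s)
        for t in successors(s):
            if t not in visited and outer(t):
                return True
        # Post-order: launch the nested search from each accepting state.
        return s in accepting and inner(s, s)

    return outer(initial)
```

Because the nested searches share the `flagged` set and accepting states are processed in post-order, each state is visited at most twice, keeping the check linear in the size of the product automaton.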
Parallel Distributed Processing Theory in the Age of Deep Networks.
Bowers, Jeffrey S
2017-12-01
Parallel distributed processing (PDP) models in psychology are the precursors of deep networks used in computer science. However, only PDP models are associated with two core psychological claims, namely that all knowledge is coded in a distributed format and cognition is mediated by non-symbolic computations. These claims have long been debated in cognitive science, and recent work with deep networks speaks to this debate. Specifically, single-unit recordings show that deep networks learn units that respond selectively to meaningful categories, and researchers are finding that deep networks need to be supplemented with symbolic systems to perform some tasks. Given the close links between PDP and deep networks, it is surprising that research with deep networks is challenging PDP theory. Copyright © 2017. Published by Elsevier Ltd.
Integration of perception and reasoning in fast neural modules
NASA Technical Reports Server (NTRS)
Fritz, David G.
1989-01-01
Artificial neural systems promise to integrate symbolic and sub-symbolic processing to achieve real time control of physical systems. Two potential alternatives exist. In one, neural nets can be used to front-end expert systems. The expert systems, in turn, are developed with varying degrees of parallelism, including their implementation in neural nets. In the other, rule-based reasoning and sensor data can be integrated within a single hybrid neural system. The hybrid system reacts as a unit to provide decisions (problem solutions) based on the simultaneous evaluation of data and rules. Discussed here is a model hybrid system based on the fuzzy cognitive map (FCM). The operation of the model is illustrated with the control of a hypothetical satellite that intelligently alters its attitude in space in response to an intersecting micrometeorite shower.
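The FCM update rule underlying such a hybrid system is compact: concepts hold activations, signed edge weights encode causal influence between concepts, and each step squashes a weighted sum of incoming activations. The sketch below uses a hypothetical three-concept map loosely inspired by the satellite scenario; the weights, concepts, and squashing function are illustrative assumptions, not the report's model.

```python
import numpy as np

def fcm_step(state, weights, squash=lambda x: (np.tanh(x) + 1) / 2):
    """One synchronous update of a fuzzy cognitive map: each concept's
    new activation is a squashed weighted sum of the activations of the
    concepts pointing to it (activations stay in (0, 1))."""
    return squash(weights.T @ state)

# Hypothetical 3-concept map: 0 = meteorite shower detected,
# 1 = collision risk, 2 = attitude-change command.
W = np.array([[0.0,  0.9, 0.0],   # detection raises risk
              [0.0,  0.0, 0.8],   # risk triggers the command
              [0.0, -0.5, 0.0]])  # acting on the command reduces risk
state = np.array([1.0, 0.0, 0.0])
for _ in range(10):               # iterate toward an equilibrium
    state = fcm_step(state, W)
```

Simultaneously evaluating data (the detection input) and rules (the weighted edges) in one iteration loop is what lets an FCM react "as a unit," as the abstract puts it.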
40 CFR 1042.905 - Symbols, acronyms, and abbreviations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Symbols, acronyms, and abbreviations... Definitions and Other Reference Information § 1042.905 Symbols, acronyms, and abbreviations. The following symbols, acronyms, and abbreviations apply to this part: ABTAveraging, banking, and trading. AECDauxiliary...
40 CFR 1033.905 - Symbols, acronyms, and abbreviations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Symbols, acronyms, and abbreviations. 1033.905 Section 1033.905 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR....905 Symbols, acronyms, and abbreviations. The following symbols, acronyms, and abbreviations apply to...
Concrete Model Checking with Abstract Matching and Refinement
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Pelanek, Radek; Visser, Willem
2005-01-01
We propose an abstraction-based model checking method which relies on refinement of an under-approximation of the feasible behaviors of the system under analysis. The method preserves errors to safety properties, since all analyzed behaviors are feasible by definition. The method does not require an abstract transition relation to be generated, but instead executes the concrete transitions while storing abstract versions of the concrete states, as specified by a set of abstraction predicates. For each explored transition, the method checks, with the help of a theorem prover, whether there is any loss of precision introduced by abstraction. The results of these checks are used to decide termination or to refine the abstraction, by generating new abstraction predicates. If the (possibly infinite) concrete system under analysis has a finite bisimulation quotient, then the method is guaranteed to eventually explore an equivalent finite bisimilar structure. We illustrate the application of the approach for checking concurrent programs. We also show how a lightweight variant can be used for efficient software testing.
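The core idea of concrete search with abstract matching can be sketched as follows, using an invented toy transition system and a single invented predicate (the real method additionally consults a theorem prover to detect precision loss and refine the predicate set):

```python
def explore(init, successors, predicates):
    """Explore concrete states, pruning when a state maps to an
    already-seen abstract state (a tuple of predicate truth values)."""
    abstract = lambda s: tuple(p(s) for p in predicates)
    seen = {abstract(init)}
    frontier, visited = [init], [init]
    while frontier:
        s = frontier.pop()
        for t in successors(s):
            a = abstract(t)
            if a not in seen:       # match on abstract, store concrete
                seen.add(a)
                visited.append(t)
                frontier.append(t)
    return visited

# Toy system: a counter 0..99 whose successor adds 1. With the single
# predicate "x is even", the search stops after two concrete states,
# even though 100 states are reachable: every further state matches a
# previously stored abstract state (even or odd).
states = explore(0,
                 lambda s: [s + 1] if s < 99 else [],
                 [lambda s: s % 2 == 0])
```

Because only concrete transitions are executed, any error found is a real, feasible error; the price is that a too-coarse predicate set can prune reachable behaviors, which is exactly what the refinement step is there to repair.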
Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles
NASA Technical Reports Server (NTRS)
Gamble, Ed
2012-01-01
Part of the US Department of Transportation investigation of Toyota sudden unintended acceleration (SUA) involved analysis of the throttle control software. The JPL Laboratory for Reliable Software applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.
Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles
NASA Technical Reports Server (NTRS)
Gamble, Ed; Holzmann, Gerard
2011-01-01
Part of the US DOT investigation of Toyota SUA involved analysis of the throttle control software. JPL LaRS applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses
[Rod of Asclepius. Symbol of medicine].
Young, Pablo; Finn, Bárbara C; Bruetman, Julio E; Cesaro Gelos, Jorge; Trimarchi, Hernán
2013-09-01
Symbolism is one of the most archaic forms of human thought. Symbol derives from the Latin word symbolum, and the latter from the Greek symbolon or symballo, which means "I coincide, I make matches". The symbol of Medicine represents a whole series of historical and ethical values. The Rod of Asclepius, with one serpent entwined, has traditionally been the symbol of scientific medicine. In a misconception that has lasted 500 years, the Caduceus of Hermes, entwined by two serpents and with two wings, has been considered the symbol of Medicine. However, the Caduceus is the current symbol of Commerce. The Rod of Asclepius and the Caduceus of Hermes represent two professions, Medicine and Commerce, that, in ethical practice, should not be mixed. Physicians should be aware of their real emblem, its historical origin and meaning.
Which graphic symbols do 4-year-old children choose to represent each of the four basic emotions?
Visser, Naomi; Alant, Erna; Harty, Michal
2008-12-01
The purpose of this study was to investigate which graphic symbols are perceived by typically developing 4-year-old children as the best representation of four basic emotions. Participants were asked to respond to questions by using graphic symbols taken from PCS, PICSYM, and Makaton for four basic emotions: happy, sad, afraid, angry. The purpose was to determine which graphic symbol the children selected as a representation of an emotion. Frequencies of choices per symbol were obtained and the different symbols were analysed in terms of facial features that distinguish them from each other. The most preferred symbol per emotion was also identified. Results showed that children recognized the emotion happy with more ease than the emotions sad, afraid, and angry.
Can vocal conditioning trigger a semiotic ratchet in marmosets?
Turesson, Hjalmar K; Ribeiro, Sidarta
2015-01-01
The complexity of human communication has often been taken as evidence that our language reflects a true evolutionary leap, bearing little resemblance to any other animal communication system. The putative uniqueness of the human language poses serious evolutionary and ethological challenges to a rational explanation of human communication. Here we review ethological, anatomical, molecular, and computational results across several species to set boundaries for these challenges. Results from animal behavior, cognitive psychology, neurobiology, and semiotics indicate that human language shares multiple features with other primate communication systems, such as specialized brain circuits for sensorimotor processing, the capability for indexical (pointing) and symbolic (referential) signaling, the importance of shared intentionality for associative learning, affective conditioning and parental scaffolding of vocal production. The most substantial differences lie in the higher human capacity for symbolic compositionality, fast vertical transmission of new symbols across generations, and irreversible accumulation of novel adaptive behaviors (cultural ratchet). We hypothesize that increasingly complex vocal conditioning of an appropriate animal model may be sufficient to trigger a semiotic ratchet, evidenced by progressive sign complexification, as spontaneous contact calls become indexes, then symbols and finally arguments (strings of symbols). To test this hypothesis, we outline a series of conditioning experiments in the common marmoset (Callithrix jacchus). The experiments are designed to probe the limits of vocal communication in a prosocial, highly vocal primate 35 million years removed from the human lineage, so as to shed light on the mechanisms of semiotic complexification and cultural transmission, and serve as a naturalistic behavioral setting for the investigation of language disorders.
Model selection and assessment for multi-species occupancy models
Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.
2016-01-01
While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.
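As background, the single-species occupancy likelihood that MSOMs generalize can be sketched as follows, assuming a constant occupancy probability psi and detection probability p (a deliberate simplification; the paper's hierarchical, multi-species Bayesian structure is not reproduced here):

```python
import math

def site_likelihood(detections, visits, psi, p):
    """Likelihood of one site's detection history under a simple
    occupancy model: psi = Pr(occupied), p = Pr(detect | occupied).

    detections: number of visits with a detection (0..visits).
    A site with zero detections may be occupied-but-missed or unoccupied,
    so both cases contribute to its likelihood.
    """
    occupied = psi * p**detections * (1 - p)**(visits - detections)
    if detections == 0:
        return occupied + (1 - psi)  # add the truly-unoccupied case
    return occupied

def log_likelihood(histories, visits, psi, p):
    """Sum of log site likelihoods over independent sites."""
    return sum(math.log(site_likelihood(d, visits, psi, p))
               for d in histories)
```

Cross-validation of the kind the study favors would hold out some sites, fit (psi, p) on the rest, and score the held-out detection histories with a predictive criterion built on this likelihood; the hierarchical multi-species extension adds species-level random effects on both parameters.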