Sample records for automata linear rules

  1. Boolean linear differential operators on elementary cellular automata

    NASA Astrophysics Data System (ADS)

    Martín Del Rey, Ángel

    2014-12-01

    In this paper, the notion of a Boolean linear differential operator (BLDO) on elementary cellular automata (ECA) is introduced and some of its most important properties are studied. Special attention is paid to those differential operators whose coefficients are the ECA with rule numbers 90 and 150.
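
    The operators in this paper are built from the linear ECA rules 90 and 150, which update each cell by XOR (addition over GF(2)) of its neighbours. Below is a minimal sketch of those two linear rules and of a simple "rule output XOR identity" operator in the spirit of a Boolean difference; this is an illustration only, not the paper's BLDO formalism.

```python
# Minimal sketch: the linear ECA rules 90 and 150 over GF(2) (XOR), and a
# simple "rule output XOR identity" operator. Illustrative only.
def step_rule90(x):
    n = len(x)
    return [x[(i - 1) % n] ^ x[(i + 1) % n] for i in range(n)]

def step_rule150(x):
    n = len(x)
    return [x[(i - 1) % n] ^ x[i] ^ x[(i + 1) % n] for i in range(n)]

def boolean_difference(x, rule_step):
    # XOR of a configuration with its one-step image under a linear rule.
    return [a ^ b for a, b in zip(x, rule_step(x))]

if __name__ == "__main__":
    x = [0, 0, 0, 1, 0, 0, 0, 1]
    print(step_rule90(x))
    print(step_rule150(x))
    print(boolean_difference(x, step_rule90))
```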

  2. Cellular Automata

    NASA Astrophysics Data System (ADS)

    Gutowitz, Howard

    1991-08-01

    Cellular automata, dynamic systems in which space and time are discrete, are yielding interesting applications in both the physical and natural sciences. The thirty-four contributions in this book cover many aspects of contemporary studies on cellular automata and include reviews, research reports, and guides to recent literature and available software. Chapters cover mathematical analysis; the structure of the space of cellular automata; learning rules with specified properties; cellular automata in biology, physics, chemistry, and computation theory; and generalizations of cellular automata in neural nets, Boolean nets, and coupled map lattices. Current work on cellular automata may be viewed as revolving around two central and closely related problems: the forward problem and the inverse problem. The forward problem concerns the description of properties of given cellular automata. Properties considered include reversibility, invariants, criticality, fractal dimension, and computational power. The role of cellular automata in computation theory is seen as a particularly exciting venue for exploring parallel computers as theoretical and practical tools in mathematical physics. The inverse problem, an area of study gaining prominence particularly in the natural sciences, involves designing rules that possess specified properties or perform specified tasks. A long-term goal is to develop a set of techniques that can find a rule or set of rules that can reproduce quantitative observations of a physical system. Studies of the inverse problem take up the organization and structure of the set of automata, in particular the parameterization of the space of cellular automata. Optimization and learning techniques, like the genetic algorithm and adaptive stochastic cellular automata, are applied to find cellular automaton rules that model such physical phenomena as crystal growth or perform such adaptive-learning tasks as balancing an inverted pole. Howard Gutowitz is

  3. Opinion evolution based on cellular automata rules in small world networks

    NASA Astrophysics Data System (ADS)

    Shi, Xiao-Ming; Shi, Lun; Zhang, Jie-Fang

    2010-03-01

    In this paper, we apply cellular automata rules, which can be given by a truth table, to human memory. We design each memory as a tracking survey mode that keeps the most recent three opinions. Each cellular automata rule, acting as a personal mechanism, gives the final ruling in one time period based on the data stored in one's memory. The key focus of the paper is to study the evolution of people's attitudes to the same question. Based on a large number of empirical observations from computer simulations, all the rules can be classified into 20 groups. We highlight the fact that the phenomenon shown by some rules belonging to the same group will be altered within several steps by other rules in different groups. It is remarkable that, when compared with hundreds of years of presidential voting in America, the eras of important events in America's history coincide with the simulation results obtained by our model.
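
    A hedged sketch of the mechanism described: each agent keeps its last three binary opinions, and an ECA rule number (0-255) is read as a truth table mapping that 3-bit memory to the agent's next opinion. The rule choice and update loop below are illustrative, not the authors' implementation.

```python
import random

def rule_table(rule_number):
    # Wolfram-style truth table: 3-bit memory (as an index 0-7) -> next opinion.
    return [(rule_number >> k) & 1 for k in range(8)]

def next_opinion(memory, table):
    # memory is (oldest, middle, newest); read it as a 3-bit index.
    idx = memory[0] * 4 + memory[1] * 2 + memory[2]
    return table[idx]

if __name__ == "__main__":
    table = rule_table(110)                     # illustrative rule choice
    memory = [random.randint(0, 1) for _ in range(3)]
    for _ in range(5):
        new = next_opinion(memory, table)
        memory = memory[1:] + [new]             # keep the most recent three opinions
        print(memory)
```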

  4. Analytical formulation of cellular automata rules using data models

    NASA Astrophysics Data System (ADS)

    Jaenisch, Holger M.; Handley, James W.

    2009-05-01

    We present a unique method for converting traditional cellular automata (CA) rules into analytical function form. CA rules have been successfully used for morphological image processing and volumetric shape recognition and classification. Further, the use of CA rules as analog models in the physical and biological sciences could be significantly extended if analytical (as opposed to discrete) models could be formulated. We show that such transformations are possible. We use as our example John Horton Conway's famous "Game of Life" rule set. We show that using Data Modeling, we are able to derive both polynomial and bi-spectrum models of the IF-THEN rules that yield equivalent results. Further, we demonstrate that the "Game of Life" rule set can be modeled using the multi-fluxion, yielding a closed-form nth-order derivative and integral. All of the demonstrated analytical forms of the CA rule are general and applicable to real-time use.
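
    The IF-THEN rule set being converted is the standard Game of Life update: a live cell survives with two or three live neighbours, and a dead cell is born with exactly three. Below is a minimal sketch of that discrete rule, i.e. the starting point of the conversion, not the paper's Data Model or multi-fluxion form.

```python
def life_step(grid):
    # One synchronous update of Conway's Game of Life on a toroidal grid.
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            n = sum(grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            # IF-THEN form: birth on 3 neighbours, survival on 2 or 3.
            new[r][c] = 1 if (n == 3 or (grid[r][c] == 1 and n == 2)) else 0
    return new

if __name__ == "__main__":
    glider = [[0, 1, 0, 0, 0],
              [0, 0, 1, 0, 0],
              [1, 1, 1, 0, 0],
              [0, 0, 0, 0, 0],
              [0, 0, 0, 0, 0]]
    for row in life_step(glider):
        print(row)
```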

  5. Inferring the Limit Behavior of Some Elementary Cellular Automata

    NASA Astrophysics Data System (ADS)

    Ruivo, Eurico L. P.; de Oliveira, Pedro P. B.

    Cellular automata locally define dynamical systems, discrete in space, time and in the state variables, capable of displaying arbitrarily complex global emergent behavior. One core question in the study of cellular automata refers to their limit behavior, that is, to the global dynamical features in an infinite time evolution. Previous works have shown that for finite time evolutions, the dynamics of one-dimensional cellular automata can be described by regular languages and, therefore, by finite automata. Such studies have shown the existence of growth patterns in the evolution of such finite automata for some elementary cellular automata rules and also inferred the limit behavior of such rules based upon the growth patterns; however, the results on the limit behavior were obtained manually, by direct inspection of the structures that arise during the time evolution. Here we present the formalization of an automatic method to compute such structures. Based on this, the rules of the elementary cellular automata space were classified according to the existence of a growth pattern in their finite automata. Also, we present a method to infer the limit graph of some elementary cellular automata rules, derived from the analysis of the regular expressions that describe their behavior in finite time. Finally, we analyze some attractors of two rules for which we could not compute the whole limit set.

  6. Nonsynchronous updating in the multiverse of cellular automata

    NASA Astrophysics Data System (ADS)

    Reia, Sandro M.; Kinouchi, Osame

    2015-04-01

    In this paper we study updating effects on cellular automata rule space. We consider a subset of 6144 order-3 automata from the space of 262144 bidimensional outer-totalistic rules. We compare synchronous to asynchronous and sequential updatings. Focusing on two automata, we discuss how update changes destroy typical structures of these rules. In addition, we show that the first-order phase transition in the multiverse of synchronous cellular automata, revealed with the use of a recently introduced control parameter, seems to be robust not only to changes in the update scheme but also to different initial densities.
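
    A hedged sketch of the difference between the update schemes compared here, shown for a one-dimensional binary CA: a synchronous step updates all cells from the old configuration at once, while an asynchronous (sequential, random-order) step updates cells one at a time, each seeing the partially updated state. The rule and lattice size below are illustrative; the paper works with 2D outer-totalistic rules.

```python
import random

def local_rule(left, center, right):
    # Illustrative local rule (ECA rule 110).
    return (110 >> (left * 4 + center * 2 + right)) & 1

def synchronous_step(x):
    n = len(x)
    return [local_rule(x[(i - 1) % n], x[i], x[(i + 1) % n]) for i in range(n)]

def asynchronous_step(x):
    # Random-order sequential update: later cells see already-updated neighbours.
    x = list(x)
    n = len(x)
    for i in random.sample(range(n), n):
        x[i] = local_rule(x[(i - 1) % n], x[i], x[(i + 1) % n])
    return x

if __name__ == "__main__":
    state = [random.randint(0, 1) for _ in range(16)]
    print("sync :", synchronous_step(state))
    print("async:", asynchronous_step(state))
```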

  7. Nonsynchronous updating in the multiverse of cellular automata.

    PubMed

    Reia, Sandro M; Kinouchi, Osame

    2015-04-01

    In this paper we study updating effects on cellular automata rule space. We consider a subset of 6144 order-3 automata from the space of 262144 bidimensional outer-totalistic rules. We compare synchronous to asynchronous and sequential updatings. Focusing on two automata, we discuss how update changes destroy typical structures of these rules. In addition, we show that the first-order phase transition in the multiverse of synchronous cellular automata, revealed with the use of a recently introduced control parameter, seems to be robust not only to changes in the update scheme but also to different initial densities.

  8. Monitoring with Data Automata

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2014-01-01

    We present a form of automaton, referred to as data automata, suited for monitoring sequences of data-carrying events, for example emitted by an executing software system. This form of automata allows states to be parameterized with data, forming named records, which are stored in an efficiently indexed data structure, a form of database. This very explicit approach differs from other automaton-based monitoring approaches. Data automata are also characterized by allowing transition conditions to refer to other parameterized states, and by allowing transition sequences. The presented automaton concept is inspired by rule-based systems, especially the Rete algorithm, which is one of the well-established algorithms for executing rule-based systems. We present an optimized external DSL for data automata, as well as a comparable unoptimized internal DSL (API) in the Scala programming language, in order to compare the two solutions. An evaluation compares these two solutions to several other monitoring systems.

  9. Probabilistic Cellular Automata

    PubMed Central

    Agapie, Alexandru; Giuclea, Marius

    2014-01-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata; otherwise they are called deterministic. Whichever type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case, connecting the probability of a configuration in the stationary distribution to its number of zero-one borders, the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata. PMID:24999557

  10. Probabilistic cellular automata.

    PubMed

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata; otherwise they are called deterministic. Whichever type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case, connecting the probability of a configuration in the stationary distribution to its number of zero-one borders, the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.
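
    A hedged sketch of a synchronous probabilistic cellular automaton of the kind described: each cell becomes 1 with a probability that depends only on the number of ones in its neighbourhood, so the sequence of configurations forms a finite homogeneous Markov chain. The flip probabilities below are illustrative.

```python
import random

def probabilistic_step(x, p_on):
    # p_on[k] = probability that a cell becomes 1 when it sees k ones among
    # itself and its two neighbours (illustrative transition probabilities).
    n = len(x)
    new = []
    for i in range(n):
        k = x[(i - 1) % n] + x[i] + x[(i + 1) % n]
        new.append(1 if random.random() < p_on[k] else 0)
    return new

if __name__ == "__main__":
    random.seed(0)
    state = [random.randint(0, 1) for _ in range(20)]
    p_on = [0.05, 0.3, 0.7, 0.95]   # monotone in the number of ones
    for _ in range(10):
        state = probabilistic_step(state, p_on)
    print(state)
```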

  11. Cellular automata rule characterization and classification using texture descriptors

    NASA Astrophysics Data System (ADS)

    Machicao, Jeaneth; Ribas, Lucas C.; Scabini, Leonardo F. S.; Bruno, Odemir M.

    2018-05-01

    The cellular automata (CA) spatio-temporal patterns have attracted the attention of many researchers, since they can exhibit emergent behavior resulting from the dynamics of each individual cell. In this manuscript, we propose a texture image analysis approach to characterize and classify CA rules. The proposed method converts the CA spatio-temporal pattern into a gray-scale image. The gray level is obtained by creating a binary number based on the 8-connected neighborhood of each site of the CA spatio-temporal pattern. We demonstrate that this technique enhances the CA rule characterization and allows the use of different texture image analysis algorithms. Thus, various texture descriptors were evaluated in a supervised training approach aiming to characterize the CA's global evolution. Our results show the efficiency of the proposed method for the classification of elementary CA (ECAs), reaching a maximum accuracy of 99.57% according to the Li-Packard scheme (6 classes) and 94.36% for the classification of the 88-rule scheme. Moreover, within the image analysis context, we found that the method performs better when the binary states are transformed to gray-scale.
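
    A hedged reading of the conversion step: each site of the binary spacetime diagram is encoded as the byte formed by its eight (8-connected) neighbours, giving a gray level between 0 and 255 on which texture descriptors can then be computed. The bit ordering below is one possible choice, not necessarily the authors'.

```python
def spacetime_to_gray(pattern):
    # pattern: 2D list of 0/1 (rows = time, columns = cells).
    # Each interior site becomes a byte built from its 8-connected neighbours.
    rows, cols = len(pattern), len(pattern[0])
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]          # assumed bit order
    gray = [[0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            value = 0
            for bit, (dr, dc) in enumerate(offsets):
                value |= pattern[r + dr][c + dc] << bit
            gray[r][c] = value                            # gray level in 0..255
    return gray

if __name__ == "__main__":
    # Tiny illustrative pattern (rows = time steps of some ECA evolution).
    pattern = [[0, 1, 0, 1, 0],
               [1, 0, 1, 0, 1],
               [0, 1, 1, 1, 0],
               [1, 0, 0, 0, 1]]
    for row in spacetime_to_gray(pattern):
        print(row)
```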

  12. Linear System Control Using Stochastic Learning Automata

    NASA Technical Reports Server (NTRS)

    Ziyad, Nigel; Cox, E. Lucien; Chouikha, Mohamed F.

    1998-01-01

    This paper explains the use of a Stochastic Learning Automaton (SLA) to control switching between three systems to produce the desired output response. The SLA learns the optimal choice of the damping ratio for each system to achieve a desired result. We show that the SLA can learn these states for the control of an unknown system with the proper choice of the error criteria. The results of using a single automaton are compared to those of using multiple automata.

  13. Lempel-Ziv complexity analysis of one dimensional cellular automata.

    PubMed

    Estevez-Rams, E; Lora-Serrano, R; Nunes, C A J; Aragón-Fernández, B

    2015-12-01

    The Lempel-Ziv complexity measure has been used to estimate the entropy density of a string. It is defined as the number of factors in a production factorization of the string. In this contribution, we show that its use can be extended, by using the normalized information distance, to study the spatiotemporal evolution of random initial configurations under cellular automata rules. In particular, the transfer of information between consecutive configurations in time is studied, as well as the sensitivity to perturbed initial conditions. The behavior of the cellular automata rules can be grouped into different classes, but no single grouping captures the whole nature of the rules involved. The analysis carried out is particularly appropriate for studying the computational processing capabilities of cellular automata rules.
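
    A hedged sketch of the complexity measure as defined here: the string is scanned left to right and split into factors, each being the shortest prefix of the remaining suffix that has not yet occurred in the already-scanned part; the complexity is the number of factors. This is a plain quadratic-time illustration, not the authors' implementation.

```python
def lz76_factor_count(s):
    # Count factors in the exhaustive (LZ76) production factorization of s.
    n, i, factors = len(s), 0, 0
    while i < n:
        k = 1
        # Grow the candidate factor while it still occurs in the prefix
        # that ends just before the candidate's last symbol.
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        factors += 1
        i += k
    return factors

if __name__ == "__main__":
    print(lz76_factor_count("0001101001000101"))   # classic example: 6 factors
    print(lz76_factor_count("0" * 32))             # low complexity: 2 factors
```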

  14. Lempel-Ziv complexity analysis of one dimensional cellular automata

    NASA Astrophysics Data System (ADS)

    Estevez-Rams, E.; Lora-Serrano, R.; Nunes, C. A. J.; Aragón-Fernández, B.

    2015-12-01

    The Lempel-Ziv complexity measure has been used to estimate the entropy density of a string. It is defined as the number of factors in a production factorization of the string. In this contribution, we show that its use can be extended, by using the normalized information distance, to study the spatiotemporal evolution of random initial configurations under cellular automata rules. In particular, the transfer of information between consecutive configurations in time is studied, as well as the sensitivity to perturbed initial conditions. The behavior of the cellular automata rules can be grouped into different classes, but no single grouping captures the whole nature of the rules involved. The analysis carried out is particularly appropriate for studying the computational processing capabilities of cellular automata rules.

  15. Evolution of Cellular Automata toward a LIFE-Like Rule Guided by 1/ƒ Noise

    NASA Astrophysics Data System (ADS)

    Ninagawa, Shigeru

    There is evidence in favor of a relationship between the presence of 1/ƒ noise and computational universality in cellular automata. To confirm the relationship, we search for two-dimensional cellular automata with a 1/ƒ power spectrum by means of genetic algorithms. The power spectrum is calculated from the evolution of the state of the cell, starting from a random initial configuration. The fitness is estimated by the power spectrum with consideration of the spectral similarity to the 1/ƒ spectrum. The result shows that the rule with the highest fitness over the most runs exhibits a 1/ƒ type spectrum and its transition function and behavior are quite similar to those of the Game of Life, which is known to be a computationally universal cellular automaton. These results support the relationship between the presence of 1/ƒ noise and computational universality.
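
    A hedged sketch of the spectral measurement described: record the time series of a cell's state, take the discrete Fourier transform, and fit the log-log slope of the power spectrum; a slope near -1 indicates 1/f-type noise. The fitting details below are illustrative, not the authors' genetic-algorithm fitness function.

```python
import numpy as np

def power_spectrum(series):
    # One-sided power spectrum of a zero-mean time series.
    x = np.asarray(series, dtype=float)
    x -= x.mean()
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0)
    return freqs[1:], spec[1:]                 # drop the zero-frequency bin

def loglog_slope(freqs, spec):
    # Least-squares slope of log(power) vs log(frequency); about -1 for 1/f noise.
    mask = spec > 0
    return np.polyfit(np.log(freqs[mask]), np.log(spec[mask]), 1)[0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    series = rng.integers(0, 2, size=4096)     # white-noise stand-in for a cell's state
    f, p = power_spectrum(series)
    print("slope ~", round(loglog_slope(f, p), 2))   # near 0 for white noise
```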

  16. Cellular Automata Generalized To An Inferential System

    NASA Astrophysics Data System (ADS)

    Blower, David J.

    2007-11-01

    Stephen Wolfram popularized elementary one-dimensional cellular automata in his book, A New Kind of Science. Among many remarkable things, he proved that one of these cellular automata was a Universal Turing Machine. Such cellular automata can be interpreted in a different way by viewing them within the context of the formal manipulation rules from probability theory. Bayes's Theorem is the most famous of such formal rules. As a prelude, we recapitulate Jaynes's presentation of how probability theory generalizes classical logic using modus ponens as the canonical example. We emphasize the important conceptual standing of Boolean Algebra for the formal rules of probability manipulation and give an alternative demonstration augmenting and complementing Jaynes's derivation. We show the complementary roles played in arguments of this kind by Bayes's Theorem and joint probability tables. A good explanation for all of this is afforded by the expansion of any particular logic function via the disjunctive normal form (DNF). The DNF expansion is a useful heuristic emphasized in this exposition because such expansions point out where relevant 0s should be placed in the joint probability tables for logic functions involving any number of variables. It then becomes a straightforward exercise to rely on Boolean Algebra, Bayes's Theorem, and joint probability tables in extrapolating to Wolfram's cellular automata. Cellular automata are seen as purely deductive systems, just like classical logic, which probability theory is then able to generalize. Thus, any uncertainties which we might like to introduce into the discussion about cellular automata are handled with ease via the familiar inferential path. Most importantly, the difficult problem of predicting what cellular automata will do in the far future is treated like any inferential prediction problem.
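
    A hedged illustration of the DNF heuristic mentioned here: read an elementary rule's truth table off its Wolfram number and write the rule as a disjunction of conjunctions, one term per neighbourhood that maps to 1. Rule 90 is used as an example; the function name and output format are illustrative.

```python
from itertools import product

def eca_dnf(rule_number):
    # Disjunctive normal form of an elementary CA rule over variables (l, c, r).
    terms = []
    for l, c, r in product([1, 0], repeat=3):
        idx = l * 4 + c * 2 + r
        if (rule_number >> idx) & 1:
            lits = [name if bit else "~" + name
                    for name, bit in zip("lcr", (l, c, r))]
            terms.append("(" + " & ".join(lits) + ")")
    return " | ".join(terms) if terms else "0"

if __name__ == "__main__":
    print(eca_dnf(90))    # rule 90: output is l XOR r, expanded into DNF
```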

  17. Quantum cellular automata and free quantum field theory

    NASA Astrophysics Data System (ADS)

    D'Ariano, Giacomo Mauro; Perinotti, Paolo

    2017-02-01

    In a series of recent papers [1-4] it has been shown how free quantum field theory can be derived without using mechanical primitives (including space-time, special relativity, quantization rules, etc.), but only by considering the simplest quantum algorithm encompassing a countable set of quantum systems whose network of interactions satisfies the simple principles of unitarity, homogeneity, locality, and isotropy. This has opened the route to extending the axiomatic information-theoretic derivation of the quantum theory of abstract systems [5, 6] to include quantum field theory. The inherent discrete nature of the informational axiomatization leads to an extension of quantum field theory to a quantum cellular automata theory, where the usual field theory is recovered in a regime where the discrete structure of the automata cannot be probed. A simple heuristic argument sets the scale of discreteness to the Planck scale, and the customary physical regime where discreteness is not visible is the relativistic one of small wavevectors. In this paper we provide a thorough derivation from principles showing that, in the most general case, the graph of the quantum cellular automaton is the Cayley graph of a finitely presented group, and we show how, in the case corresponding to Euclidean emergent space (where the group reduces to an Abelian one), the automaton leads to the Weyl, Dirac and Maxwell field dynamics in the relativistic limit. We conclude with some perspectives towards the more general scenario of non-linear automata for interacting quantum field theory.

  18. Genetic Algorithm Calibration of Probabilistic Cellular Automata for Modeling Mining Permit Activity

    USGS Publications Warehouse

    Louis, S.J.; Raines, G.L.

    2003-01-01

    We use a genetic algorithm to calibrate a spatially and temporally resolved cellular automaton to model mining activity on public land in Idaho and western Montana. The genetic algorithm searches through a space of transition rule parameters of a two-dimensional cellular automata model to find rule parameters that fit observed mining activity data. Previous work by one of the authors on calibrating the cellular automaton took weeks; the genetic algorithm takes a day and produces rules leading to about the same (or better) fit to observed data. These preliminary results indicate that genetic algorithms are a viable tool for calibrating cellular automata for this application. Experience gained during the calibration of this cellular automaton suggests that mineral resource information is a critical factor in the quality of the results. With automated calibration, further refinements of how the mineral-resource information is provided to the cellular automaton will probably improve our model.

  19. Failover in Cellular Automata

    NASA Astrophysics Data System (ADS)

    Kumar, Shailesh; Rao, Shrisha

    This paper studies a phenomenon called failover, and shows that this phenomenon (in particular, stateless failover) can be modeled by Game of Life cellular automata. This is the first time that this sophisticated real-life system behavior has been modeled in abstract terms. A cellular automata (CA) configuration is constructed that exhibits emergent failover. The configuration is based on standard Game of Life rules. Gliders and glider-guns form the core messaging structure in the configuration. The blinker is represented as the basic computational unit, and it is shown how it can be recreated in case of a failure. Stateless failover using the primary-backup mechanism is demonstrated. The details of the CA components used in the configuration and its working are described, and a simulation of the complete configuration is also presented.

  20. Color image encryption based on hybrid hyper-chaotic system and cellular automata

    NASA Astrophysics Data System (ADS)

    Yaghouti Niyat, Abolfazl; Moattar, Mohammad Hossein; Niazi Torshiz, Masood

    2017-03-01

    This paper proposes an image encryption scheme based on Cellular Automata (CA). A CA is a self-organizing structure with a set of cells in which each cell is updated by certain rules that depend on a limited number of neighboring cells. The major disadvantages of cellular automata in cryptography include the limited number of reversal rules and the inability to produce long sequences of states with these rules. In this paper, a non-uniform cellular automata framework is proposed to solve this problem. The proposed scheme consists of confusion and diffusion steps. In the confusion step, the positions of the original image pixels are permuted by chaos mapping. The key image is created using non-uniform cellular automata, and then the hyper-chaotic mapping is used to select random numbers from the key image for encryption. The main contribution of the paper is the application of hyper-chaotic functions and non-uniform CA for robust key image generation. Security analysis and experimental results show that the proposed method has a very large key space and is resistant to noise and attacks. The correlation between adjacent pixels in the encrypted image is reduced, and the entropy is 7.9991, which is very close to the ideal value of 8.
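
    A hedged sketch of the entropy figure quoted at the end: the Shannon entropy of the encrypted image's 8-bit pixel histogram, which approaches 8 bits/pixel when the ciphertext pixels are uniformly distributed. This is a plain-Python illustration, not the authors' evaluation code.

```python
import math
import random
from collections import Counter

def image_entropy(pixels):
    # Shannon entropy (bits per pixel) of a flat list of 8-bit pixel values.
    counts = Counter(pixels)
    total = len(pixels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

if __name__ == "__main__":
    random.seed(0)
    cipher_like = [random.randrange(256) for _ in range(256 * 256)]
    print(round(image_entropy(cipher_like), 4))   # close to the ideal value of 8
```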

  1. Using cellular automata to generate image representation for biological sequences.

    PubMed

    Xiao, X; Shao, S; Ding, Y; Huang, Z; Chen, X; Chou, K-C

    2005-02-01

    A novel approach to visualizing biological sequences is developed based on cellular automata (Wolfram, S. Nature 1984, 311, 419-424), a set of discrete dynamical systems in which space and time are discrete. By transforming the symbolic sequence codes into digital codes, and using some optimal space-time evolvement rules of cellular automata, a biological sequence can be represented by a unique image, the so-called cellular automata image. Many important features, which are originally hidden in a long and complicated biological sequence, can be clearly revealed through its cellular automata image. With the number of biological sequences entering databanks rapidly increasing in the post-genomic era, it is anticipated that the cellular automata image will become a very useful vehicle for investigating their key features, identifying their functions, and revealing their "fingerprints". It is anticipated that by using the concept of the pseudo amino acid composition (Chou, K.C. Proteins: Structure, Function, and Genetics, 2001, 43, 246-255), the cellular automata image approach can also be used to improve the quality of predicting protein attributes, such as structural class and subcellular location.

  2. Automata-Based Verification of Temporal Properties on Running Programs

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Havelund, Klaus; Lan, Sonie (Technical Monitor)

    2001-01-01

    This paper presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Buchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
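
    A hedged toy of the observer idea: a finite-state automaton derived here by hand (not by the paper's translation algorithm) from the property "every request is eventually acknowledged", run over a finite trace of events; at the end of the trace the monitor reports whether the property held. Event names and states are illustrative.

```python
# Toy finite-trace monitor for "every 'request' is eventually followed by 'ack'".
# The automaton is written by hand; the paper generates such observers
# automatically from LTL formulae.
TRANSITIONS = {
    ("idle", "request"): "waiting",
    ("idle", "ack"): "idle",
    ("waiting", "ack"): "idle",
    ("waiting", "request"): "waiting",
}
ACCEPTING = {"idle"}        # at the end of the trace, no request may be pending

def monitor(trace):
    state = "idle"
    for event in trace:
        state = TRANSITIONS.get((state, event), state)   # ignore unrelated events
    return state in ACCEPTING

if __name__ == "__main__":
    print(monitor(["request", "work", "ack", "request", "ack"]))   # True
    print(monitor(["request", "work"]))                            # False
```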

  3. Minimal entropy approximation for cellular automata

    NASA Astrophysics Data System (ADS)

    Fukś, Henryk

    2014-02-01

    We present a method for the construction of approximate orbits of measures under the action of cellular automata which is complementary to the local structure theory. The local structure theory is based on the idea of Bayesian extension, that is, construction of a probability measure consistent with given block probabilities and maximizing entropy. If instead of maximizing entropy one minimizes it, one can develop another method for the construction of approximate orbits, at the heart of which is the iteration of finite-dimensional maps, called minimal entropy maps. We present numerical evidence that the minimal entropy approximation sometimes outperforms the local structure theory in characterizing the properties of cellular automata. The density response curve for elementary CA rule 26 is used to illustrate this claim.
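
    To make the idea of approximating orbits of measures concrete, here is the simplest (order-1, mean-field) version of the kind of finite-dimensional map involved: given a density p of ones, the next density is the probability that a random 3-cell neighbourhood drawn from the product measure maps to 1 under the rule. This illustrates the maximum-entropy (product-measure) extension; the paper's minimal-entropy construction replaces that extension and is not shown here.

```python
def mean_field_map(rule_number, p):
    # Probability that a neighbourhood drawn i.i.d. with density p maps to 1.
    total = 0.0
    for idx in range(8):
        if (rule_number >> idx) & 1:
            bits = [(idx >> 2) & 1, (idx >> 1) & 1, idx & 1]
            ones = sum(bits)
            total += p ** ones * (1 - p) ** (3 - ones)
    return total

if __name__ == "__main__":
    p = 0.5
    for t in range(10):                    # iterate the map for ECA rule 26
        p = mean_field_map(26, p)
    print(round(p, 4))
```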

  4. Efficient Translation of LTL Formulae into Buchi Automata

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Lerda, Flavio

    2001-01-01

    Model checking is a fully automated technique for checking that a system satisfies a set of required properties. With explicit-state model checkers, properties are typically defined in linear-time temporal logic (LTL), and are translated into Büchi automata in order to be checked. This report presents how we have combined and improved existing techniques to obtain an efficient LTL to Büchi automata translator. In particular, we optimize the core of existing tableau-based approaches to generate significantly smaller automata. Our approach has been implemented and is being released as part of the Java PathFinder software (JPF), an explicit state model checker under development at the NASA Ames Research Center.

  5. Usage Automata

    NASA Astrophysics Data System (ADS)

    Bartoletti, Massimo

    Usage automata are an extension of finite state automata, with some additional features (e.g. parameters and guards) that improve their expressivity. Usage automata are expressive enough to model security requirements of real-world applications; at the same time, they are simple enough to be amenable to static analysis, e.g. they can be model-checked against abstractions of program usages. We study here some foundational aspects of usage automata. In particular, we discuss their expressive power, and their effective use in run-time mechanisms for enforcing usage policies.

  6. Generic framework for mining cellular automata models on protein-folding simulations.

    PubMed

    Diaz, N; Tischer, I

    2016-05-13

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed using a methodology based on design patterns, which improves the experience of developing new algorithms. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future uses for the new tool are outlined.

  7. Simulation of root forms using cellular automata model

    NASA Astrophysics Data System (ADS)

    Winarno, Nanang; Prima, Eka Cahya; Afifah, Ratih Mega Ayu

    2016-02-01

    This research aims to produce a simulation program for root forms using a cellular automata model. Stephen Wolfram, in his book entitled "A New Kind of Science", discusses formation rules based on statistical analysis. In accordance with Stephen Wolfram's investigation, this research develops a basic computer program using the Delphi 7 programming language. To the best of our knowledge, no previous research has developed a simulation of root forms using the cellular automata model and compared it with natural root forms in the presence of stones as a disturbance. The results show that (1) the simulation used four rules, the program's results were compared with natural photographs, and each rule showed different root forms; and (2) the stone disturbances preventing root growth and the multiplication of root forms were successfully modeled. For this purpose, the research added stones with a size of 120 cells, placed randomly in the soil. As in nature, stones cannot be penetrated by plant roots. The results suggest that the root-form simulation program can be further developed with 50 variations.

  8. Simulation of root forms using cellular automata model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winarno, Nanang, E-mail: nanang-winarno@upi.edu; Prima, Eka Cahya; Afifah, Ratih Mega Ayu

    This research aims to produce a simulation program for root forms using a cellular automata model. Stephen Wolfram, in his book entitled "A New Kind of Science", discusses formation rules based on statistical analysis. In accordance with Stephen Wolfram's investigation, this research develops a basic computer program using the Delphi 7 programming language. To the best of our knowledge, no previous research has developed a simulation of root forms using the cellular automata model and compared it with natural root forms in the presence of stones as a disturbance. The results show that (1) the simulation used four rules, the program's results were compared with natural photographs, and each rule showed different root forms; and (2) the stone disturbances preventing root growth and the multiplication of root forms were successfully modeled. For this purpose, the research added stones with a size of 120 cells, placed randomly in the soil. As in nature, stones cannot be penetrated by plant roots. The results suggest that the root-form simulation program can be further developed with 50 variations.

  9. Configurable Cellular Automata for Pseudorandom Number Generation

    NASA Astrophysics Data System (ADS)

    Quieta, Marie Therese; Guan, Sheng-Uei

    This paper proposes a generalized structure of cellular automata (CA), the configurable cellular automata (CoCA). With selected properties from programmable CA (PCA) and controllable CA (CCA), a new approach to cellular automata is developed. In CoCA, the cells are dynamically reconfigured at run-time via a control CA. Reconfiguration of a cell simply means varying the properties of that cell with time. Some examples of properties to be reconfigured are rule selection, boundary condition, and radius. While the objective of this paper is to propose CoCA as a new CA method, the main focus is to design a CoCA that can function as a good pseudorandom number generator (PRNG). As a PRNG, CoCA is a suitable candidate, as it can pass 17 out of 18 Diehard tests with 31 cells. Based on the Diehard test, the CoCA PRNG's performance is considered superior to that of other CA PRNGs. Moreover, CoCA opens new avenues for research, not only in the field of random number generation, but also in the modeling of complex systems.
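
    A hedged illustration of how a CA can serve as a PRNG: the classic construction extracts the centre-cell bit of a rule-30 evolution as a pseudorandom bit stream. This is the baseline idea that CoCA generalizes, not the CoCA design itself (which reconfigures rules, boundaries and radius at run time).

```python
def rule30_bits(width=65, nbits=32, seed=None):
    # Classic CA PRNG baseline: evolve rule 30 and emit the centre cell each step.
    cells = seed[:] if seed else [0] * width
    if not seed:
        cells[width // 2] = 1                      # single-seed initial condition
    out = []
    for _ in range(nbits):
        out.append(cells[width // 2])
        cells = [(30 >> (cells[(i - 1) % width] * 4 + cells[i] * 2
                         + cells[(i + 1) % width])) & 1 for i in range(width)]
    return out

if __name__ == "__main__":
    bits = rule30_bits()
    print("".join(map(str, bits)))
```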

  10. QM Automata: A New Class of Restricted Quantum Membrane Automata.

    PubMed

    Giannakis, Konstantinos; Singh, Alexandros; Kastampolidou, Kalliopi; Papalitsas, Christos; Andronikos, Theodore

    2017-01-01

    The term "Unconventional Computing" describes the use of non-standard methods and models in computing. It is a recently established field, with many interesting and promising results. In this work we combine notions from quantum computing with aspects of membrane computing to define what we call QM automata. Specifically, we introduce a variant of quantum membrane automata that operate in accordance with the principles of quantum computing. We explore the functionality and capabilities of the QM automata through indicative examples. Finally we suggest future directions for research on QM automata.

  11. Modeling and Density Estimation of an Urban Freeway Network Based on Dynamic Graph Hybrid Automata

    PubMed Central

    Chen, Yangzhou; Guo, Yuqi; Wang, Ying

    2017-01-01

    In this paper, in order to describe complex network systems, we first propose a general modeling framework by combining a dynamic graph with hybrid automata, which we name Dynamic Graph Hybrid Automata (DGHA). Then we apply this framework to model traffic flow over an urban freeway network by embedding the Cell Transmission Model (CTM) into the DGHA. In the modeling procedure, we adopt a dual digraph of the road network structure to describe the road topology, use linear hybrid automata to describe multi-modes of dynamic densities in road segments, and transform the nonlinear expressions of the transmitted traffic flow between two road segments into piecewise linear functions in terms of multi-mode switchings. This modeling procedure is modularized and rule-based, and thus is easily extensible with the help of a combination algorithm for the dynamics of traffic flow. It can describe the dynamics of traffic flow over an urban freeway network with arbitrary topology structures and sizes. Next we analyze the mode types and number in the model of the whole freeway network, and deduce a Piecewise Affine Linear System (PWALS) model. Furthermore, based on the PWALS model, a multi-mode switched state observer is designed to estimate the traffic densities of the freeway network, where a set of observer gain matrices are computed by using the Lyapunov function approach. As an example, we apply the PWALS model and the corresponding switched state observer to traffic flow over Beijing's third ring road. In order to clearly interpret the principle of the proposed method and avoid computational complexity, we adopt a simplified version of Beijing's third ring road. Practical application to a large-scale road network will be implemented by a decentralized modeling approach and distributed observer design in future research. PMID:28353664

  12. Modeling and Density Estimation of an Urban Freeway Network Based on Dynamic Graph Hybrid Automata.

    PubMed

    Chen, Yangzhou; Guo, Yuqi; Wang, Ying

    2017-03-29

    In this paper, in order to describe complex network systems, we first propose a general modeling framework by combining a dynamic graph with hybrid automata, which we name Dynamic Graph Hybrid Automata (DGHA). Then we apply this framework to model traffic flow over an urban freeway network by embedding the Cell Transmission Model (CTM) into the DGHA. In the modeling procedure, we adopt a dual digraph of the road network structure to describe the road topology, use linear hybrid automata to describe multi-modes of dynamic densities in road segments, and transform the nonlinear expressions of the transmitted traffic flow between two road segments into piecewise linear functions in terms of multi-mode switchings. This modeling procedure is modularized and rule-based, and thus is easily extensible with the help of a combination algorithm for the dynamics of traffic flow. It can describe the dynamics of traffic flow over an urban freeway network with arbitrary topology structures and sizes. Next we analyze the mode types and number in the model of the whole freeway network, and deduce a Piecewise Affine Linear System (PWALS) model. Furthermore, based on the PWALS model, a multi-mode switched state observer is designed to estimate the traffic densities of the freeway network, where a set of observer gain matrices are computed by using the Lyapunov function approach. As an example, we apply the PWALS model and the corresponding switched state observer to traffic flow over Beijing's third ring road. In order to clearly interpret the principle of the proposed method and avoid computational complexity, we adopt a simplified version of Beijing's third ring road. Practical application to a large-scale road network will be implemented by a decentralized modeling approach and distributed observer design in future research.
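
    A hedged sketch of the Cell Transmission Model update that the DGHA embeds for each road segment: the flow between consecutive cells is the minimum of what the upstream cell can send and what the downstream cell can receive, and densities are updated by flow conservation. Parameter values below are illustrative, not the paper's calibration.

```python
def ctm_step(rho, v=60.0, w=20.0, rho_jam=120.0, q_max=1800.0,
             dt=10 / 3600.0, dx=0.5, inflow=1200.0):
    # One CTM update on a single link discretized into cells.
    # rho: vehicle densities (veh/km); flows in veh/h; dt in h; dx in km.
    n = len(rho)
    flows = []
    for i in range(n + 1):
        send = inflow if i == 0 else min(v * rho[i - 1], q_max)
        recv = q_max if i == n else min(w * (rho_jam - rho[i]), q_max)
        flows.append(min(send, recv))
    return [rho[i] + dt / dx * (flows[i] - flows[i + 1]) for i in range(n)]

if __name__ == "__main__":
    rho = [20.0] * 8
    for _ in range(30):
        rho = ctm_step(rho)
    print([round(r, 1) for r in rho])
```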

  13. Refining Linear Fuzzy Rules by Reinforcement Learning

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.; Khedkar, Pratap S.; Malkani, Anil

    1996-01-01

    Linear fuzzy rules are increasingly being used in the development of fuzzy logic systems. Radial basis functions have also been used in the antecedents of the rules for clustering in product space, which can automatically generate a set of linear fuzzy rules from an input/output data set. Manual methods are usually used in refining these rules. This paper presents a method for refining the parameters of these rules using reinforcement learning, which can be applied in domains where supervised input-output data is not available and reinforcements are received only after a long sequence of actions. This is shown for a generalization of radial basis functions. The formation of fuzzy rules from data and their automatic refinement is an important step in closing the gap between the application of reinforcement learning methods and the domains where only some limited input-output data is available.

  14. Construction of living cellular automata using the Physarum plasmodium

    NASA Astrophysics Data System (ADS)

    Shirakawa, Tomohiro; Sato, Hiroshi; Ishiguro, Shinji

    2015-04-01

    The plasmodium of Physarum polycephalum is a unicellular and multinuclear giant amoeba with an amorphous cell body. To clearly observe how the plasmodium makes decisions in its motile and exploratory behaviours, we developed a new experimental system to pseudo-discretize the motility of the organism. In our experimental space, which has agar surfaces arranged in a two-dimensional lattice, the continuous and omnidirectional movement of the plasmodium was restricted to stepwise movement, and the direction of locomotion was restricted to the four neighbours. In such an experimental system, a cellular automata-like system was constructed using the living cell. We further analysed the exploratory behaviours of the plasmodium by duplicating the experimental results in simulation models of cellular automata. As a result, it was revealed that the behaviours of the plasmodium cannot be reproduced by local state transition rules alone; a kind of history-dependent rule setting is needed for their reproduction.

  15. A Compositional Translation of Timed Automata with Deadlines to Uppaal Timed Automata

    NASA Astrophysics Data System (ADS)

    Gómez, Rodolfo

    Timed Automata with Deadlines (TAD) are a form of timed automata that admit a more natural representation of urgent actions, with the additional advantage of avoiding the most common form of timelocks. We offer a compositional translation of a practically useful subset of TAD to timed safety automata (the well-known variant of timed automata where time progress conditions are expressed by invariants). More precisely, we translate networks of TAD to the modeling language of Uppaal, a state-of-the-art verification tool for timed automata. We also describe an implementation of this translation, which allows Uppaal to aid the design and analysis of TAD models.

  16. A Cellular Automata-based Model for Simulating Restitution Property in a Single Heart Cell.

    PubMed

    Sabzpoushan, Seyed Hojjat; Pourhasanzade, Fateme

    2011-01-01

    Ventricular fibrillation is the cause of most sudden deaths. Restitution is one of the specific properties of the ventricular cell. Recent findings have clearly demonstrated the correlation between the slope of the restitution curve and ventricular fibrillation; the modeling of cellular restitution therefore gains high importance. A cellular automaton is a powerful tool for simulating complex phenomena in a simple language. A cellular automaton is a lattice of cells where the behavior of each cell is determined by the behavior of its neighboring cells as well as the automata rule. In this paper, a simple model is presented for the simulation of the restitution property in a single cardiac cell using cellular automata. First, two state variables, action potential and recovery, are introduced into the automata model. Second, the automata rule is determined, and the recovery variable is then defined in such a way that restitution is developed. In order to evaluate the proposed model, the restitution curve generated in our study is compared with restitution curves from the experimental findings of valid sources. Our findings indicate that the presented model is not only capable of simulating restitution in a cardiac cell, but also possesses the capability of regulating the restitution curve.

  17. A cryptographic hash function based on chaotic network automata

    NASA Astrophysics Data System (ADS)

    Machicao, Jeaneth; Bruno, Odemir M.

    2017-12-01

    Chaos theory has been used to develop several cryptographic methods relying on the pseudo-random properties extracted from simple nonlinear systems such as cellular automata (CA). Cryptographic hash functions (CHF) are commonly used to check data integrity. A CHF "compresses" arbitrarily long messages (input) into much smaller representations called hash values or message digests (output), designed to prevent the ability to reverse the hash value into the original message. This paper proposes a chaos-based CHF inspired by an encryption method based on the chaotic CA rule B1357-S2468. Here, we propose a hybrid model that combines CA and networks, called chaotic network automata (CNA), whose chaotic spatio-temporal outputs are used to compute a hash value. Following the Merkle-Damgård model of construction, a portion of the message is entered as the initial condition of the network automata, and the remaining parts of the message are iteratively entered to perturb the system. The chaotic network automata shuffle the message using flexible control parameters, so that the generated hash value is highly sensitive to the message. As demonstrated in our experiments, the proposed model has excellent pseudo-randomness and sensitivity properties with acceptable performance when compared to conventional hash functions.
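
    A hedged toy of the construction pattern described (absorb part of the message as the initial condition, then iteratively perturb the state with the remaining blocks and read the final state as the digest), using a 1D rule-30 automaton instead of the paper's chaotic network automata. This is purely illustrative and not a secure hash.

```python
def eca_step(cells, rule=30):
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2
                      + cells[(i + 1) % n])) & 1 for i in range(n)]

def toy_ca_hash(message, state_bits=64, rounds=8):
    # Merkle-Damgard-style absorption: XOR each message byte into the state,
    # then iterate the CA to mix; read the final state as the digest.
    state = [0] * state_bits
    for byte in message.encode("utf-8"):
        for b in range(8):
            state[b] ^= (byte >> b) & 1
        for _ in range(rounds):
            state = eca_step(state)
    digest = 0
    for bit in state:
        digest = (digest << 1) | bit
    return format(digest, "0%dx" % (state_bits // 4))

if __name__ == "__main__":
    print(toy_ca_hash("cellular automata"))
    print(toy_ca_hash("cellular automatb"))    # small change, very different digest
```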

  18. Object Synthesis in Conway's Game of Life and Other Cellular Automata

    NASA Astrophysics Data System (ADS)

    Niemiec, Mark D.

    Of the very large number of cellular automata rules in existence, a relatively small number of rules may be considered interesting. Some of the features that make such rules interesting permit patterns to expand, contract, separate into multiple sub-patterns, or combine with other patterns. Such rules generally include still-lifes, oscillators, spaceships, spaceship guns, and puffer trains. Such structures can often be used to construct more complicated computational circuitry, and rules that contain them can often be shown to be computationally universal. Conway's Game of Life is one rule that has been well-studied for several decades, and has been shown to be very fruitful in this regard.

  19. Weighted Watson-Crick automata

    NASA Astrophysics Data System (ADS)

    Tamrin, Mohd Izzuddin Mohd; Turaev, Sherzod; Sembok, Tengku Mohd Tengku

    2014-07-01

    There is a tremendous amount of work in biotechnology, especially in the area of DNA molecules. The computing community is attempting to develop smaller computing devices through computational models that are based on operations performed on DNA molecules. A Watson-Crick automaton, a theoretical model for DNA-based computation, has two reading heads and works on double-stranded sequences of the input related by a complementarity relation similar to the Watson-Crick complementarity of DNA nucleotides. Over time, several variants of Watson-Crick automata have been introduced and investigated. However, they cannot be used as suitable DNA-based computational models for molecular stochastic processes and fuzzy processes that are related to important practical problems such as molecular parsing, gene disease detection, and food authentication. In this paper we define new variants of Watson-Crick automata, called weighted Watson-Crick automata, developing theoretical models for molecular stochastic and fuzzy processes. We define weighted Watson-Crick automata by adapting weight restriction mechanisms associated with formal grammars and automata. We also study the generative capacities of weighted Watson-Crick automata, including probabilistic and fuzzy variants. We show that the weighted variants of Watson-Crick automata increase their generative power.

  20. Efficient Algorithms for Handling Nondeterministic Automata

    NASA Astrophysics Data System (ADS)

    Vojnar, Tomáš

    Finite (word, tree, or omega) automata play an important role in different areas of computer science, including, for instance, formal verification. Often, deterministic automata are used, for which traditional algorithms for important operations such as minimisation and inclusion checking are available. However, the use of deterministic automata implies a need to determinise nondeterministic automata that often arise during various computations, even when the computations start with deterministic automata. Unfortunately, determinisation is a very expensive step, since deterministic automata may be exponentially bigger than the original nondeterministic automata. That is why it appears advantageous to avoid determinisation and work directly with nondeterministic automata. This, however, brings a need to be able to implement operations traditionally done on deterministic automata on nondeterministic automata instead. In particular, this is the case for inclusion checking and minimisation (or rather reduction of the size of automata). In the talk, we review several recently proposed techniques for inclusion checking on nondeterministic finite word and tree automata as well as Büchi automata. These techniques are based on the use of so-called antichains, possibly combined with the use of suitable simulation relations (and, in the case of Büchi automata, the so-called Ramsey-based or rank-based approaches). Further, we discuss techniques for reducing the size of nondeterministic word and tree automata using quotienting based on the recently proposed notion of mediated equivalences. The talk is based on several joint works with Parosh Aziz Abdulla, Ahmed Bouajjani, Yu-Fang Chen, Peter Habermehl, Lisa Kaati, Richard Mayr, Tayssir Touili, Lorenzo Clemente, Lukáš Holík, and Chih-Duo Hong.

  1. Towards a voxel-based geographic automata for the simulation of geospatial processes

    NASA Astrophysics Data System (ADS)

    Jjumba, Anthony; Dragićević, Suzana

    2016-07-01

    Many geographic processes evolve in a three dimensional space and time continuum. However, when they are represented with the aid of geographic information systems (GIS) or geosimulation models they are modelled in a framework of two-dimensional space with an added temporal component. The objective of this study is to propose the design and implementation of voxel-based automata as a methodological approach for representing spatial processes evolving in the four-dimensional (4D) space-time domain. Similar to geographic automata models which are developed to capture and forecast geospatial processes that change in a two-dimensional spatial framework using cells (raster geospatial data), voxel automata rely on the automata theory and use three-dimensional volumetric units (voxels). Transition rules have been developed to represent various spatial processes which range from the movement of an object in 3D to the diffusion of airborne particles and landslide simulation. In addition, the proposed 4D models demonstrate that complex processes can be readily reproduced from simple transition functions without complex methodological approaches. The voxel-based automata approach provides a unique basis to model geospatial processes in 4D for the purpose of improving representation, analysis and understanding their spatiotemporal dynamics. This study contributes to the advancement of the concepts and framework of 4D GIS.

  2. Free Quantum Field Theory from Quantum Cellular Automata

    NASA Astrophysics Data System (ADS)

    Bisio, Alessandro; D'Ariano, Giacomo Mauro; Perinotti, Paolo; Tosini, Alessandro

    2015-10-01

    After leading to a new axiomatic derivation of quantum theory (see D'Ariano et al. in Found Phys, 2015), the new informational paradigm is entering the domain of quantum field theory, suggesting a quantum automata framework that can be regarded as an extension of quantum field theory to include a hypothetical Planck scale, with the usual quantum field theory recovered in the relativistic limit of small wave-vectors. Being derived from simple principles (linearity, unitarity, locality, homogeneity, isotropy, and minimality of dimension), the automata theory is quantum ab initio, and does not assume Lorentz covariance or mechanical notions. Being discrete, it can describe localized states and measurements (unmanageable by quantum field theory), solving all the issues plaguing field theory that originate from the continuum. These features make the theory an ideal framework for quantum gravity, with relativistic covariance and space-time emerging solely from the interactions, and not assumed a priori. The paper presents a synthetic derivation of the automata theory, showing how the principles lead to a description in terms of a quantum automaton over a Cayley graph of a group. Restricting to Abelian groups, we show how the automata recover the Weyl, Dirac and Maxwell dynamics in the relativistic limit. We conclude with some new routes toward the more general scenario of non-Abelian Cayley graphs. The phenomenology arising from the automata theory in the ultra-relativistic domain and the analysis of the corresponding distorted Lorentz covariance are reviewed in Bisio et al. (Found Phys 2015, in this same issue).

  3. Definition and application of a five-parameter characterization of one-dimensional cellular automata rule space.

    PubMed

    Oliveira, G M; de Oliveira, P P; Omar, N

    2001-01-01

    Cellular automata (CA) are important as prototypical, spatially extended, discrete dynamical systems. Because the problem of forecasting the dynamic behavior of CA is undecidable, various parameter-based approximations have been developed to address the problem. Out of the analysis of the most important parameters available to this end, we propose some guidelines that should be followed when defining a parameter of that kind. Based upon the guidelines, new parameters were proposed and a set of five parameters was selected; two of them were drawn from the literature and three are new ones, defined here. This article presents all of them and makes their qualities evident. Then, two results are described, related to the use of the parameter set in the Elementary Rule Space: a phase transition diagram, and some general heuristics for forecasting the dynamics of one-dimensional CA. Finally, as an example of the application of the selected parameters in high-cardinality spaces, results are presented from experiments involving the evolution of radius-3 CA in the Density Classification Task, and radius-2 CA in the Synchronization Task.

  4. Parallel Implementation of Triangular Cellular Automata for Computing Two-Dimensional Elastodynamic Response on Arbitrary Domains

    NASA Astrophysics Data System (ADS)

    Leamy, Michael J.; Springer, Adam C.

    In this research we report a parallel implementation of a Cellular Automata-based simulation tool for computing elastodynamic response on complex, two-dimensional domains. Elastodynamic simulation using Cellular Automata (CA) has recently been presented as an alternative, inherently object-oriented technique for accurately and efficiently computing linear and nonlinear wave propagation in arbitrarily-shaped geometries. The local, autonomous nature of the method should lead to straightforward and efficient parallelization. We address this notion on symmetric multiprocessor (SMP) hardware using a Java-based object-oriented CA code implementing triangular state machines (i.e., automata) and the MPI bindings written in Java (MPJ Express). We use MPJ Express to reconfigure our existing CA code to distribute a domain's automata to the cores present on a dual quad-core shared-memory system (eight total processors). We note that this message-passing parallelization strategy is directly applicable to cluster computing, which will be the focus of follow-on research. Results on the shared-memory platform indicate nearly ideal, linear speed-up. We conclude that the CA-based elastodynamic simulator is easily configured to run in parallel, and yields excellent speed-up on SMP hardware.

  5. Structure and Reversibility of 2D von Neumann Cellular Automata Over Triangular Lattice

    NASA Astrophysics Data System (ADS)

    Uguz, Selman; Redjepov, Shovkat; Acar, Ecem; Akin, Hasan

    2017-06-01

    Even though the fundamental structure of cellular automata (CA) is a discrete model, the global behavior over many iterations and on large scales can approach a nearly continuous model system. CA theory is a very rich and useful class of dynamical models that focuses on local information being relayed to neighboring cells to produce the global CA behavior. The mathematical basis of the model determines the computable values of the mathematical structure of CA. After modeling the CA structure, an important problem is to be able to move forwards and backwards on CA to understand their behaviors in more elegant ways. One such case is when the CA is reversible. In this paper, we investigate the structure and the reversibility of two-dimensional (2D) finite, linear, triangular von Neumann CA with the null boundary case. They are considered over the ternary field ℤ3 (i.e. 3-state). We obtain their transition rule matrices for each special case. For the given special triangular information (transition) rule matrices, we prove which triangular linear 2D von Neumann CA are reversible and which are not. The reversibility of 2D CA is known to be a generally very challenging problem. In the present study, the reversibility problem of 2D triangular, linear von Neumann CA with null boundary is resolved completely over the ternary field. As far as we know, there is no study of the structure and reversibility of von Neumann 2D linear CA on a triangular lattice in the literature. Since the main CA structures are simple enough to investigate mathematically, yet complex enough to produce chaotic behavior, it is believed that the present construction can be applied to many areas related to these CA using other transition rules.
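
    For linear CA of this kind, reversibility reduces to invertibility of the transition rule matrix over the ground field. A hedged sketch for ℤ3 checks invertibility by Gaussian elimination modulo 3; the example matrix is illustrative, not one of the paper's von Neumann rule matrices.

```python
def invertible_mod_p(matrix, p=3):
    # Gaussian elimination over Z_p: the matrix is invertible iff it has full rank.
    m = [row[:] for row in matrix]
    n = len(m)
    for col in range(n):
        pivot = next((r for r in range(col, n) if m[r][col] % p != 0), None)
        if pivot is None:
            return False
        m[col], m[pivot] = m[pivot], m[col]
        inv = pow(m[col][col] % p, p - 2, p)          # Fermat inverse, p prime
        m[col] = [(x * inv) % p for x in m[col]]
        for r in range(n):
            if r != col and m[r][col] % p:
                factor = m[r][col] % p
                m[r] = [(a - factor * b) % p for a, b in zip(m[r], m[col])]
    return True

if __name__ == "__main__":
    example = [[1, 2, 0],
               [0, 1, 1],
               [2, 0, 1]]                              # illustrative matrix over Z3
    print(invertible_mod_p(example))                   # True: determinant is 2 mod 3
```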

  6. Potential field cellular automata model for pedestrian flow

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Jian, Xiao-Xia; Wong, S. C.; Choi, Keechoo

    2012-02-01

    This paper proposes a cellular automata model of pedestrian flow that defines a cost potential field, which takes into account the costs of travel time and discomfort, for a pedestrian to move to an empty neighboring cell. The formulation is based on a reconstruction of the density distribution and the underlying physics, including the rule for resolving conflicts, which is comparable to that in the floor field cellular automaton model. However, we assume that each pedestrian is familiar with the surroundings, thereby minimizing his or her instantaneous cost. This, in turn, reduces the randomness in selecting a target cell, which improves on existing cellular automata models as well as on the computational efficiency. In the presence of two pedestrian groups, distinguished by their destinations, the cost distribution for each group is magnified due to the strong interaction between the two groups. As a typical phenomenon, the formation of lanes in the counter flow is reproduced.
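
    As a concrete illustration of the update idea described above, the following sketch moves each pedestrian to the empty von Neumann neighbor with the lowest cost potential, resolving conflicts at random; the cost field, neighborhood and conflict rule here are simplifying assumptions, not the paper's exact formulation.

      import random

      # Hedged sketch of a potential-field pedestrian CA step. `occupied` is a set
      # of (x, y) pedestrian cells; `potential` maps a cell to its cost potential.
      def step(occupied, potential, width, height):
          claims = {}
          for (x, y) in occupied:
              neighbors = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
              free = [c for c in neighbors
                      if 0 <= c[0] < width and 0 <= c[1] < height and c not in occupied]
              if not free:
                  continue
              best = min(free, key=lambda c: potential.get(c, float("inf")))
              # only move if the target cell has a strictly lower cost than the current cell
              if potential.get(best, float("inf")) < potential.get((x, y), float("inf")):
                  claims.setdefault(best, []).append((x, y))
          new_occupied = set(occupied)
          for cell, claimants in claims.items():
              winner = random.choice(claimants)  # resolve conflicts at random
              new_occupied.discard(winner)
              new_occupied.add(cell)
          return new_occupied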

  7. Deriving urban dynamic evolution rules from self-adaptive cellular automata with multi-temporal remote sensing images

    NASA Astrophysics Data System (ADS)

    He, Yingqing; Ai, Bin; Yao, Yao; Zhong, Fajun

    2015-06-01

    Cellular automata (CA) have proven to be very effective for simulating and predicting the spatio-temporal evolution of complex geographical phenomena. Traditional methods generally pose problems in determining the structure and parameters of CA for a large, complex region or a long-term simulation. This study presents a self-adaptive CA model integrated with an artificial immune system to discover dynamic transition rules automatically. The model's parameters are allowed to modify themselves as multi-temporal remote sensing images are applied: that is, the CA can adapt itself to a changing and complex environment. Therefore, urban dynamic evolution rules over time can be efficiently retrieved by using this integrated model. The proposed AIS-based CA model was then used to simulate the rural-urban land conversion of Guangzhou city, located in the core of China's Pearl River Delta. The initial urban land was classified directly from a TM satellite image from 1990. Urban land in the years 1995, 2000, 2005, 2009 and 2012 was correspondingly used as the observed data to calibrate the model's parameters. Using the quantitative figure-of-merit (FoM) index and pattern similarity, the AIS-based model was further compared with a logistic CA model. The results indicate that the AIS-based CA model performs better and with higher precision in simulating urban evolution, and that the simulated spatial pattern is closer to the actual development situation.
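
    The figure of merit referred to above is commonly defined, in its binary (urban/non-urban) form, as hits divided by the sum of hits, misses and false alarms, with change measured against the initial map. The sketch below is a minimal illustration of that definition; the variable names and the binary simplification are assumptions, not the paper's exact implementation.

      # Hedged sketch: binary figure of merit (FoM) for comparing simulated and
      # observed land change against an initial map.
      def figure_of_merit(initial, observed, simulated):
          """Each argument is a flat sequence of 0/1 (non-urban/urban) cell values."""
          hits = misses = false_alarms = 0
          for init, obs, sim in zip(initial, observed, simulated):
              observed_change = obs != init
              simulated_change = sim != init
              if observed_change and simulated_change:
                  hits += 1
              elif observed_change:
                  misses += 1
              elif simulated_change:
                  false_alarms += 1
          denom = hits + misses + false_alarms
          return hits / denom if denom else 0.0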

  8. PAM: Particle automata model in simulation of Fusarium graminearum pathogen expansion.

    PubMed

    Wcisło, Rafał; Miller, S Shea; Dzwinel, Witold

    2016-01-21

    The multi-scale nature and inherent complexity of biological systems are a great challenge for computer modeling and classical modeling paradigms. We present a novel particle automata modeling metaphor in the context of developing a 3D model of Fusarium graminearum infection in wheat. The system consisting of the host plant and Fusarium pathogen cells can be represented by an ensemble of discrete particles defined by a set of attributes. The cells-particles can interact with each other mimicking mechanical resistance of the cell walls and cell coalescence. The particles can move, while some of their attributes can be changed according to prescribed rules. The rules can represent cellular scales of a complex system, while the integrated particle automata model (PAM) simulates its overall multi-scale behavior. We show that due to the ability of mimicking mechanical interactions of Fusarium tip cells with the host tissue, the model is able to simulate realistic penetration properties of the colonization process reproducing both vertical and lateral Fusarium invasion scenarios. The comparison of simulation results with micrographs from laboratory experiments shows encouraging qualitative agreement between the two. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Cellular automata and epidemiological models with spatial dependence

    NASA Astrophysics Data System (ADS)

    Fuentes, M. A.; Kuperman, M. N.

    We present a cellular automata model developed to study the evolution of an infectivity nucleus in several conditions and for two kinds of epidemiologically different diseases. We analyse the role of the model parameters, concerning the epidemiological and demographic aspects of the problem, and of the evolution rules in relation to the spread of such infectious diseases, the arising of periodic temporal modulations related to the infectivity and recovery fronts, and the evolution of travelling waves. Among the obtained results we find analogies to endemic situations and pandemics.
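
    As an illustration of the kind of probabilistic epidemic CA described above, the following sketch updates a lattice of susceptible (0), infected (1) and recovered (2) cells; the Moore neighborhood and the infection and recovery probabilities are illustrative assumptions, not the rules of the model in this record.

      import random

      # Hedged sketch of one synchronous step of an SIR-type cellular automaton
      # on an n x n torus.
      def sir_step(grid, p_infect=0.3, p_recover=0.1):
          n = len(grid)
          new = [row[:] for row in grid]
          for i in range(n):
              for j in range(n):
                  if grid[i][j] == 0:  # susceptible: may be infected by neighbors
                      infected_neighbors = sum(
                          grid[(i + di) % n][(j + dj) % n] == 1
                          for di in (-1, 0, 1) for dj in (-1, 0, 1)
                          if (di, dj) != (0, 0))
                      if random.random() < 1 - (1 - p_infect) ** infected_neighbors:
                          new[i][j] = 1
                  elif grid[i][j] == 1 and random.random() < p_recover:
                      new[i][j] = 2  # infected: may recover
          return new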

  10. Self-organized perturbations enhance class IV behavior and 1/f power spectrum in elementary cellular automata.

    PubMed

    Nakajima, Kohei; Haruna, Taichi

    2011-09-01

    In this paper, we propose a new class of cellular automata based on a modification of their state space. It is introduced to model a computation that is exposed to an environment. We formalize the computation as extension and projection processes of the state space and the resulting misidentifications of the state. The motivation is to embed the role of the environment into the system itself, which naturally induces self-organized internal perturbations rather than the usual external perturbations. Implementing this structure in the elementary cellular automata, we characterize its effect by means of input entropy and power spectral analysis. As a result, the cellular automata with this structure show robust class IV behavior and a 1/f power spectrum in a wide range of rule space, comparable to the notion of the edge of chaos. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
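
    The input-entropy measurement mentioned above can be illustrated as follows (in the spirit of Wuensche's input entropy for elementary CA): at each time step, count how often each of the eight neighborhood patterns is looked up across the lattice and take the Shannon entropy of that frequency distribution. The sketch below shows only this measurement; the paper's modified state space is not reproduced.

      import math

      # Hedged sketch: normalized input entropy of one configuration of an
      # elementary CA with periodic boundaries.
      def input_entropy(state):
          n = len(state)
          counts = [0] * 8
          for i in range(n):
              neigh = (state[(i - 1) % n] << 2) | (state[i] << 1) | state[(i + 1) % n]
              counts[neigh] += 1
          entropy = 0.0
          for c in counts:
              if c:
                  p = c / n
                  entropy -= p * math.log2(p)
          return entropy / 3.0  # normalize by log2(8)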

  11. Generalized hydrodynamic transport in lattice-gas automata

    NASA Technical Reports Server (NTRS)

    Luo, Li-Shi; Chen, Hudong; Chen, Shiyi; Doolen, Gary D.; Lee, Yee-Chun

    1991-01-01

    The generalized hydrodynamics of two-dimensional lattice-gas automata is solved analytically in the linearized Boltzmann approximation. The dependence of the transport coefficients (kinematic viscosity, bulk viscosity, and sound speed) upon wave number k is obtained analytically. Anisotropy of these coefficients due to the lattice symmetry is studied for the entire range of wave number, k. Boundary effects due to a finite mean free path (Knudsen layer) are analyzed, and accurate comparisons are made with lattice-gas simulations.

  12. On Matrices, Automata, and Double Counting

    NASA Astrophysics Data System (ADS)

    Beldiceanu, Nicolas; Carlsson, Mats; Flener, Pierre; Pearson, Justin

    Matrix models are ubiquitous for constraint problems. Many such problems have a matrix of variables M, with the same constraint defined by a finite-state automaton A on each row of M and a global cardinality constraint gcc on each column of M. We give two methods for deriving, by double counting, necessary conditions on the cardinality variables of the gcc constraints from the automaton A. The first method yields linear necessary conditions and simple arithmetic constraints. The second method introduces the cardinality automaton, which abstracts the overall behaviour of all the row automata and can be encoded by a set of linear constraints. We evaluate the impact of our methods on a large set of nurse rostering problem instances.

  13. Fire and Heat Spreading Model Based on Cellular Automata Theory

    NASA Astrophysics Data System (ADS)

    Samartsev, A. A.; Rezchikov, A. F.; Kushnikov, V. A.; Ivashchenko, V. A.; Bogomolov, A. S.; Filimonyuk, L. Yu; Dolinina, O. N.; Kushnikov, O. V.; Shulga, T. E.; Tverdokhlebov, V. A.; Fominykh, D. S.

    2018-05-01

    The distinctive feature of the proposed fire and heat spreading model in premises is the reduction of computational complexity achieved by using the theory of cellular automata with probabilistic rules of behavior. The possibilities and prospects of using this model in practice are noted. The proposed model has a simple mechanism for integration with agent-based evacuation models. The joint use of these models could improve floor plans and reduce the time of evacuation from premises during fires.
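
    As an illustration of a cellular automaton with probabilistic rules of the kind the model uses, the sketch below spreads fire from burning cells to intact von Neumann neighbors with a fixed probability and burns cells out with another; the state set and probabilities are illustrative assumptions, not the calibrated values of the proposed model.

      import random

      # Hedged sketch of one step of a probabilistic fire-spreading CA.
      # States: 0 = intact, 1 = burning, 2 = burnt out.
      def fire_step(grid, p_spread=0.4, p_burn_out=0.2):
          rows, cols = len(grid), len(grid[0])
          new = [row[:] for row in grid]
          for i in range(rows):
              for j in range(cols):
                  if grid[i][j] == 0:
                      # an intact cell may ignite from any burning neighbor
                      for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                          ni, nj = i + di, j + dj
                          if 0 <= ni < rows and 0 <= nj < cols and grid[ni][nj] == 1:
                              if random.random() < p_spread:
                                  new[i][j] = 1
                                  break
                  elif grid[i][j] == 1 and random.random() < p_burn_out:
                      new[i][j] = 2
          return new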

  14. Growth and Decay in Life-Like Cellular Automata

    NASA Astrophysics Data System (ADS)

    Eppstein, David

    Since the study of life began, many have asked: is it unique in the universe, or are there other interesting forms of life elsewhere? Before we can answer that question, we should ask others: What makes life special? If we happen across another system with life-like behavior, how would we be able to recognize it? We are speaking, of course, of the mathematical systems of cellular automata, of the fascinating patterns that have been discovered and engineered in Conway's Game of Life, and of the possible existence of other cellular automaton rules with equally complex behavior to that of Life.
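
    For reference, the Game of Life rule mentioned above (birth on exactly three live neighbors, survival on two or three) can be written in a few lines of code; this is the standard B3/S23 rule on a toroidal grid, shown purely for illustration.

      # Sketch of one synchronous step of Conway's Game of Life (B3/S23) on a torus.
      def life_step(grid):
          n, m = len(grid), len(grid[0])
          new = [[0] * m for _ in range(n)]
          for i in range(n):
              for j in range(m):
                  live = sum(grid[(i + di) % n][(j + dj) % m]
                             for di in (-1, 0, 1) for dj in (-1, 0, 1)
                             if (di, dj) != (0, 0))
                  new[i][j] = 1 if (live == 3 or (grid[i][j] == 1 and live == 2)) else 0
          return new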

  15. Iterons, fractals and computations of automata

    NASA Astrophysics Data System (ADS)

    Siwak, Paweł

    1999-03-01

    Processing of strings by some automata, when viewed on space-time (ST) diagrams, reveals characteristic soliton-like coherent periodic objects. They are inherently associated with iterations of automata mappings; thus we call them iterons. In the paper we present two classes of one-dimensional iterons: particles and filtrons. The particles are typical for parallel (cellular) processing, while filtrons, introduced in (32), are specific for serial processing of strings. In general, the images of iterated automata mappings exhibit not only coherent entities but also fractals, and quasi-periodic and chaotic dynamics. We show typical images of such computations: fractals, multiplication by a number, and addition of binary numbers defined by a Turing machine. Then, the particles are presented as iterons generated by cellular automata in three computations: B/U code conversion (13, 29), majority classification (9), and a discrete version of the FPU (Fermi-Pasta-Ulam) dynamics (7, 23). We disclose particles by a technique of combinational recoding of ST diagrams (as opposed to sequential recoding). Subsequently, we recall the recursive filters based on FCA (filter cellular automata) window operators, considered by Park (26), Ablowitz (1), Fokas (11), Fuchssteiner (12), Bruschi (5) and Jiang (20). We present the automata equivalents to these filters (33). Some of them belong to the class of filter automata introduced in (30). We also define and illustrate some properties of filtrons. Contrary to particles, the filtrons interact nonlocally in the sense that distant symbols may influence one another. Thus their interactions are very unusual. Some examples have been given in (32). Here we show new examples of filtron phenomena: multifiltron solitonic collisions, attracting and repelling filtrons, trapped bouncing filtrons (which behave like a resonance cavity) and quasi filtrons.

  16. Runtime Analysis of Linear Temporal Logic Specifications

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Havelund, Klaus

    2001-01-01

    This report presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Büchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
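
    To illustrate the observer idea (not the JPaX translation algorithm itself), the sketch below is a hand-built finite-trace monitor for the single formula G(request -> F grant): every request must be followed by a grant before the trace ends. The proposition names and the finite-trace semantics are assumptions made for this example.

      # Hedged sketch: a two-state observer for G(request -> F grant) on finite traces.
      def monitor(trace):
          """trace: iterable of sets of atomic propositions, e.g. [{'request'}, set(), {'grant'}]."""
          pending = False  # waiting for a grant?
          for props in trace:
              if 'grant' in props:
                  pending = False
              if 'request' in props and 'grant' not in props:
                  pending = True
          return not pending  # holds on the finite trace iff no request is left unanswered

      assert monitor([{'request'}, set(), {'grant'}])
      assert not monitor([{'request'}, set()])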

  17. Algebraic Systems and Pushdown Automata

    NASA Astrophysics Data System (ADS)

    Petre, Ion; Salomaa, Arto

    We concentrate in this chapter on the core aspects of algebraic series, pushdown automata, and their relation to formal languages. We choose to follow here a presentation of their theory based on the concept of properness. We introduce in Sect. 2 some auxiliary notions and results needed throughout the chapter, in particular the notions of discrete convergence in semirings and C-cycle free infinite matrices. In Sect. 3 we introduce the algebraic power series in terms of algebraic systems of equations. We focus on interconnections with context-free grammars and on normal forms. We then conclude the section with a presentation of the theorems of Shamir and Chomsky-Schützenberger. We discuss in Sect. 4 the algebraic and the regulated rational transductions, as well as some representation results related to them. Section 5 is dedicated to pushdown automata and focuses on the interconnections with classical (non-weighted) pushdown automata and on the interconnections with algebraic systems. We then conclude the chapter with a brief discussion of some of the other topics related to algebraic systems and pushdown automata.

  18. Data Automata in Scala

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2014-01-01

    The field of runtime verification has during the last decade seen a multitude of systems for monitoring event sequences (traces) emitted by a running system. The objective is to ensure correctness of a system by checking its execution traces against formal specifications representing requirements. A special challenge is data parameterized events, where monitors have to keep track of the combination of control states as well as data constraints, relating events and the data they carry across time points. This poses a challenge wrt. efficiency of monitors, as well as expressiveness of logics. Data automata is a form of automata where states are parameterized with data, supporting monitoring of data parameterized events. We describe the full details of a very simple API in the Scala programming language, an internal DSL (Domain-Specific Language), implementing data automata. The small implementation suggests a design pattern. Data automata allow transition conditions to refer to other states than the source state, and allow target states of transitions to be inlined, offering a temporal logic flavored notation. An embedding of a logic in a high-level language like Scala in addition allows monitors to be programmed using all of Scala's language constructs, offering the full flexibility of a programming language. The framework is demonstrated on an XML processing scenario previously addressed in related work.

  19. Double Linear Damage Rule for Fatigue Analysis

    NASA Technical Reports Server (NTRS)

    Halford, G.; Manson, S.

    1985-01-01

    The Double Linear Damage Rule (DLDR) is a method for use by structural designers to determine fatigue-crack-initiation life when a structure is subjected to unsteady, variable-amplitude cyclic loadings. The method calculates, in advance of service, how many loading cycles can be imposed on a structural component before a macroscopic crack initiates. The approach may eventually be used in the design of high-performance systems and be incorporated into design handbooks and codes.

  20. Phase transitions in coupled map lattices and in associated probabilistic cellular automata.

    PubMed

    Just, Wolfram

    2006-10-01

    Analytical tools are applied to investigate piecewise linear coupled map lattices in terms of probabilistic cellular automata. The so-called disorder condition of probabilistic cellular automata is closely related with attracting sets in coupled map lattices. The importance of this condition for the suppression of phase transitions is illustrated by spatially one-dimensional systems. Invariant densities and temporal correlations are calculated explicitly. Ising type phase transitions are found for one-dimensional coupled map lattices acting on repelling sets and for a spatially two-dimensional Miller-Huse-like system with stable long time dynamics. Critical exponents are calculated within a finite size scaling approach. The relevance of detailed balance of the resulting probabilistic cellular automaton for the critical behavior is pointed out.

  1. Predictability in Cellular Automata

    PubMed Central

    Agapie, Alexandru; Andreica, Anca; Chira, Camelia; Giuclea, Marius

    2014-01-01

    Modelled as finite homogeneous Markov chains, probabilistic cellular automata with local transition probabilities in (0, 1) always possess a stationary distribution. This result alone is not very helpful when it comes to predicting the final configuration; one also needs a formula connecting the probabilities in the stationary distribution to some intrinsic feature of the lattice configuration. Previous results on asynchronous cellular automata have shown that such a feature really exists: it is the number of zero-one borders within the automaton's binary configuration. An exponential formula in the number of zero-one borders has been proved for the 1-D, 2-D and 3-D asynchronous automata with neighborhood three, five and seven, respectively. We perform computer experiments on a synchronous cellular automaton to check whether the empirical distribution also obeys that theoretical formula. The numerical results indicate a perfect fit for neighbourhoods three and five, which opens the way for a rigorous proof of the formula in this new, synchronous case. PMID:25271778

  2. Predictability in cellular automata.

    PubMed

    Agapie, Alexandru; Andreica, Anca; Chira, Camelia; Giuclea, Marius

    2014-01-01

    Modelled as finite homogeneous Markov chains, probabilistic cellular automata with local transition probabilities in (0, 1) always possess a stationary distribution. This result alone is not very helpful when it comes to predicting the final configuration; one also needs a formula connecting the probabilities in the stationary distribution to some intrinsic feature of the lattice configuration. Previous results on asynchronous cellular automata have shown that such a feature really exists: it is the number of zero-one borders within the automaton's binary configuration. An exponential formula in the number of zero-one borders has been proved for the 1-D, 2-D and 3-D asynchronous automata with neighborhood three, five and seven, respectively. We perform computer experiments on a synchronous cellular automaton to check whether the empirical distribution also obeys that theoretical formula. The numerical results indicate a perfect fit for neighbourhoods three and five, which opens the way for a rigorous proof of the formula in this new, synchronous case.

  3. A 3-D model of tumor progression based on complex automata driven by particle dynamics.

    PubMed

    Wcisło, Rafał; Dzwinel, Witold; Yuen, David A; Dudek, Arkadiusz Z

    2009-12-01

    The dynamics of a growing tumor involving mechanical remodeling of healthy tissue and vasculature is neglected in most of the existing tumor models. This is due to the lack of an efficient computational framework allowing for the simulation of mechanical interactions. Yet it is precisely these interactions that trigger critical changes in tumor growth dynamics and are responsible for its volumetric and directional progression. We describe here a novel 3-D model of tumor growth, which combines particle dynamics with the cellular automata concept. The particles represent both tissue cells and fragments of the vascular network. They interact with their closest neighbors via semi-harmonic central forces simulating mechanical resistance of the cell walls. The particle dynamics is governed by both the Newtonian laws of motion and the cellular automata rules. These rules can represent cell life-cycle and other biological interactions involving smaller spatio-temporal scales. We show that our complex automata, particle-based model can reproduce realistic 3-D dynamics of the entire system consisting of the tumor, normal tissue cells, blood vessels and blood flow. It can explain phenomena such as the inward cell motion in avascular tumors, stabilization of tumor growth by external pressure, tumor vascularization due to the process of angiogenesis, trapping of healthy cells by the invading tumor, and the influence of external (boundary) conditions on the direction of tumor progression. We conclude that the particle model can serve as a general framework for designing advanced multiscale models of tumor dynamics and is very competitive with previously presented modeling approaches.

  4. A novel chaotic based image encryption using a hybrid model of deoxyribonucleic acid and cellular automata

    NASA Astrophysics Data System (ADS)

    Enayatifar, Rasul; Sadaei, Hossein Javedani; Abdullah, Abdul Hanan; Lee, Malrey; Isnin, Ismail Fauzi

    2015-08-01

    Many studies have been conducted on improving the security of digital images in order to protect such data while they are sent over the internet. This work proposes a new approach based on a hybrid model of the Tinkerbell chaotic map, deoxyribonucleic acid (DNA) and cellular automata (CA). DNA rules, a DNA-sequence XOR operator and CA rules are used simultaneously to encrypt the plain-image pixels. To determine the rule numbers for the DNA sequence and the CA, a 2-dimensional Tinkerbell chaotic map is employed. Experimental results and computer simulations both confirm that the proposed scheme not only demonstrates outstanding encryption but also resists various typical attacks.

  5. Wavefront cellular learning automata.

    PubMed

    Moradabadi, Behnaz; Meybodi, Mohammad Reza

    2018-02-01

    This paper proposes a new cellular learning automaton, called a wavefront cellular learning automaton (WCLA). The proposed WCLA has a set of learning automata mapped to a connected structure and uses this structure to propagate the state changes of the learning automata over the structure using waves. In the WCLA, after one learning automaton chooses its action, if this chosen action is different from the previous action, it can send a wave to its neighbors and activate them. Each neighbor receiving the wave is activated and must choose a new action. This structure for the WCLA is necessary in many dynamic areas such as social networks, computer networks, grid computing, and web mining. In this paper, we introduce the WCLA framework as an optimization tool with diffusion capability, study its behavior over time using ordinary differential equation solutions, and present its accuracy using expediency analysis. To show the superiority of the proposed WCLA, we compare the proposed method with some other types of cellular learning automata using two benchmark problems.

  6. Wavefront cellular learning automata

    NASA Astrophysics Data System (ADS)

    Moradabadi, Behnaz; Meybodi, Mohammad Reza

    2018-02-01

    This paper proposes a new cellular learning automaton, called a wavefront cellular learning automaton (WCLA). The proposed WCLA has a set of learning automata mapped to a connected structure and uses this structure to propagate the state changes of the learning automata over the structure using waves. In the WCLA, after one learning automaton chooses its action, if this chosen action is different from the previous action, it can send a wave to its neighbors and activate them. Each neighbor receiving the wave is activated and must choose a new action. This structure for the WCLA is necessary in many dynamic areas such as social networks, computer networks, grid computing, and web mining. In this paper, we introduce the WCLA framework as an optimization tool with diffusion capability, study its behavior over time using ordinary differential equation solutions, and present its accuracy using expediency analysis. To show the superiority of the proposed WCLA, we compare the proposed method with some other types of cellular learning automata using two benchmark problems.

  7. Observability of Automata Networks: Fixed and Switching Cases.

    PubMed

    Li, Rui; Hong, Yiguang; Wang, Xingyuan

    2018-04-01

    Automata networks are a class of fully discrete dynamical systems, which have received considerable interest in various different areas. This brief addresses the observability of automata networks and switched automata networks in a unified framework, and proposes simple necessary and sufficient conditions for observability. The results are achieved by employing methods from symbolic computation, and are suited for implementation using computer algebra systems. Several examples are presented to demonstrate the application of the results.

  8. Excellent approach to modeling urban expansion by fuzzy cellular automata: agent base model

    NASA Astrophysics Data System (ADS)

    Khajavigodellou, Yousef; Alesheikh, Ali A.; Mohammed, Abdulrazak A. S.; Chapi, Kamran

    2014-09-01

    The interaction between humans and their environment is one of the important challenges in the world. Land use/cover change (LUCC) is a complex process that includes actors and factors at different social and spatial levels. The complexity and dynamics of urban systems make the practice of urban modeling very difficult. With increased computational power and the greater availability of spatial data, micro-simulation methods such as agent-based and cellular automata simulation have been developed by geographers, planners, and scholars, and have shown great potential for representing and simulating the complexity of the dynamic processes involved in urban growth and land use change. This paper applies fuzzy cellular automata in a geospatial information system with remote sensing data to simulate and predict urban expansion patterns. These FCA-based dynamic spatial urban models provide an improved ability to forecast and assess future urban growth and to create planning scenarios, allowing us to explore the potential impacts of simulations that correspond to urban planning and management policies. A fuzzy-inference-guided cellular automata approach is adopted: semantic or linguistic knowledge about land use change is expressed as fuzzy rules, based on which fuzzy inference is applied to determine the urban development potential of each pixel. The model integrates an ABM (agent-based model) and FCA (Fuzzy Cellular Automata) to investigate a complex decision-making process and future urban dynamics. Based on this model, rapid development and green-land protection under the influence of the behaviors and decision modes of regional authority agents, real-estate developer agents, resident agents and non-resident agents, and their interactions, have been applied to predict the future development patterns of the Erbil metropolitan region.

  9. The Neurona at Home project: Simulating a large-scale cellular automata brain in a distributed computing environment

    NASA Astrophysics Data System (ADS)

    Acedo, L.; Villanueva-Oller, J.; Moraño, J. A.; Villanueva, R.-J.

    2013-01-01

    The Berkeley Open Infrastructure for Network Computing (BOINC) has become the standard open-source solution for grid computing on the Internet. Volunteers use their computers to complete a small part of the task assigned by a dedicated server. We have developed a BOINC project called Neurona@Home whose objective is to simulate a cellular automata random network with at least one million neurons. We consider a cellular automata version of the integrate-and-fire model in which excitatory and inhibitory nodes can activate or deactivate neighbor nodes according to a set of probabilistic rules. Our aim is to determine the phase diagram of the model and its behaviour, and to compare it with the electroencephalographic signals measured in real brains.

  10. An authenticated image encryption scheme based on chaotic maps and memory cellular automata

    NASA Astrophysics Data System (ADS)

    Bakhshandeh, Atieh; Eslami, Ziba

    2013-06-01

    This paper introduces a new image encryption scheme based on chaotic maps, cellular automata and permutation-diffusion architecture. In the permutation phase, a piecewise linear chaotic map is utilized to confuse the plain-image and in the diffusion phase, we employ the Logistic map as well as a reversible memory cellular automata to obtain an efficient and secure cryptosystem. The proposed method admits advantages such as highly secure diffusion mechanism, computational efficiency and ease of implementation. A novel property of the proposed scheme is its authentication ability which can detect whether the image is tampered during the transmission or not. This is particularly important in applications where image data or part of it contains highly sensitive information. Results of various analyses manifest high security of this new method and its capability for practical image encryption.

  11. Addressing population heterogeneity and distribution in epidemics models using a cellular automata approach

    PubMed Central

    2014-01-01

    Background The spread of an infectious disease is determined by biological and social factors. Models based on cellular automata are adequate to describe such natural systems consisting of a massive collection of simple interacting objects. They characterize the time evolution of the global system as the emergent behaviour resulting from the interaction of the objects, whose behaviour is defined through a set of simple rules that encode the individual behaviour and the transmission dynamics. Methods An epidemic is characterized through an individual-based model built upon cellular automata. In the proposed model, each individual of the population is represented by a cell of the automaton. This way of modeling an epidemic situation makes it possible to define the characteristics of each individual, establish different scenarios and implement control strategies. Results A cellular automata model to study the time evolution of heterogeneous populations through the various stages of disease was proposed, allowing the inclusion of individual heterogeneity, geographical characteristics and social factors that determine the dynamics of the disease. Different assumptions made to build the classical model were evaluated, leading to the following results: i) for a low contact rate (as in a quarantine process or low-density population areas) the number of infective individuals is lower than in other areas where the contact rate is higher, and ii) different initial spatial distributions of infected individuals yield different epidemic dynamics, due to their influence on the transition rate and the reproductive ratio of the disease. Conclusions The contact rate and spatial distributions have a central role in the spread of a disease. For low-density populations the spread is very low and the number of infected individuals is lower than in highly populated areas. The spatial distribution of the population and the disease focus as well as the geographical characteristic of the area

  12. Addressing population heterogeneity and distribution in epidemics models using a cellular automata approach.

    PubMed

    López, Leonardo; Burguerner, Germán; Giovanini, Leonardo

    2014-04-12

    The spread of an infectious disease is determined by biological and social factors. Models based on cellular automata are adequate to describe such natural systems consisting of a massive collection of simple interacting objects. They characterize the time evolution of the global system as the emergent behaviour resulting from the interaction of the objects, whose behaviour is defined through a set of simple rules that encode the individual behaviour and the transmission dynamics. An epidemic is characterized through an individual-based model built upon cellular automata. In the proposed model, each individual of the population is represented by a cell of the automaton. This way of modeling an epidemic situation makes it possible to define the characteristics of each individual, establish different scenarios and implement control strategies. A cellular automata model to study the time evolution of heterogeneous populations through the various stages of disease was proposed, allowing the inclusion of individual heterogeneity, geographical characteristics and social factors that determine the dynamics of the disease. Different assumptions made to build the classical model were evaluated, leading to the following results: i) for a low contact rate (as in a quarantine process or low-density population areas) the number of infective individuals is lower than in other areas where the contact rate is higher, and ii) different initial spatial distributions of infected individuals yield different epidemic dynamics, due to their influence on the transition rate and the reproductive ratio of the disease. The contact rate and the spatial distributions have a central role in the spread of a disease. For low-density populations the spread is very low and the number of infected individuals is lower than in highly populated areas. The spatial distribution of the population and the disease focus as well as the geographical characteristic of the area play a central role in the dynamics of the

  13. A new simulation system of traffic flow based on cellular automata principle

    NASA Astrophysics Data System (ADS)

    Shan, Junru

    2017-05-01

    Traffic flow is a complex system with many behaviors, so it is difficult to express it with a specific mathematical equation. With the rapid development of computer technology, simulating the interaction mechanisms between vehicles in order to reproduce complex traffic behavior has become an important method for studying traffic. Built on a preset collection of operating rules, a cellular automaton is a dynamical system that is discrete in both time and space. It can simulate the real traffic process well and provides a good way to address traffic problems.

  14. Maximizing the Adjacent Possible in Automata Chemistries.

    PubMed

    Hickinbotham, Simon; Clark, Edward; Nellis, Adam; Stepney, Susan; Clarke, Tim; Young, Peter

    2016-01-01

    Automata chemistries are good vehicles for experimentation in open-ended evolution, but they are by necessity complex systems whose low-level properties require careful design. To aid the process of designing automata chemistries, we develop an abstract model that classifies the features of a chemistry from a physical (bottom up) perspective and from a biological (top down) perspective. There are two levels: things that can evolve, and things that cannot. We equate the evolving level with biology and the non-evolving level with physics. We design our initial organisms in the biology, so they can evolve. We design the physics to facilitate evolvable biologies. This architecture leads to a set of design principles that should be observed when creating an instantiation of the architecture. These principles are Everything Evolves, Everything's Soft, and Everything Dies. To evaluate these ideas, we present experiments in the recently developed Stringmol automata chemistry. We examine the properties of Stringmol with respect to the principles, and so demonstrate the usefulness of the principles in designing automata chemistries.

  15. Reversible elementary cellular automaton with rule number 150 and periodic boundary conditions over 𝔽p

    NASA Astrophysics Data System (ADS)

    Martín Del Rey, A.; Rodríguez Sánchez, G.

    2015-03-01

    We study the reversibility of elementary cellular automata with rule number 150 over the finite state set 𝔽p, endowed with periodic boundary conditions. The dynamics of such discrete dynamical systems are characterized by means of characteristic circulant matrices; their analysis shows that reversibility depends on the number of cells of the cellular space, and allows us to explicitly compute the corresponding inverse cellular automata.
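
    The linear-algebra criterion behind this kind of result can be illustrated directly: under periodic boundaries, rule 150 acts on (F_p)^n as a circulant matrix whose rows carry 1s at positions i-1, i, i+1 (mod n), and the CA is reversible exactly when that matrix is invertible over F_p. The sketch below tests invertibility by Gaussian elimination modulo p; it illustrates the criterion only, not the paper's closed-form characterization of the reversible cell counts.

      # Hedged sketch: reversibility of rule 150 with periodic boundaries over F_p
      # via invertibility of its circulant transition matrix.
      def rule150_matrix(n):
          return [[1 if (j - i) % n in (n - 1, 0, 1) else 0 for j in range(n)]
                  for i in range(n)]

      def invertible_mod_p(matrix, p):
          m = [row[:] for row in matrix]
          n = len(m)
          for col in range(n):
              pivot = next((r for r in range(col, n) if m[r][col] % p), None)
              if pivot is None:
                  return False
              m[col], m[pivot] = m[pivot], m[col]
              inv = pow(m[col][col] % p, -1, p)  # modular inverse (Python 3.8+)
              m[col] = [(x * inv) % p for x in m[col]]
              for r in range(n):
                  if r != col and m[r][col] % p:
                      factor = m[r][col] % p
                      m[r] = [(a - factor * b) % p for a, b in zip(m[r], m[col])]
          return True

      if __name__ == "__main__":
          p = 3
          print([n for n in range(3, 13) if invertible_mod_p(rule150_matrix(n), p)])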

  16. a Predator-Prey Model Based on the Fully Parallel Cellular Automata

    NASA Astrophysics Data System (ADS)

    He, Mingfeng; Ruan, Hongbo; Yu, Changliang

    We present a predator-prey lattice model containing moveable wolves and sheep, which are characterized by Penna double bit strings. Sexual reproduction and child-care strategies are considered. To implement this model in an efficient way, we build a fully parallel cellular automaton based on a new definition of the neighborhood. We show the roles played by the initial densities of the populations, the mutation rate and the linear size of the lattice in the evolution of this model.

  17. Fuzzy tree automata and syntactic pattern recognition.

    PubMed

    Lee, E T

    1982-04-01

    An approach of representing patterns by trees and processing these trees by fuzzy tree automata is described. Fuzzy tree automata are defined and investigated. The results include that the class of fuzzy root-to-frontier recognizable ¿-trees is closed under intersection, union, and complementation. Thus, the class of fuzzy root-to-frontier recognizable ¿-trees forms a Boolean algebra. Fuzzy tree automata are applied to processing fuzzy tree representation of patterns based on syntactic pattern recognition. The grade of acceptance is defined and investigated. Quantitative measures of ``approximate isosceles triangle,'' ``approximate elongated isosceles triangle,'' ``approximate rectangle,'' and ``approximate cross'' are defined and used in the illustrative examples of this approach. By using these quantitative measures, a house, a house with high roof, and a church are also presented as illustrative examples. In addition, three fuzzy tree automata are constructed which have the capability of processing the fuzzy tree representations of ``fuzzy houses,'' ``houses with high roofs,'' and ``fuzzy churches,'' respectively. The results may have useful applications in pattern recognition, image processing, artificial intelligence, pattern database design and processing, image science, and pictorial information systems.

  18. Cellular automata models for diffusion of information and highway traffic flow

    NASA Astrophysics Data System (ADS)

    Fuks, Henryk

    In the first part of this work we study a family of deterministic models for highway traffic flow which generalize cellular automaton rule 184. This family is parameterized by the speed limit m and another parameter k that represents the degree of 'anticipatory driving'. We compare two driving strategies with identical maximum throughput: 'conservative' driving with a high speed limit and 'anticipatory' driving with a low speed limit. The two strategies are evaluated in terms of accident probability. We also discuss fundamental diagrams of generalized traffic rules and examine limitations of the maximum achievable throughput. Possible modifications of the model are considered. For rule 184, we present exact calculations of the order parameter in the transition from the moving phase to the jammed phase using the method of preimage counting, and use this result to construct a solution to the density classification problem. In the second part we propose a probabilistic cellular automaton model for the spread of innovations, rumors, news, etc., in a social system. We start from simple deterministic models, for which exact expressions for the density of adopters are derived. For a more realistic model, based on probabilistic cellular automata, we study the influence of the range of interaction R on the shape of the adoption curve. When the probability of adoption is proportional to the local density of adopters, and individuals can drop the innovation with some probability p, the system exhibits a second-order phase transition. The critical line separating the region of parameter space in which the asymptotic density of adopters is positive from the region where it is equal to zero converges toward the mean-field line as the range of interaction increases. In the region between the R=1 critical line and the mean-field line, the asymptotic density of adopters depends on R, becoming zero if R is too small (smaller than some critical value). This result demonstrates the importance of connectivity in
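
    The base rule of this family, cellular automaton rule 184, is easy to state and to code: a car moves one cell to the right if and only if the cell ahead is empty. The sketch below implements plain rule 184 on a ring for illustration; the generalized (m, k) rules of the thesis are not reproduced.

      # Sketch of one synchronous step of CA rule 184 on a ring: 1 = car, 0 = empty.
      def rule184_step(road):
          n = len(road)
          new = road[:]
          for i in range(n):
              if road[i] == 1 and road[(i + 1) % n] == 0:
                  new[i] = 0               # the car leaves its cell...
                  new[(i + 1) % n] = 1     # ...and occupies the empty cell ahead
          return new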

  19. Mitochondrial fusion through membrane automata.

    PubMed

    Giannakis, Konstantinos; Andronikos, Theodore

    2015-01-01

    Studies have shown that malfunctions in mitochondrial processes can be blamed for diseases. However, the mechanism behind these operations is not yet sufficiently clear. In this work we present a novel approach to describing a biomolecular model for mitochondrial fusion using notions from membrane computing. We use a case study defined in the BioAmbient calculus and we show how to translate it in terms of a P automata variant. We combine brane calculi with (mem)brane automata to produce a new scheme capable of describing simple, realistic models. We propose the further use of similar methods and the testing of other biomolecular models with the same behaviour.

  20. A study for bank effect on ship traffic in narrow water channels using cellular automata

    NASA Astrophysics Data System (ADS)

    Sun, Zhuo; Cong, Shuang; Pan, Junnan; Zheng, Jianfeng

    2017-12-01

    In narrow water channels, the bank might affect nearby ships through hydrodynamic forces (the bank effect). To avoid accidents, different sailing rules (i.e., lane-changing, speed control) are required. In this paper, a two-lane cellular automata model is proposed to evaluate this phenomenon. Numerical experiments show that ships form a “slow-moving chunk” in the bank area, which significantly blocks the flux. Further study demonstrates that, to alleviate the bank effect, ship speed and bank length should be controlled.

  1. Emergent dynamic structures and statistical law in spherical lattice gas automata

    NASA Astrophysics Data System (ADS)

    Yao, Zhenwei

    2017-12-01

    Various lattice gas automata have been proposed in the past decades to simulate physics and address a host of problems on collective dynamics arising in diverse fields. In this work, we employ the lattice gas model defined on the sphere to investigate the curvature-driven dynamic structures and analyze the statistical behaviors in equilibrium. Under the simple propagation and collision rules, we show that the uniform collective movement of the particles on the sphere is geometrically frustrated, leading to several nonequilibrium dynamic structures not found in the planar lattice, such as the emergent bubble and vortex structures. With the accumulation of the collision effect, the system ultimately reaches equilibrium in the sense that the distribution of the coarse-grained speed approaches the two-dimensional Maxwell-Boltzmann distribution despite the population fluctuations in the coarse-grained cells. The emergent regularity in the statistical behavior of the system is rationalized by mapping our system to a generalized random walk model. This work demonstrates the capability of the spherical lattice gas automaton in revealing the lattice-guided dynamic structures and simulating the equilibrium physics. It suggests the promising possibility of using lattice gas automata defined on various curved surfaces to explore geometrically driven nonequilibrium physics.

  2. Emergent dynamic structures and statistical law in spherical lattice gas automata.

    PubMed

    Yao, Zhenwei

    2017-12-01

    Various lattice gas automata have been proposed in the past decades to simulate physics and address a host of problems on collective dynamics arising in diverse fields. In this work, we employ the lattice gas model defined on the sphere to investigate the curvature-driven dynamic structures and analyze the statistical behaviors in equilibrium. Under the simple propagation and collision rules, we show that the uniform collective movement of the particles on the sphere is geometrically frustrated, leading to several nonequilibrium dynamic structures not found in the planar lattice, such as the emergent bubble and vortex structures. With the accumulation of the collision effect, the system ultimately reaches equilibrium in the sense that the distribution of the coarse-grained speed approaches the two-dimensional Maxwell-Boltzmann distribution despite the population fluctuations in the coarse-grained cells. The emergent regularity in the statistical behavior of the system is rationalized by mapping our system to a generalized random walk model. This work demonstrates the capability of the spherical lattice gas automaton in revealing the lattice-guided dynamic structures and simulating the equilibrium physics. It suggests the promising possibility of using lattice gas automata defined on various curved surfaces to explore geometrically driven nonequilibrium physics.

  3. Self-organisation in Cellular Automata with Coalescent Particles: Qualitative and Quantitative Approaches

    NASA Astrophysics Data System (ADS)

    Hellouin de Menibus, Benjamin; Sablik, Mathieu

    2017-06-01

    This article introduces new tools to study self-organisation in a family of simple cellular automata which contain some particle-like objects with good collision properties (coalescence) in their time evolution. We draw an initial configuration at random according to some initial shift-ergodic measure, and use the limit measure to describe the asymptotic behaviour of the automata. We first take a qualitative approach, i.e. we obtain information on the limit measure(s). We prove that only particles moving in one particular direction can persist asymptotically. This provides some previously unknown information on the limit measures of various deterministic and probabilistic cellular automata: 3 and 4-cyclic cellular automata [introduced by Fisch (J Theor Probab 3(2):311-338, 1990; Phys D 45(1-3):19-25, 1990)], one-sided captive cellular automata [introduced by Theyssier (Captive Cellular Automata, 2004)], the majority-traffic cellular automaton, a self-stabilisation process towards a discrete line [introduced by Regnault and Rémila (in: Mathematical Foundations of Computer Science 2015—40th International Symposium, MFCS 2015, Milan, Italy, Proceedings, Part I, 2015)]. We then restrict our study to a subclass, the glider cellular automata. For this class we show quantitative results, consisting of the asymptotic law of some parameters: the entry times [generalising Kůrka et al. (in: Proceedings of AUTOMATA, 2011)], the density of particles and the rate of convergence to the limit measure.

  4. Cellular automata and integrodifferential equation models for cell renewal in mosaic tissues

    PubMed Central

    Bloomfield, J. M.; Sherratt, J. A.; Painter, K. J.; Landini, G.

    2010-01-01

    Mosaic tissues are composed of two or more genetically distinct cell types. They occur naturally, and are also a useful experimental method for exploring tissue growth and maintenance. By marking the different cell types, one can study the patterns formed by proliferation, renewal and migration. Here, we present mathematical modelling suggesting that small changes in the type of interaction that cells have with their local cellular environment can lead to very different outcomes for the composition of mosaics. In cell renewal, proliferation of each cell type may depend linearly or nonlinearly on the local proportion of cells of that type, and these two possibilities produce very different patterns. We study two variations of a cellular automaton model based on simple rules for renewal. We then propose an integrodifferential equation model, and again consider two different forms of cellular interaction. The results of the continuous and cellular automata models are qualitatively the same, and we observe that changes in local environment interaction affect the dynamics for both. Furthermore, we demonstrate that the models reproduce some of the patterns seen in actual mosaic tissues. In particular, our results suggest that the differing patterns seen in organ parenchymas may be driven purely by the process of cell replacement under different interaction scenarios. PMID:20375040

  5. Asynchronous adaptive time step in quantitative cellular automata modeling

    PubMed Central

    Zhu, Hao; Pang, Peter YH; Sun, Yan; Dhar, Pawan

    2004-01-01

    Background The behaviors of cells in metazoans are context dependent; thus, large-scale multi-cellular modeling is often necessary, for which cellular automata are natural candidates. Two related issues are involved in cellular automata based multi-cellular modeling: how to introduce differential equation based quantitative computing to precisely describe cellular activity, and, building on that, how to address the heavy time consumption in simulation. Results Based on a modified, language-based cellular automata system that we extended to allow ordinary differential equations in models, we introduce a method implementing asynchronous adaptive time steps in simulation that can considerably improve efficiency without a significant sacrifice of accuracy. An average speedup rate of 4–5 is achieved in the given example. Conclusions Strategies for reducing time consumption in simulation are indispensable for large-scale, quantitative multi-cellular models, because even a small 100 × 100 × 100 tissue slab contains one million cells. A distributed, adaptive time step is a practical solution in a cellular automata environment. PMID:15222901

  6. An Asynchronous Cellular Automata-Based Adaptive Illumination Facility

    NASA Astrophysics Data System (ADS)

    Bandini, Stefania; Bonomi, Andrea; Vizzari, Giuseppe; Acconci, Vito

    The term Ambient Intelligence refers to electronic environments that are sensitive and responsive to the presence of people; in the described scenario the environment itself is endowed with a set of sensors (to perceive humans or other physical entities such as dogs, bicycles, etc.), interacting with a set of actuators (lights) that choose their actions (i.e. state of illumination) in an attempt to improve the overall experience of these users. The model for the interaction and action of sensors and actuators is an asynchronous cellular automaton (CA) with memory, supporting a self-organization of the system as a response to the presence and movements of people inside it. The paper will introduce the model, as well as an ad hoc user interface for the specification of the relevant parameters of the CA transition rule that determines the overall system behaviour.

  7. Local Structure Theory for Cellular Automata.

    NASA Astrophysics Data System (ADS)

    Gutowitz, Howard Andrew

    The local structure theory (LST) is a generalization of the mean field theory for cellular automata (CA). The mean field theory makes the assumption that iterative application of the rule does not introduce correlations between the states of cells in different positions. This assumption allows the derivation of a simple formula for the limit density of each possible state of a cell. The most striking feature of CA is that they may well generate correlations between the states of cells as they evolve. The LST takes the generation of correlation explicitly into account. It thus has the potential to describe statistical characteristics in detail. The basic assumption of the LST is that though correlation may be generated by CA evolution, this correlation decays with distance. This assumption allows the derivation of formulas for the estimation of the probability of large blocks of states in terms of smaller blocks of states. Given the probabilities of blocks of size n, probabilities may be assigned to blocks of arbitrary size such that these probability assignments satisfy the Kolmogorov consistency conditions and hence may be used to define a measure on the set of all possible (infinite) configurations. Measures defined in this way are called finite (or n-) block measures. A function called the scramble operator of order n maps a measure to an approximating n-block measure. The action of a CA on configurations induces an action on measures on the set of all configurations. The scramble operator is combined with the CA map on measure to form the local structure operator (LSO). The LSO of order n maps the set of n-block measures into itself. It is hypothesised that the LSO applied to n-block measures approximates the rule itself on general measures, and does so increasingly well as n increases. The fundamental advantage of the LSO is that its action is explicitly computable from a finite system of rational recursion equations. Empirical study of a number of CA rules
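
    The mean field approximation that the local structure theory generalizes can be illustrated concretely: assuming no correlations, the density p of 1s evolves by summing, over the eight neighborhoods of an elementary CA, the rule output weighted by the probability of that neighborhood under an independent Bernoulli(p) distribution. The sketch below iterates this map toward a fixed point; it is the order-1 (mean field) case only, not the higher-order block probabilities of the LST.

      # Hedged sketch: mean-field density map for an elementary CA rule and its
      # iterated fixed point (order-1 approximation of the limit density of 1s).
      def mean_field_map(rule_number, p):
          outputs = [(rule_number >> i) & 1 for i in range(8)]
          new_p = 0.0
          for neigh in range(8):
              ones = bin(neigh).count("1")
              new_p += outputs[neigh] * (p ** ones) * ((1 - p) ** (3 - ones))
          return new_p

      def mean_field_limit(rule_number, p0=0.5, iterations=200):
          p = p0
          for _ in range(iterations):
              p = mean_field_map(rule_number, p)
          return p

      if __name__ == "__main__":
          for rule in (22, 90, 150):
              print(rule, round(mean_field_limit(rule), 4))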

  8. Cellular Automata Simulation for Wealth Distribution

    NASA Astrophysics Data System (ADS)

    Lo, Shih-Ching

    2009-08-01

    The wealth distribution of a country is a complicated system. A model based on Epstein & Axtell's "Sugarscape" model is presented in Netlogo. The model considers income, age, working opportunity and salary as control variables. There are still other variables that should be considered when an artificial society is established. In this study, a more complicated cellular automata model for wealth distribution is proposed. The effects of social welfare, tax, economic investment and inheritance are considered and simulated. According to the cellular automata simulation of wealth distribution, we can gain a deeper insight into the financial policy of the government.

  9. Viewing hybrid systems as products of control systems and automata

    NASA Technical Reports Server (NTRS)

    Grossman, R. L.; Larson, R. G.

    1992-01-01

    The purpose of this note is to show how hybrid systems may be modeled as products of nonlinear control systems and finite state automata. By a hybrid system, we mean a network consisting of continuous, nonlinear control systems connected to discrete, finite state automata. Our point of view is that the automaton switches between the control systems, and that this switching is a function of the discrete input symbols or letters that it receives. We show how a nonlinear control system may be viewed as a pair consisting of a bialgebra of operators coding the dynamics, and an algebra of observations coding the state space. We also show that a finite automaton has a similar representation. A hybrid system is then modeled by taking suitable products of the bialgebras coding the dynamics and the observation algebras coding the state spaces.

  10. Using Mobile TLA as a Logic for Dynamic I/O Automata

    NASA Astrophysics Data System (ADS)

    Kapus, Tatjana

    Input/Output (I/O) automata and the Temporal Logic of Actions (TLA) are two well-known techniques for the specification and verification of concurrent systems. Over the past few years, they have been extended to the so-called dynamic I/O automata and, respectively, Mobile TLA (MTLA) in order to be more appropriate for mobile agent systems. Dynamic I/O automata is just a mathematical model, whereas MTLA is a logic with a formally defined language. In this paper, therefore, we investigate how MTLA could be used as a formal language for the specification of dynamic I/O automata. We do this by writing an MTLA specification of a travel agent system which has been specified semi-formally in the literature on that model. In this specification, we deal with always existing agents as well as with an initially unknown number of dynamically created agents, with mobile and non-mobile agents, with I/O-automata-style communication, and with the changing communication capabilities of mobile agents. We have previously written a TLA specification of this system. This paper shows that an MTLA specification of such a system can be more elegant and faithful to the dynamic I/O automata definition because the agent existence and location can be expressed directly by using agent and location names instead of special variables as in TLA. It also shows how the reuse of names for dynamically created and destroyed agents within the dynamic I/O automata framework can be specified in MTLA.

  11. Decentralized indirect methods for learning automata games.

    PubMed

    Tilak, Omkar; Martin, Ryan; Mukhopadhyay, Snehasis

    2011-10-01

    We discuss the application of indirect learning methods in zero-sum and identical payoff learning automata games. We propose a novel decentralized version of the well-known pursuit learning algorithm. Such a decentralized algorithm has significant computational advantages over its centralized counterpart. The theoretical study of such a decentralized algorithm requires the analysis to be carried out in a nonstationary environment. We use a novel bootstrapping argument to prove the convergence of the algorithm. To our knowledge, this is the first time that such analysis has been carried out for zero-sum and identical payoff games. Extensive simulation studies are reported, which demonstrate the proposed algorithm's fast and accurate convergence in a variety of game scenarios. We also introduce the framework of partial communication in the context of identical payoff games of learning automata. In such games, the automata may not communicate with each other or may communicate selectively. This comprehensive framework has the capability to model both centralized and decentralized games discussed in this paper.
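
    For context, the classical (centralized) pursuit update that the decentralized algorithm builds on keeps a running reward estimate for each action and moves the action-probability vector a small step toward the action with the best current estimate. The sketch below is an illustration of that standard pursuit scheme, not of the authors' decentralized algorithm.

      import random

      # Hedged sketch of a classical pursuit learning automaton.
      class PursuitAutomaton:
          def __init__(self, n_actions, lam=0.01):
              self.p = [1.0 / n_actions] * n_actions  # action probabilities
              self.estimates = [0.0] * n_actions      # running reward estimates
              self.counts = [0] * n_actions
              self.lam = lam

          def choose(self):
              r, acc = random.random(), 0.0
              for i, pi in enumerate(self.p):
                  acc += pi
                  if r <= acc:
                      return i
              return len(self.p) - 1

          def update(self, action, reward):
              self.counts[action] += 1
              self.estimates[action] += (reward - self.estimates[action]) / self.counts[action]
              best = max(range(len(self.p)), key=lambda i: self.estimates[i])
              # pursue the action with the highest estimated reward
              self.p = [(1 - self.lam) * pi + (self.lam if i == best else 0.0)
                        for i, pi in enumerate(self.p)]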

  12. Non-Condon nonequilibrium Fermi’s golden rule rates from the linearized semiclassical method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Xiang; Geva, Eitan

    2016-08-14

    The nonequilibrium Fermi’s golden rule describes the transition between a photoexcited bright donor electronic state and a dark acceptor electronic state, when the nuclear degrees of freedom start out in a nonequilibrium state. In a previous paper [X. Sun and E. Geva, J. Chem. Theory Comput. 12, 2926 (2016)], we proposed a new expression for the nonequilibrium Fermi’s golden rule within the framework of the linearized semiclassical approximation and based on the Condon approximation, according to which the electronic coupling between donor and acceptor is assumed constant. In this paper we propose a more general expression, which is applicable to the case of non-Condon electronic coupling. We test the accuracy of the new non-Condon nonequilibrium Fermi’s golden rule linearized semiclassical expression on a model where the donor and acceptor potential energy surfaces are parabolic and identical except for shifts in the equilibrium energy and geometry, and the coupling between them is linear in the nuclear coordinates. Since non-Condon effects may or may not give rise to conical intersections, both possibilities are examined by considering the following: (1) a modified Garg-Onuchic-Ambegaokar model for charge transfer in the condensed phase, where the donor-acceptor coupling is linear in the primary-mode coordinate, and for which non-Condon effects do not give rise to a conical intersection; (2) the linear vibronic coupling model for electronic transitions in gas phase molecules, where non-Condon effects give rise to conical intersections. We also present a comprehensive comparison between the linearized semiclassical expression and a progression of more approximate expressions, in both normal and inverted regions, and over a wide range of initial nonequilibrium states, temperatures, and frictions.

  13. Practical implementation of the double linear damage rule and damage curve approach for treating cumulative fatigue damage

    NASA Technical Reports Server (NTRS)

    Manson, S. S.; Halford, G. R.

    1980-01-01

    Simple procedures are presented for treating cumulative fatigue damage under complex loading history using either the damage curve concept or the double linear damage rule. A single equation is provided for use with the damage curve approach; each loading event providing a fraction of damage until failure is presumed to occur when the damage sum becomes unity. For the double linear damage rule, analytical expressions are provided for determining the two phases of life. The procedure involves two steps, each similar to the conventional application of the commonly used linear damage rule. When the sum of cycle ratios based on phase 1 lives reaches unity, phase 1 is presumed complete, and further loadings are summed as cycle ratios on phase 2 lives. When the phase 2 sum reaches unity, failure is presumed to occur. No other physical properties or material constants than those normally used in a conventional linear damage rule analysis are required for application of either of the two cumulative damage methods described. Illustrations and comparisons of both methods are discussed.
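
    As a rough illustration of the two-step bookkeeping described above, the sketch below sums cycle ratios against Phase I lives until the sum reaches unity, then continues against Phase II lives until failure is predicted. The loading blocks and the phase lives are made-up inputs; the papers give analytical expressions for deriving the two phase lives from conventional fatigue properties, which are not reproduced here.

      def double_linear_damage(loading_blocks, phase1_life, phase2_life):
          """Cycle-ratio bookkeeping for the double linear damage rule.
          loading_blocks: sequence of (level, cycles) loading events.
          phase1_life / phase2_life: assumed Phase I / Phase II lives per level.
          Returns the index of the loading block at which failure is predicted."""
          damage, phase = 0.0, 1
          for i, (level, cycles) in enumerate(loading_blocks):
              life = phase1_life[level] if phase == 1 else phase2_life[level]
              damage += cycles / life                     # sum of cycle ratios
              if phase == 1 and damage >= 1.0:
                  # Phase I complete: convert the overshoot into a Phase II cycle ratio
                  excess_cycles = (damage - 1.0) * life
                  damage, phase = excess_cycles / phase2_life[level], 2
              if phase == 2 and damage >= 1.0:
                  return i                                # Phase II sum reached unity: failure
          return None

      # Illustrative (made-up) lives for two loading levels
      p1 = {"high": 2.0e4, "low": 3.0e5}
      p2 = {"high": 1.0e4, "low": 1.0e5}
      blocks = [("high", 1.5e4), ("low", 2.0e5), ("high", 1.0e4)]
      print(double_linear_damage(blocks, p1, p2))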

  14. Practical implementation of the double linear damage rule and damage curve approach for treating cumulative fatigue damage

    NASA Technical Reports Server (NTRS)

    Manson, S. S.; Halford, G. R.

    1981-01-01

    Simple procedures are given for treating cumulative fatigue damage under complex loading history using either the damage curve concept or the double linear damage rule. A single equation is given for use with the damage curve approach; each loading event providing a fraction of damage until failure is presumed to occur when the damage sum becomes unity. For the double linear damage rule, analytical expressions are given for determining the two phases of life. The procedure comprises two steps, each similar to the conventional application of the commonly used linear damage rule. Once the sum of cycle ratios based on Phase I lives reaches unity, Phase I is presumed complete, and further loadings are summed as cycle ratios based on Phase II lives. When the Phase II sum attains unity, failure is presumed to occur. It is noted that no physical properties or material constants other than those normally used in a conventional linear damage rule analysis are required for application of either of the two cumulative damage methods described. Illustrations and comparisons are discussed for both methods.

  15. A full computation-relevant topological dynamics classification of elementary cellular automata.

    PubMed

    Schüle, Martin; Stoop, Ruedi

    2012-12-01

    Cellular automata are both computational and dynamical systems. We give a complete classification of the dynamic behaviour of elementary cellular automata (ECA) in terms of fundamental dynamical-systems notions such as sensitivity and chaoticity. The "complex" ECA turn out to be sensitive, but neither chaotic nor eventually weakly periodic. Based on this classification, we conjecture that elementary cellular automata capable of carrying out complex computations, such as those needed for Turing universality, lie at the "edge of chaos."

  16. Cellular automata model for traffic flow at intersections in internet of vehicles

    NASA Astrophysics Data System (ADS)

    Zhao, Han-Tao; Liu, Xin-Ru; Chen, Xiao-Xu; Lu, Jian-Cheng

    2018-03-01

    Considering the effect of the front vehicle's speed, the influence of brake lights and conflicts within the traffic flow, we established a cellular automata model called CE-NS for traffic flow at intersections in an environment without vehicle networking. Using the information interaction of the Internet of Vehicles (IoV), we introduced parameters describing congestion and the exact speed of the front vehicle into the CE-NS model, improved the rules for acceleration, deceleration and conflict, and thus established a cellular automata model for traffic flow at intersections under IoV. The relationships between traffic parameters such as vehicle speed, flow and average travel time are obtained by numerical simulation of the two models. On this basis, we compared the traffic situation of the non-networked environment with that of the IoV environment and analyzed how different degrees of IoV penetration influence the traffic flow. The results show that under the IoV environment traffic speed increases, travel time is reduced, the flux of intersections increases and the traffic flow runs more smoothly. Once the proportion of IoV-enabled vehicles reaches a certain level, the operation of the traffic flow begins to improve noticeably.
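
    The abstract does not give the CE-NS update rules themselves; models of this kind extend the standard Nagel-Schreckenberg (NS) single-lane cellular automaton. The sketch below shows the plain NS update that such models build on; vmax, the slowdown probability and the ring-road geometry are illustrative, and the IoV-specific modifications (brake-light information, exact leader speed, conflict handling at the intersection) are only indicated in the comments.

      import random

      def nasch_step(pos, vel, L, vmax=5, p_slow=0.3):
          """One update of the standard Nagel-Schreckenberg CA on a ring of L cells.
          pos/vel are parallel lists of car positions and speeds.
          IoV-style extensions (brake lights, exact leader speed, conflict rules
          at the intersection) would modify the deceleration step below."""
          order = sorted(range(len(pos)), key=lambda i: pos[i])
          new_pos, new_vel = list(pos), list(vel)
          for k, i in enumerate(order):
              j = order[(k + 1) % len(order)]             # leading car on the ring
              gap = (pos[j] - pos[i] - 1) % L
              v = min(vel[i] + 1, vmax)                   # acceleration
              v = min(v, gap)                             # decelerate to avoid collision
              if v > 0 and random.random() < p_slow:      # random slowdown
                  v -= 1
              new_vel[i] = v
              new_pos[i] = (pos[i] + v) % L
          return new_pos, new_vel

      # Three cars on a ring of 30 cells
      pos, vel = [0, 5, 12], [0, 0, 0]
      pos, vel = nasch_step(pos, vel, L=30)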

  17. The Hpp Rule with Memory and the Density Classification Task

    NASA Astrophysics Data System (ADS)

    Alonso-Sanz, Ramón

    This article considers an extension of the standard framework of cellular automata that implements memory capability in cells. It is shown that the important block HPP rule behaves as an excellent classifier of the density in the initial configuration when applied to cells endowed with weighted memory of their previous states. If the weighting is chosen so that the most recent state values are assigned the highest weights, the HPP rule surpasses the performance of the best two-dimensional density classifiers reported in the literature.

  18. Solving multiconstraint assignment problems using learning automata.

    PubMed

    Horn, Geir; Oommen, B John

    2010-02-01

    This paper considers the NP-hard problem of object assignment with respect to multiple constraints: assigning a set of elements (or objects) into mutually exclusive classes (or groups), where the elements which are "similar" to each other are hopefully located in the same class. The literature reports solutions in which the similarity constraint consists of a single index that is inappropriate for the type of multiconstraint problems considered here and where the constraints could simultaneously be contradictory. This feature, where we permit possibly contradictory constraints, distinguishes this paper from the state of the art. Indeed, we are aware of no learning automata (or other heuristic) solutions which solve this problem in its most general setting. Such a scenario is illustrated with the static mapping problem, which consists of distributing the processes of a parallel application onto a set of computing nodes. This is a classical and yet very important problem within the areas of parallel computing, grid computing, and cloud computing. We have developed four learning-automata (LA)-based algorithms to solve this problem: First, a fixed-structure stochastic automata algorithm is presented, where the processes try to form pairs to go onto the same node. This algorithm solves the problem, although it requires some centralized coordination. As it is desirable to avoid centralized control, we subsequently present three different variable-structure stochastic automata (VSSA) algorithms, which have superior partitioning properties in certain settings, although they forfeit some of the scalability features of the fixed-structure algorithm. All three VSSA algorithms model the processes as automata having first the hosting nodes as possible actions; second, the processes as possible actions; and, third, attempting to estimate the process communication digraph prior to probabilistically mapping the processes. This paper, which, we believe, comprehensively reports the

  19. On the effect of memory in one-dimensional K=4 automata on networks

    NASA Astrophysics Data System (ADS)

    Alonso-Sanz, Ramón; Cárdenas, Juan Pablo

    2008-12-01

    The effect of implementing memory in cells of one-dimensional CA, and on nodes of various types of automata on networks with increasing degrees of random rewiring is studied in this article, paying particular attention to the case of four inputs. As a rule, memory induces a moderation in the rate of changing nodes and in the damage spreading, albeit in the latter case memory turns out to be ineffective in the control of the damage as the wiring network moves away from the ordered structure that features proper one-dimensional CA. This article complements the previous work done in the two-dimensional context.

  20. Jahn-Teller effect in molecular electronics: quantum cellular automata

    NASA Astrophysics Data System (ADS)

    Tsukerblat, B.; Palii, A.; Clemente-Juan, J. M.; Coronado, E.

    2017-05-01

    The article summarizes the main results of application of the theory of the Jahn-Teller (JT) and pseudo JT effects to the description of molecular quantum dot cellular automata (QCA), a new paradigm of quantum computing. The following issues are discussed: 1) QCA as a new paradigm of quantum computing, principles and advantages; 2) molecular implementation of QCA; 3) role of the JT effect in charge trapping, encoding of binary information in the quantum cell and non-linear cell-cell response; 4) spin-switching in molecular QCA based on mixed-valence cell; 5) intervalence optical absorption in tetrameric molecular mixed-valence cell through the symmetry assisted approach to the multimode/multilevel JT and pseudo JT problems.

  1. Unstable vicinal crystal growth from cellular automata

    NASA Astrophysics Data System (ADS)

    Krasteva, A.; Popova, H.; KrzyŻewski, F.; Załuska-Kotur, M.; Tonchev, V.

    2016-03-01

    In order to study unstable step motion on vicinal crystal surfaces we devise vicinal cellular automata. Each cell in the colony has a value equal to its height on the vicinal surface; initially the steps are regularly distributed. Another array keeps the adatoms, initially distributed randomly over the surface. The growth rule specifies that each adatom in the right-nearest-neighbor position of a (multi-)step attaches to it. The whole colony is updated at once and then time is incremented. This execution of the growth rule is followed by compensation of the consumed particles and by diffusional update(s) of the adatom population. Two principal sources of instability are employed: biased diffusion and an infinite inverse Ehrlich-Schwoebel barrier (iiSE). Since these factors are not opposed by step-step repulsion, the formation of multi-steps is observed, but in general the step bunches preserve a finite width. We monitor the developing surface patterns and quantify the observations by scaling laws, with a focus on the eventual transition from a diffusion-limited to a kinetics-limited phenomenon. The time-scaling exponent of the bunch size N is 1/2 for biased diffusion and 1/3 for the iiSE. A further distinction is possible based on the time-scaling exponents of the multi-step sizes Nmulti, which are 0.36-0.4 (biased diffusion) and 1/4 (iiSE).
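
    One possible reading of the growth rule above is sketched below for a 1D height profile: an adatom sitting immediately to the right of a down-step is incorporated there, the consumed adatoms are re-deposited at random empty sites, and the remaining adatoms hop with a rightward bias. The attachment convention, the re-deposition scheme and the bias parameter are assumptions made for illustration; the infinite inverse Ehrlich-Schwoebel variant is not shown.

      import numpy as np

      def vicinal_growth_step(h, adatom, rng, bias=0.7):
          """One synchronous update of a simplified 1D vicinal-surface CA.
          h: integer heights; adatom: 0/1 occupation of the adatom layer."""
          L = len(h)
          attach = np.zeros(L, dtype=bool)
          for i in range(L):
              left = (i - 1) % L
              if adatom[i] and h[left] > h[i]:     # right neighbour of a (multi-)step
                  attach[i] = True
          h[attach] += 1                           # the adatom is incorporated at the step
          consumed = int(attach.sum())
          adatom[attach] = 0
          # compensate the consumed particles by random deposition on empty sites
          empty = np.flatnonzero(adatom == 0)
          refill = rng.choice(empty, size=min(consumed, len(empty)), replace=False)
          adatom[refill] = 1
          # biased diffusion: each adatom hops right with probability 'bias', else left
          new = np.zeros_like(adatom)
          for i in np.flatnonzero(adatom):
              step = 1 if rng.random() < bias else -1
              new[(i + step) % L] = 1              # colliding adatoms simply merge here
          return h, new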

  2. A Decomposition Theorem for Finite Automata.

    ERIC Educational Resources Information Center

    Santa Coloma, Teresa L.; Tucci, Ralph P.

    1990-01-01

    Described is automata theory which is a branch of theoretical computer science. A decomposition theorem is presented that is easier than the Krohn-Rhodes theorem. Included are the definitions, the theorem, and a proof. (KR)

  3. The 3-dimensional cellular automata for HIV infection

    NASA Astrophysics Data System (ADS)

    Mo, Youbin; Ren, Bin; Yang, Wencao; Shuai, Jianwei

    2014-04-01

    The HIV infection dynamics is discussed in detail with a 3-dimensional cellular automata model in this paper. The model can reproduce the three-phase development, i.e., the acute period, the asymptomatic period and the AIDS period, observed in HIV-infected patients in the clinic. We show that the 3D HIV model is more robust to the model parameters than the 2D cellular automata. Furthermore, we reveal that the occurrence of a perpetual source that successively generates infectious waves spreading to the whole system drives the model from the asymptomatic state to the AIDS state.

  4. A new cellular automata model of traffic flow with negative exponential weighted look-ahead potential

    NASA Astrophysics Data System (ADS)

    Ma, Xiao; Zheng, Wei-Fan; Jiang, Bao-Shan; Zhang, Ji-Ye

    2016-10-01

    With the development of traffic systems, some issues such as traffic jams become more and more serious. Efficient traffic flow theory is needed to guide the overall controlling, organizing and management of traffic systems. On the basis of the cellular automata model and the traffic flow model with look-ahead potential, a new cellular automata traffic flow model with negative exponential weighted look-ahead potential is presented in this paper. By introducing the negative exponential weighting coefficient into the look-ahead potential and endowing the potential of vehicles closer to the driver with a greater coefficient, the modeling process is more suitable for the driver’s random decision-making process which is based on the traffic environment that the driver is facing. The fundamental diagrams for different weighting parameters are obtained by using numerical simulations which show that the negative exponential weighting coefficient has an obvious effect on high density traffic flux. The complex high density non-linear traffic behavior is also reproduced by numerical simulations. Project supported by the National Natural Science Foundation of China (Grant Nos. 11572264, 11172247, 11402214, and 61373009).

  5. Cellular Automata and the Humanities.

    ERIC Educational Resources Information Center

    Gallo, Ernest

    1994-01-01

    The use of cellular automata to analyze several pre-Socratic hypotheses about the evolution of the physical world is discussed. These hypotheses combine characteristics of both rigorous and metaphoric language. Since the computer demands explicit instructions for each step in the evolution of the automaton, such models can reveal conceptual…

  6. Assessing Performance of Multipurpose Reservoir System Using Two-Point Linear Hedging Rule

    NASA Astrophysics Data System (ADS)

    Sasireka, K.; Neelakantan, T. R.

    2017-07-01

    Reservoir operation is one of the important fields of water resource management. Innovative techniques in water resource management are focused on optimizing the available water and on decreasing the environmental impact of water utilization on the natural environment. In the operation of a multi-reservoir system, efficient regulation of releases to satisfy demands for various purposes such as domestic supply, irrigation and hydropower can increase the benefit from the reservoir as well as significantly reduce flood damage. Hedging rules are one of the emerging techniques in reservoir operation; they reduce the severity of drought by accepting a number of smaller shortages. The key objective of this paper is to maximize the minimum power production and to improve the reliability of water supply for municipal and irrigation purposes by using a hedging rule. In this paper, a Type II two-point linear hedging rule is applied to improve the operation of the Bargi reservoir in the Narmada basin in India. The results obtained from simulating the hedging rule are compared with results from the Standard Operating Policy; they show that applying the hedging rule significantly improved the reliability of water supply, the reliability of irrigation releases and the firm power production.
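
    The Type II two-point rule used for the Bargi reservoir is not written out in the abstract; the sketch below implements the common textbook form of a two-point linear hedging rule, in which the release equals the available water below the lower hedging point, the full demand above the upper point, and a linear interpolation in between. The hedging points, the demand and the units are illustrative.

      def two_point_hedging_release(swa, demand, swa1, swa2):
          """Generic two-point linear hedging rule (a common textbook form;
          the Type II variant used in the paper may differ in detail).
          swa: starting water availability (storage plus expected inflow).
          swa1 < swa2 are the two hedging points."""
          if swa <= swa1:
              return swa                    # severe shortage: release what is available
          if swa >= swa2:
              return demand                 # plentiful water: meet the full demand
          # linear interpolation between the two hedging points
          return swa1 + (demand - swa1) * (swa - swa1) / (swa2 - swa1)

      # Illustrative numbers (e.g. million cubic metres per season)
      print(two_point_hedging_release(swa=300.0, demand=400.0, swa1=200.0, swa2=600.0))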

  7. Linear solvation energy relationships: "rule of thumb" for estimation of variable values

    USGS Publications Warehouse

    Hickey, James P.; Passino-Reader, Dora R.

    1991-01-01

    For the linear solvation energy relationship (LSER), values are listed for each of the variables (Vi/100, π*, βm, αm) for fundamental organic structures and functional groups. We give guidelines to estimate LSER variable values quickly for a vast array of possible organic compounds such as those found in the environment. The difficulty of generating these variables has greatly discouraged the application of this quantitative structure-activity relationship (QSAR) method. This paper presents the first compilation of molecular functional group values together with a utilitarian set of LSER variable estimation rules. The availability of these variable values and rules should facilitate widespread application of LSER for hazard evaluation of environmental contaminants.

  8. Simulations of Forest Fires by the Cellular Automata Model "ABBAMPAU"

    NASA Astrophysics Data System (ADS)

    di Gregorio, S.; Bendicenti, E.

    2003-04-01

    Forest fires represent a serious environmental problem whose negative impact is becoming more worrisome day by day. Forest fires are very complex phenomena that need an interdisciplinary approach. The adopted modelling method involves the definition of local rules, from which the global behaviour of the system can emerge. The paradigm of Cellular Automata was applied and the model ABBAMPAU was designed to simulate the evolution of forest fires. Cellular Automata features (parallelism and a-centrism) seem to match the system "forest fire": the parameters describing a forest fire globally, i.e. propagation rate, flame length and direction, fireline intensity, fire duration time, etc., depend mainly on local characteristics, i.e. vegetation type (live and dead fuel), relative humidity, fuel moisture, heat, territory morphology (altitude, slope), etc. The only global characteristic is wind velocity and direction, but these are locally altered according to the morphology; therefore wind also has to be considered at the local level. ABBAMPAU accounts for the following aspects of the phenomenon: effects of combustion in surface and crown fire inside the cell, crown fire triggering, surface and crown fire spread, and determination of the local wind rate and direction. ABBAMPAU was validated on a real case of forest fire in the territory of Villaputzu, on the island of Sardinia, on August 22nd, 1998. The first simulations capture the main characteristics of the phenomenon and agree with the observations. The results show that the model could be applied for forest fire prevention, the production of risk scenarios and the evaluation of the environmental impact of forest fires.

  9. Using economy of means to evolve transition rules within 2D cellular automata.

    PubMed

    Ripps, David L

    2010-01-01

    Running a cellular automaton (CA) on a rectangular lattice is a time-honored method for studying artificial life on a digital computer. Commonly, the researcher wishes to investigate some specific or general mode of behavior, say, the ability of a coherent pattern of points to glide within the lattice, or to generate copies of itself. This technique has a problem: how to design the transition table, the set of distinct rules that specify the next content of a cell from its current content and that of its near neighbors. Often the table is painstakingly designed manually, rule by rule. The problem is exacerbated by the potentially vast number of individual rules that need to be specified to cover all combinations of center and neighbors when there are several symbols in the alphabet of the CA. In this article a method is presented to have the set of rules evolve automatically while running the CA. The transition table is initially empty, with rules being added as the need arises. A novel principle drives the evolution: maximum economy of means, that is, maximizing the reuse of rules introduced on previous cycles. This method may not be a panacea applicable to all CA studies. Nevertheless, it is sufficiently potent to evolve sets of rules and associated patterns of points that glide (periodically regenerate themselves at another location) and to generate gliding "children" that then "mate" by collision.
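
    As a toy illustration of growing the transition table on demand, the sketch below adds a rule the first time a neighbourhood is encountered and, in the spirit of economy of means, reuses the output symbol that is already most common in the table. The 1D neighbourhood, the three-symbol alphabet and the reuse criterion are simplifications of the article's scheme, which works on a 2D lattice and reuses whole rules across cycles.

      from collections import Counter

      class EvolvingCA:
          """Transition table that starts empty and is filled only when a
          neighbourhood is first seen (illustrative 'economy of means' sketch)."""
          def __init__(self, alphabet=(0, 1, 2)):
              self.alphabet = alphabet
              self.table = {}                      # neighbourhood tuple -> next symbol

          def rule(self, neigh):
              if neigh not in self.table:
                  counts = Counter(self.table.values())
                  # economy of means: prefer the output symbol already used most often
                  out = max(self.alphabet, key=lambda s: (counts[s], -s))
                  self.table[neigh] = out
              return self.table[neigh]

          def step(self, row):
              n = len(row)
              return [self.rule((row[(i - 1) % n], row[i], row[(i + 1) % n]))
                      for i in range(n)]

      ca = EvolvingCA()
      row = [0, 1, 0, 2, 1, 0, 0, 1]
      for _ in range(5):
          row = ca.step(row)
      print(len(ca.table), "rules created")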

  10. Mammogram segmentation using maximal cell strength updation in cellular automata.

    PubMed

    Anitha, J; Peter, J Dinesh

    2015-08-01

    Breast cancer is the most frequently diagnosed type of cancer among women. Mammogram is one of the most effective tools for early detection of the breast cancer. Various computer-aided systems have been introduced to detect the breast cancer from mammogram images. In a computer-aided diagnosis system, detection and segmentation of breast masses from the background tissues is an important issue. In this paper, an automatic segmentation method is proposed to identify and segment the suspicious mass regions of mammogram using a modified transition rule named maximal cell strength updation in cellular automata (CA). In coarse-level segmentation, the proposed method performs an adaptive global thresholding based on the histogram peak analysis to obtain the rough region of interest. An automatic seed point selection is proposed using gray-level co-occurrence matrix-based sum average feature in the coarse segmented image. Finally, the method utilizes CA with the identified initial seed point and the modified transition rule to segment the mass region. The proposed approach is evaluated over the dataset of 70 mammograms with mass from mini-MIAS database. Experimental results show that the proposed approach yields promising results to segment the mass region in the mammograms with the sensitivity of 92.25% and accuracy of 93.48%.
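
    The CA stage of such a pipeline can be illustrated with a GrowCut-style sketch: every cell carries a label and a strength, and a neighbouring cell conquers it when the neighbour's strength, attenuated by the local intensity contrast, exceeds its own. The paper's modified transition rule (maximal cell strength updation), the adaptive global thresholding and the GLCM-based seed selection are not reproduced; the arrays, seeds and iteration count below are illustrative.

      import numpy as np

      def growcut_segment(image, seeds, iterations=50):
          """Plain GrowCut-style CA segmentation sketch.
          image: 2D float array in [0, 1]; seeds: 2D int array with
          0 = unlabeled, 1 = mass, 2 = background."""
          label = seeds.copy()
          strength = (seeds > 0).astype(float)          # seed cells start with strength 1
          shifts = [(-1, 0), (1, 0), (0, -1), (0, 1)]   # von Neumann neighbourhood
          for _ in range(iterations):
              for dy, dx in shifts:
                  # np.roll wraps at the borders, which is acceptable for a sketch
                  n_label = np.roll(label, (dy, dx), axis=(0, 1))
                  n_strength = np.roll(strength, (dy, dx), axis=(0, 1))
                  n_value = np.roll(image, (dy, dx), axis=(0, 1))
                  g = 1.0 - np.abs(image - n_value)     # attack force decays with contrast
                  attack = n_strength * g
                  win = attack > strength               # the neighbour conquers this cell
                  label[win] = n_label[win]
                  strength[win] = attack[win]
          return label

      # Toy example: a bright square "mass" with one mass seed and one background seed
      img = np.zeros((32, 32)); img[10:20, 10:20] = 0.8
      seeds = np.zeros((32, 32), dtype=int)
      seeds[15, 15], seeds[2, 2] = 1, 2
      print(np.count_nonzero(growcut_segment(img, seeds) == 1))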

  11. Unary probabilistic and quantum automata on promise problems

    NASA Astrophysics Data System (ADS)

    Gainutdinova, Aida; Yakaryılmaz, Abuzer

    2018-02-01

    We continue the systematic investigation of probabilistic and quantum finite automata (PFAs and QFAs) on promise problems by focusing on unary languages. We show that bounded-error unary QFAs are more powerful than bounded-error unary PFAs, and, contrary to the binary language case, the computational power of Las-Vegas QFAs and bounded-error PFAs is equivalent to the computational power of deterministic finite automata (DFAs). Then, we present a new family of unary promise problems defined with two parameters such that when fixing one parameter QFAs can be exponentially more succinct than PFAs and when fixing the other parameter PFAs can be exponentially more succinct than DFAs.

  12. Modeling biological pathway dynamics with timed automata.

    PubMed

    Schivo, Stefano; Scholma, Jetse; Wanders, Brend; Urquidi Camacho, Ricardo A; van der Vet, Paul E; Karperien, Marcel; Langerak, Rom; van de Pol, Jaco; Post, Janine N

    2014-05-01

    Living cells are constantly subjected to a plethora of environmental stimuli that require integration into an appropriate cellular response. This integration takes place through signal transduction events that form tightly interconnected networks. The understanding of these networks requires capturing their dynamics through computational support and models. ANIMO (analysis of Networks with Interactive Modeling) is a tool that enables the construction and exploration of executable models of biological networks, helping to derive hypotheses and to plan wet-lab experiments. The tool is based on the formalism of Timed Automata, which can be analyzed via the UPPAAL model checker. Thanks to Timed Automata, we can provide a formal semantics for the domain-specific language used to represent signaling networks. This enforces precision and uniformity in the definition of signaling pathways, contributing to the integration of isolated signaling events into complex network models. We propose an approach to discretization of reaction kinetics that allows us to efficiently use UPPAAL as the computational engine to explore the dynamic behavior of the network of interest. A user-friendly interface hides the use of Timed Automata from the user, while keeping the expressive power intact. Abstraction to single-parameter kinetics speeds up construction of models that remain faithful enough to provide meaningful insight. The resulting dynamic behavior of the network components is displayed graphically, allowing for an intuitive and interactive modeling experience.

  13. Preliminary cellular-automata forecast of permit activity from 1998 to 2010, Idaho and Western Montana

    USGS Publications Warehouse

    Raines, G.L.; Zientek, M.L.; Causey, J.D.; Boleneus, D.E.

    2002-01-01

    For public land management in Idaho and western Montana, the U.S. Forest Service (USFS) has requested that the U.S. Geological Survey (USGS) predict where mineral-related activity will occur in the next decade. Cellular automata provide an approach to simulation of this human activity. Cellular automata (CA) are defined by an array of cells, which evolve by a simple transition rule, the automaton. Based on exploration trends, we assume that future exploration will focus in areas of past exploration. Spatial-temporal information about mineral-related activity, that is permits issued by USFS and Bureau of Land Management (BLM) in the last decade, and spatial information about undiscovered resources, provide a basis to calibrate a CA. The CA implemented is a modified annealed voting rule that simulates mineral-related activity with spatial and temporal resolution of 1 mi2 and 1 year based on activity from 1989 to 1998. For this CA, the state of the economy and exploration technology is assumed constant for the next decade. The calibrated CA reproduces the 1989-1998-permit activity with an agreement of 94%, which increases to 98% within one year. Analysis of the confusion matrix and kappa correlation statistics indicates that the CA underestimates high activity and overestimates low activity. Spatially, the major differences between the actual and calculated activity are that the calculated activity occurs in a slightly larger number of small patches and is slightly more uneven than the actual activity. Using the calibrated CA in a Monte Carlo simulation projecting from 1998 to 2010, an estimate of the probability of mineral activity shows high levels of activity in Boise, Caribou, Elmore, Lincoln, and western Valley counties in Idaho and Beaverhead, Madison, and Stillwater counties in Montana, and generally low activity elsewhere. ?? 2002 International Association for Mathematical Geology.

  14. A stochastic cellular automata model of tautomer equilibria

    NASA Astrophysics Data System (ADS)

    Bowers, Gregory A.; Seybold, Paul G.

    2018-03-01

    Many chemical substances, including drugs and biomolecules, exist in solution not as a single species, but as a collection of tautomers and related species. Importantly, each of these species is an independent compound with its own specific biochemical and physicochemical properties. The species interconvert in a dynamic and often complicated manner, making modelling the overall species composition difficult. Agent-based cellular automata models are uniquely suited to meet this challenge, allowing the equilibria to be simulated using simple rules and at the same time capturing the inherent stochasticity of the natural phenomenon. In the present example a stochastic cellular automata model is employed to simulate the tautomer equilibria of 9-anthrone and 9-anthrol in the presence of their common anion. The observed KE of the 9-anthrone ⇌ 9-anthrol tautomerisation along with the measured tautomer pKa values were used to model the equilibria at pH values 4, 7 and 10. At pH 4 and 7, the anthrone comprises >99% of the total species population, while at pH 10 the anthrone and the anion each represent just under half of the total population. The advantages of the cellular automata approach over the customary coupled differential equation approach are discussed.

  15. Molecular Magnetic Quantum Cellular Automata

    DTIC Science & Technology

    2004-06-01

    Folting K, Gatteschi D, Christou G, Hendrickson D N 1993a, High-Spin Molecules - [Mn12O12(O2CR)16(H2O)4], J. Am. Chem. Soc. 115 1804; Sessoli R..., Gatteschi D, Caneschi A and Novak M A 1993b, Magnetic bistability in a metal-ion cluster, Nature 365 141; Twamely J 2003, Quantum-cellular-automata

  16. A novel image encryption algorithm using chaos and reversible cellular automata

    NASA Astrophysics Data System (ADS)

    Wang, Xingyuan; Luan, Dapeng

    2013-11-01

    In this paper, a novel image encryption scheme is proposed based on reversible cellular automata (RCA) combined with chaos. In this algorithm, an intertwining logistic map with complex behavior and periodic-boundary reversible cellular automata are used. We split each pixel of the image into units of 4 bits, then use a pseudorandom key stream generated by the intertwining logistic map to permute these units in the confusion stage. In the diffusion stage, two-dimensional reversible cellular automata, which are discrete dynamical systems, are iterated for many rounds to achieve bit-level diffusion; here we only consider the higher 4 bits of a pixel, because they carry almost all the information of an image. Theoretical analysis and experimental results demonstrate that the proposed algorithm achieves a high security level and shows good performance against common attacks such as differential and statistical attacks. The algorithm belongs to the class of symmetric systems.
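
    The confusion stage described above can be illustrated with a short sketch: pixels are split into 4-bit units and permuted with a chaos-driven permutation. The intertwining logistic map and the reversible-CA diffusion stage of the paper are not reproduced; a plain logistic map stands in for the keystream, and the key parameters are illustrative.

      import numpy as np

      def logistic_keystream(n, x0=0.3711, mu=3.999):
          """Plain logistic-map keystream (the paper uses an intertwining logistic
          map, whose exact form is not given in the abstract; this is a stand-in)."""
          xs = np.empty(n)
          x = x0
          for i in range(n):
              x = mu * x * (1.0 - x)
              xs[i] = x
          return xs

      def confuse_units(image):
          """Confusion-stage sketch: split each 8-bit pixel into two 4-bit units and
          permute all units with a chaos-driven permutation (the reversible-CA
          diffusion stage is not shown)."""
          pix = np.asarray(image, dtype=np.uint8).ravel()
          units = np.empty(pix.size * 2, dtype=np.uint8)
          units[0::2] = pix >> 4            # higher 4 bits
          units[1::2] = pix & 0x0F          # lower 4 bits
          perm = np.argsort(logistic_keystream(units.size))   # key-dependent permutation
          return units[perm], perm          # keep perm (or the key) to invert later

      img = np.arange(16, dtype=np.uint8).reshape(4, 4) * 16
      scrambled_units, perm = confuse_units(img)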

  17. Cellular automata with object-oriented features for parallel molecular network modeling.

    PubMed

    Zhu, Hao; Wu, Yinghui; Huang, Sui; Sun, Yan; Dhar, Pawan

    2005-06-01

    Cellular automata are an important modeling paradigm for studying the dynamics of large, parallel systems composed of multiple, interacting components. However, to model biological systems, cellular automata need to be extended beyond the large-scale parallelism and intensive communication in order to capture two fundamental properties characteristic of complex biological systems: hierarchy and heterogeneity. This paper proposes extensions to a cellular automata language, Cellang, to meet this purpose. The extended language, with object-oriented features, can be used to describe the structure and activity of parallel molecular networks within cells. Capabilities of this new programming language include object structure to define molecular programs within a cell, floating-point data type and mathematical functions to perform quantitative computation, message passing capability to describe molecular interactions, as well as new operators, statements, and built-in functions. We discuss relevant programming issues of these features, including the object-oriented description of molecular interactions with molecule encapsulation, message passing, and the description of heterogeneity and anisotropy at the cell and molecule levels. By enabling the integration of modeling at the molecular level with system behavior at cell, tissue, organ, or even organism levels, the program will help improve our understanding of how complex and dynamic biological activities are generated and controlled by parallel functioning of molecular networks. Index Terms-Cellular automata, modeling, molecular network, object-oriented.

  18. Computational Modeling of Proteins based on Cellular Automata: A Method of HP Folding Approximation.

    PubMed

    Madain, Alia; Abu Dalhoum, Abdel Latif; Sleit, Azzam

    2018-06-01

    The design of a protein folding approximation algorithm is not straightforward even when a simplified model is used. The folding problem is a combinatorial problem, where approximation and heuristic algorithms are usually used to find near-optimal folds of protein primary structures. Approximation algorithms provide guarantees on the distance to the optimal solution. The folding approximation approach proposed here relies on two-dimensional cellular automata to fold proteins represented in a well-studied simplified model called the hydrophobic-hydrophilic (HP) model. Cellular automata are discrete computational models that rely on local rules to produce some overall global behavior. One-third and one-fourth approximation algorithms choose a subset of the hydrophobic amino acids to form H-H contacts. Those algorithms start by finding a point at which to fold the protein sequence into two sides, where one side ignores H's at even positions and the other side ignores H's at odd positions. In addition, blocks or groups of amino acids fold the same way according to a predefined normal form. We intend to improve on these approximation algorithms by considering all hydrophobic amino acids and by folding based on the local neighborhood instead of using normal forms. The CA does not assume a fixed folding point. The proposed approach guarantees a one-half approximation minus the H-H endpoints. This guaranteed lower bound applies to short sequences only; this is proved by showing that the core and the folds of the protein have two identical sides for all short sequences.

  19. Scalable asynchronous execution of cellular automata

    NASA Astrophysics Data System (ADS)

    Folino, Gianluigi; Giordano, Andrea; Mastroianni, Carlo

    2016-10-01

    The performance and scalability of cellular automata, when executed on parallel/distributed machines, are limited by the necessity of synchronizing all the nodes at each time step, i.e., a node can execute only after the execution of the previous step at all the other nodes. However, these synchronization requirements can be relaxed: a node can execute one step after synchronizing only with the adjacent nodes. In this fashion, different nodes can execute different time steps. This can be notably advantageous in many novel and increasingly popular applications of cellular automata, such as smart city applications, simulation of natural phenomena, etc., in which the execution times can be different and variable, due to the heterogeneity of machines and/or data and/or executed functions. Indeed, a longer execution time at a node does not slow down the execution at all the other nodes but only at the neighboring nodes. This is particularly advantageous when the nodes that act as bottlenecks vary during the application execution. The goal of the paper is to analyze the benefits that can be achieved with the described asynchronous implementation of cellular automata, when compared to the classical all-to-all synchronization pattern. The performance and scalability have been evaluated through a Petri net model, as this model is very useful to represent the synchronization barrier among nodes. We examined the usual case in which the territory is partitioned into a number of regions, and the computation associated with a region is assigned to a computing node. We considered both the cases of mono-dimensional and two-dimensional partitioning. The results show that the advantage obtained through the asynchronous execution, when compared to the all-to-all synchronous approach, is notable, and it can be as large as 90% in terms of speedup.
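
    A toy scheduling model makes the neighbour-only synchronization argument concrete: a region may start step t+1 as soon as it and its adjacent regions have finished step t, so a slow region delays only its neighbours rather than the whole machine. The 1D ring partitioning and the random per-step execution times below are illustrative; the paper's own evaluation is based on a Petri net model.

      import random

      def asynchronous_makespan(num_regions, num_steps, seed=1):
          """Dataflow recurrence for neighbour-only synchronization on a 1D ring:
          region i may start step t+1 once it and both neighbours finished step t."""
          random.seed(seed)
          finish = [0.0] * num_regions          # finish time of each region's last step
          for _ in range(num_steps):
              new_finish = []
              for i in range(num_regions):
                  neighbours = [finish[(i - 1) % num_regions],
                                finish[i],
                                finish[(i + 1) % num_regions]]
                  start = max(neighbours)       # wait only for the adjacent regions
                  new_finish.append(start + random.uniform(0.5, 1.5))
              finish = new_finish
          return max(finish)                    # makespan of the asynchronous schedule

      print(asynchronous_makespan(num_regions=16, num_steps=100))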

  20. Fuzzy cellular automata models in immunology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmed, E.

    1996-10-01

    The self-nonself character of antigens is considered to be fuzzy. The Chowdhury et al. cellular automata model is generalized accordingly. New steady states are found. The first corresponds to a below-normal help and suppression and is proposed to be related to autoimmune diseases. The second corresponds to a below-normal B-cell level.

  1. Modeling cell adhesion and proliferation: a cellular-automata based approach.

    PubMed

    Vivas, J; Garzón-Alvarado, D; Cerrolaza, M

    Cell adhesion is a process that involves the interaction between the cell membrane and another surface, either a cell or a substrate. Unlike experimental tests, computer models can simulate processes and study the outcome of experiments in less time and at lower cost. One of the tools used to simulate biological processes is the cellular automaton, a dynamic system that is discrete both in space and time. This work describes a computer model based on cellular automata for the adhesion process and cell proliferation, designed to predict the behavior of a cell population in suspension and adhered to a substrate. The values of the simulated system were obtained through experimental tests on fibroblast monolayer cultures. The results allow us to estimate the settling time of the cells in culture as well as the adhesion and proliferation times. The change in cell morphology as adhesion over the contact surface progresses was also observed. The formation of the initial link between the cell and the substrate was observed after 100 min, during which the cell on the substrate retains its spherical morphology in the simulation. The cellular automata model developed is, however, a simplified representation of the steps in the adhesion process and the subsequent proliferation. A combined framework of experiments and computational simulation based on cellular automata was proposed to represent fibroblast adhesion on substrates and the macro-scale changes observed in the cell during the adhesion process. The approach proved to be simple and efficient.

  2. Astrobiological complexity with probabilistic cellular automata.

    PubMed

    Vukotić, Branislav; Ćirković, Milan M

    2012-08-01

    The search for extraterrestrial life and intelligence constitutes one of the major endeavors in science, but has yet been quantitatively modeled only rarely and in a cursory and superficial fashion. We argue that probabilistic cellular automata (PCA) represent the best quantitative framework for modeling the astrobiological history of the Milky Way and its Galactic Habitable Zone. The relevant astrobiological parameters are to be modeled as the elements of the input probability matrix for the PCA kernel. With the underlying simplicity of the cellular automata constructs, this approach enables a quick analysis of large and ambiguous space of the input parameters. We perform a simple clustering analysis of typical astrobiological histories with "Copernican" choice of input parameters and discuss the relevant boundary conditions of practical importance for planning and guiding empirical astrobiological and SETI projects. In addition to showing how the present framework is adaptable to more complex situations and updated observational databases from current and near-future space missions, we demonstrate how numerical results could offer a cautious rationale for continuation of practical SETI searches.

  3. Design of Efficient Mirror Adder in Quantum- Dot Cellular Automata

    NASA Astrophysics Data System (ADS)

    Mishra, Prashant Kumar; Chattopadhyay, Manju K.

    2018-03-01

    Lower power consumption is an essential demand for portable multimedia systems using digital signal processing algorithms and architectures. Quantum-dot cellular automata (QCA) is a rising nanotechnology for the development of high-performance, ultra-dense, low-power digital circuits. Several efficient QCA-based binary and decimal arithmetic circuits have been implemented; however, important improvements are still possible. This paper demonstrates mirror adder circuit design in QCA. We present a comparative study of mirror adder cells designed using the conventional CMOS technique and mirror adder cells designed using quantum-dot cellular automata. QCA-based mirror adders are better in terms of area by a factor of three.

  4. Quantum-dot cellular automata: Review and recent experiments (invited)

    NASA Astrophysics Data System (ADS)

    Snider, G. L.; Orlov, A. O.; Amlani, I.; Zuo, X.; Bernstein, G. H.; Lent, C. S.; Merz, J. L.; Porod, W.

    1999-04-01

    An introduction to the operation of quantum-dot cellular automata is presented, along with recent experimental results. Quantum-dot cellular automata (QCA) is a transistorless computation paradigm that addresses the issues of device density and interconnection. The basic building blocks of the QCA architecture, such as AND, OR, and NOT are presented. The experimental device is a four-dot QCA cell with two electrometers. The dots are metal islands, which are coupled by capacitors and tunnel junctions. An improved design of the cell is presented in which all four dots of the cell are coupled by tunnel junctions. The operation of this basic cell is confirmed by the externally controlled polarization change of the cell.

  5. Perceptions of teaching and learning automata theory in a college-level computer science course

    NASA Astrophysics Data System (ADS)

    Weidmann, Phoebe Kay

    This dissertation identifies and describes student and instructor perceptions that contribute to effective teaching and learning of Automata Theory in a competitive college-level Computer Science program. Effective teaching is the ability to create an appropriate learning environment in order to provide effective learning. We define effective learning as the ability of a student to meet instructor set learning objectives, demonstrating this by passing the course, while reporting a good learning experience. We conducted our investigation through a detailed qualitative case study of two sections (118 students) of Automata Theory (CS 341) at The University of Texas at Austin taught by Dr. Lily Quilt. Because Automata Theory has a fixed curriculum in the sense that many curricula and textbooks agree on what Automata Theory contains, differences being depth and amount of material to cover in a single course, a case study would allow for generalizable findings. Automata Theory is especially problematic in a Computer Science curriculum since students are not experienced in abstract thinking before taking this course, fail to understand the relevance of the theory, and prefer classes with more concrete activities such as programming. This creates a special challenge for any instructor of Automata Theory as motivation becomes critical for student learning. Through the use of student surveys, instructor interviews, classroom observation, material and course grade analysis we sought to understand what students perceived, what instructors expected of students, and how those perceptions played out in the classroom in terms of structure and instruction. Our goal was to create suggestions that would lead to a better designed course and thus a higher student success rate in Automata Theory. We created a unique theoretical basis, pedagogical positivism, on which to study college-level courses. Pedagogical positivism states that through examining instructor and student perceptions

  6. Discovering Motifs in Biological Sequences Using the Micron Automata Processor.

    PubMed

    Roy, Indranil; Aluru, Srinivas

    2016-01-01

    Finding approximately conserved sequences, called motifs, across multiple DNA or protein sequences is an important problem in computational biology. In this paper, we consider the (l, d) motif search problem of identifying one or more motifs of length l present in at least q of the n given sequences, with each occurrence differing from the motif in at most d substitutions. The problem is known to be NP-complete, and the largest solved instance reported to date is (26,11). We propose a novel algorithm for the (l,d) motif search problem using streaming execution over a large set of non-deterministic finite automata (NFA). This solution is designed to take advantage of the micron automata processor, a new technology close to deployment that can simultaneously execute multiple NFA in parallel. We demonstrate the capability for solving much larger instances of the (l, d) motif search problem using the resources available within a single automata processor board, by estimating run-times for problem instances (39,18) and (40,17). The paper serves as a useful guide to solving problems using this new accelerator technology.

  7. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information.

    PubMed

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automata (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity, O(n), than a naive comparison of transitions, O(n^2). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft's algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms.
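
    The two-phase outline in the abstract can be illustrated as follows: compute a "backward depth" for every state (shortest distance to an accepting state over reversed transitions), use the pair (acceptance, depth) as the coarse partition, and then refine until transitions respect the partition. This sketch replaces the paper's hash-table refinement with plain Moore-style refinement, so it illustrates the structure of the approach rather than its claimed O(n) behaviour; the example DFA is made up.

      from collections import deque

      def minimize_dfa(states, alphabet, delta, accepting):
          """Coarse partition by backward depth, then Moore-style refinement."""
          # backward depth via BFS from the accepting states on reversed edges
          rev = {s: [] for s in states}
          for s in states:
              for a in alphabet:
                  rev[delta[s][a]].append(s)
          depth = {s: None for s in states}
          q = deque(accepting)
          for s in accepting:
              depth[s] = 0
          while q:
              s = q.popleft()
              for p in rev[s]:
                  if depth[p] is None:
                      depth[p] = depth[s] + 1
                      q.append(p)
          # phase 1: coarse block = (is accepting, backward depth)
          block = {s: (s in accepting, depth[s]) for s in states}
          # phase 2: refine until transitions respect the partition
          changed = True
          while changed:
              signature = {s: (block[s],) + tuple(block[delta[s][a]] for a in alphabet)
                           for s in states}
              changed = any(signature[s] != signature[t]
                            for s in states for t in states if block[s] == block[t])
              block = signature
          return block   # states sharing a block value are equivalent

      # Tiny example: states 0,1,2 over {'a'}; 1 and 2 are equivalent accepting states
      delta = {0: {'a': 1}, 1: {'a': 2}, 2: {'a': 2}}
      print(minimize_dfa([0, 1, 2], ['a'], delta, {1, 2}))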

  8. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information

    PubMed Central

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automata (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity, O(n), than a naive comparison of transitions, O(n^2). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft's algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms. PMID:27806102

  9. Superposition-Based Analysis of First-Order Probabilistic Timed Automata

    NASA Astrophysics Data System (ADS)

    Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph

    This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis is the basis for the construction of a reachability equivalent (to the original FPTA) probabilistic timed automaton to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.

  10. Simulation of interdiffusion and voids growth based on cellular automata

    NASA Astrophysics Data System (ADS)

    Zhang, Fan; Zhang, Boyan; Zhang, Nan; Du, Haishun; Zhang, Xinhong

    2017-02-01

    In the interdiffusion of two solid-state materials, if the diffusion coefficients of the two materials are not the same, the interface between the two materials will shift toward the material with the lower diffusion coefficient. This effect is known as the Kirkendall effect. The Kirkendall effect leads to Kirkendall porosity: the pores act as sinks for vacancies and become voids. In this paper, the movement of the Kirkendall plane during interdiffusion is simulated based on cellular automata. The number of vacancies, the critical radius for void nucleation and the nucleation rate are analysed. Vacancy diffusion, vacancy aggregation and void growth are also simulated based on cellular automata.

  11. Conway's Game of Life is a near-critical metastable state in the multiverse of cellular automata.

    PubMed

    Reia, Sandro M; Kinouchi, Osame

    2014-05-01

    Conway's cellular automaton Game of Life has been conjectured to be a critical (or quasicritical) dynamical system. This criticality is generally seen as a continuous order-disorder transition in cellular automata (CA) rule space. Life's mean-field return map predicts an absorbing vacuum phase (ρ = 0) and an active phase density, with ρ = 0.37, which contrasts with Life's absorbing states in a square lattice, which have a stationary density of ρ(2D) ≈ 0.03. Here, we study and classify mean-field maps for 6144 outer-totalistic CA and compare them with the corresponding behavior found in the square lattice. We show that the single-site mean-field approach gives qualitative (and even quantitative) predictions for most of them. The transition region in rule space seems to correspond to a nonequilibrium discontinuous absorbing phase transition instead of a continuous order-disorder one. We claim that Life is a quasicritical nucleation process where vacuum phase domains invade the alive phase. Therefore, Life is not at the "border of chaos," but thrives on the "border of extinction."
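
    The single-site mean-field return map quoted above is easy to write down for Life: a dead cell becomes alive with the probability of having exactly three live neighbours, and a live cell survives with the probability of having two or three. Iterating this map (sketch below) flows to the active-phase density near 0.37 mentioned in the abstract; the code is the standard mean-field calculation only, not the authors' classification machinery for all 6144 outer-totalistic rules.

      from math import comb

      def life_mean_field(rho):
          """Single-site mean-field return map for Conway's Life (8 neighbours):
          birth on exactly 3 live neighbours, survival on 2 or 3."""
          def p_neighbours(k):
              return comb(8, k) * rho**k * (1 - rho) ** (8 - k)
          birth = (1 - rho) * p_neighbours(3)
          survive = rho * (p_neighbours(2) + p_neighbours(3))
          return birth + survive

      # iterate the map from an initial density of 0.5
      rho = 0.5
      for _ in range(200):
          rho = life_mean_field(rho)
      print(rho)   # settles near the active-phase density quoted in the abstract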

  12. Conway's game of life is a near-critical metastable state in the multiverse of cellular automata

    NASA Astrophysics Data System (ADS)

    Reia, Sandro M.; Kinouchi, Osame

    2014-05-01

    Conway's cellular automaton Game of Life has been conjectured to be a critical (or quasicritical) dynamical system. This criticality is generally seen as a continuous order-disorder transition in cellular automata (CA) rule space. Life's mean-field return map predicts an absorbing vacuum phase (ρ = 0) and an active phase density, with ρ = 0.37, which contrasts with Life's absorbing states in a square lattice, which have a stationary density of ρ(2D) ≈ 0.03. Here, we study and classify mean-field maps for 6144 outer-totalistic CA and compare them with the corresponding behavior found in the square lattice. We show that the single-site mean-field approach gives qualitative (and even quantitative) predictions for most of them. The transition region in rule space seems to correspond to a nonequilibrium discontinuous absorbing phase transition instead of a continuous order-disorder one. We claim that Life is a quasicritical nucleation process where vacuum phase domains invade the alive phase. Therefore, Life is not at the "border of chaos," but thrives on the "border of extinction."

  13. A 2D flood inundation model based on cellular automata approach

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Todini, Ezio

    2010-05-01

    In the past years, the cellular automata approach has been successfully applied in two-dimensional modelling of flood events. When used in experimental applications, models based on such approach have provided good results, comparable to those obtained with more complex 2D models; moreover, CA models have proven significantly faster and easier to apply than most of existing models, and these features make them a valuable tool for flood analysis especially when dealing with large areas. However, to date the real degree of accuracy of such models has not been demonstrated, since they have been mainly used in experimental applications, while very few comparisons with theoretical solutions have been made. Also, the use of an explicit scheme of solution, which is inherent in cellular automata models, forces them to work only with small time steps, thus reducing model computation speed. The present work describes a cellular automata model based on the continuity and diffusive wave equations. Several model versions based on different solution schemes have been realized and tested in a number of numerical cases, both 1D and 2D, comparing the results with theoretical and numerical solutions. In all cases, the model performed well compared to the reference solutions, and proved to be both stable and accurate. Finally, the version providing the best results in terms of stability was tested in a real flood event and compared with different hydraulic models. Again, the cellular automata model provided very good results, both in term of computational speed and reproduction of the simulated event.
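
    A minimal sketch of a diffusive-wave cellular update of the kind described above is given below: each cell exchanges water with its von Neumann neighbours according to the water-surface-elevation gradient and a Manning-type conveyance. The Manning coefficient, the periodic boundaries implied by np.roll and the chosen time step are assumptions for illustration; the paper's actual solution schemes and their stability treatment are not reproduced.

      import numpy as np

      def flood_step(h, z, dx, dt, manning_n=0.05):
          """One explicit update of a minimal diffusive-wave cellular model.
          h: water depth grid, z: bed elevation grid. Water is exchanged between
          von Neumann neighbours according to head differences; np.roll gives
          periodic boundaries, a simplification for this sketch."""
          wse = z + h                                   # water surface elevation
          dh = np.zeros_like(h)
          for axis, shift in ((0, 1), (0, -1), (1, 1), (1, -1)):
              neigh_wse = np.roll(wse, shift, axis=axis)
              neigh_h = np.roll(h, shift, axis=axis)
              grad = (neigh_wse - wse) / dx             # head gradient toward this cell
              depth = np.maximum(np.where(grad > 0, neigh_h, h), 1e-6)   # upstream depth
              q = np.sign(grad) * depth ** (5.0 / 3.0) * np.sqrt(np.abs(grad)) / manning_n
              dh += q * dt / dx                         # inflow (+) or outflow (-) per cell
          return np.maximum(h + dh, 0.0)

      # Toy run: a column of water spreading over flat terrain (small dt for the
      # stability of the explicit scheme)
      h = np.zeros((20, 20)); h[10, 10] = 2.0
      z = np.zeros((20, 20))
      for _ in range(100):
          h = flood_step(h, z, dx=10.0, dt=0.05)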

  14. A comparative analysis of electronic and molecular quantum dot cellular automata

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Umamahesvari, H., E-mail: umamaheswarihema@gmail.com; Ajitha, D., E-mail: ajithavijay1@gmail.com

    This paper presents a comparative analysis of electronic quantum-dot cellular automata (EQCA) and magnetic quantum-dot cellular automata (MQCA). QCA is a computing paradigm that encodes and processes information through the position of individual electrons. To realize highly dense and ultra-low-power devices, various research efforts have been actively carried out to find an alternative way to continue following Moore's law, the so-called "beyond CMOS" technology. There have been several proposals for physically implementing QCA; EQCA and MQCA are the two important QCAs reported so far. This paper provides a comparative study of these two QCAs.

  15. Distribution functions of probabilistic automata

    NASA Technical Reports Server (NTRS)

    Vatan, F.

    2001-01-01

    Each probabilistic automaton M over an alphabet A defines a probability measure Prob_M on the set of all finite and infinite words over A. We can identify a k-letter alphabet A with the set {0, 1,..., k-1} and, hence, we can consider every finite or infinite word w over A as a radix-k expansion of a real number X(w) in the interval [0, 1]. This makes X(w) a random variable, and the distribution function of M is defined as usual: F(x) := Prob_M{ w : X(w) < x }. Utilizing the fixed-point semantics (denotational semantics), extended to probabilistic computations, we investigate the distribution functions of probabilistic automata in detail. Automata with continuous distribution functions are characterized. By a new and much easier method, it is shown that the distribution function F(x) is an analytic function if it is a polynomial. Finally, answering a question posed by D. Knuth and A. Yao, we show that a polynomial distribution function F(x) on [0, 1] can be generated by a probabilistic automaton iff all the roots of F'(x) = 0 in this interval, if any, are rational numbers. For this, we define two dynamical systems on the set of polynomial distributions and study attracting fixed points of random compositions of these two systems.

  16. Quantitative Analysis of Intra Urban Growth Modeling using socio economic agents by combining cellular automata model with agent based model

    NASA Astrophysics Data System (ADS)

    Singh, V. K.; Jha, A. K.; Gupta, K.; Srivastav, S. K.

    2017-12-01

    Recent studies indicate that there is a significant improvement in modeling urban land use dynamics at finer spatial resolutions. Geo-computational models such as cellular automata and agent-based models have given evident proof regarding the quantification of urban growth patterns within the urban boundary. In recent studies, socio-economic factors such as demography, education rate, household density, parcel price of the current year, and distance to roads, schools, hospitals, commercial centers and police stations are considered to be the major factors influencing the Land Use Land Cover (LULC) pattern of a city. These factors have a unidirectional relationship to the land use pattern, which makes it difficult to analyze the spatial aspects of model results both quantitatively and qualitatively. In this study, a cellular automata model is combined with a generic agent-based model to evaluate the impact of socio-economic factors on the land use pattern. For this purpose, Dehradun, an Indian city, is selected as a case study. Socio-economic factors were collected from field surveys, the Census of India and the Directorate of Economic Census, Uttarakhand, India. A 3×3 simulation window is used to consider the impact on LULC. The cellular automata model results are examined to identify hot-spot areas within the urban area, while the agent-based model uses a logistic regression approach to identify the correlation of each factor with LULC and to classify the available area into low-density, medium-density and high-density residential or commercial areas. In the modeling phase, transition rules, neighborhood effects and cell change factors are used to improve the representation of the built-up classes. A significant improvement is observed in the built-up classes, from 84% to 89%. Moreover, after incorporating the agent-based model into the cellular automata model, the accuracy improved from 89% to 94% in three urban classes, i.e. low density, medium density and commercial.
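
    As a rough illustration of how a logistic (regression-style) suitability score can be combined with a 3×3 neighborhood effect in a hybrid CA/agent transition rule, consider the sketch below; the factor names, weights, bias and the fixed neighborhood weight are hypothetical and do not reproduce the calibrated model of the study.

```python
import numpy as np

def transition_probability(built, factors, weights, bias=-4.0):
    """Toy hybrid score for a cell becoming built-up: a logistic function of
    socio-economic driver layers plus the share of built-up cells in the 3x3
    Moore neighborhood.  Only illustrates the kind of rule the study calibrates."""
    # fraction of built neighbors in the 3x3 window (excluding the cell itself)
    neigh = sum(np.roll(np.roll(built, i, axis=0), j, axis=1)
                for i in (-1, 0, 1) for j in (-1, 0, 1)) - built
    score = bias + 2.0 * (neigh / 8.0)          # neighborhood effect (weight assumed)
    for name, w in weights.items():
        score += w * factors[name]              # e.g. distance to road, parcel price, ...
    return 1.0 / (1.0 + np.exp(-score))         # logistic link, as in a regression step
```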

  17. Larger than Life's Extremes: Rigorous Results for Simplified Rules and Speculation on the Phase Boundaries

    NASA Astrophysics Data System (ADS)

    Evans, Kellie Michele

    Larger than Life (LtL) is a four-parameter family of two-dimensional cellular automata that generalizes John Conway's Game of Life (Life) to large neighborhoods and general birth and survival thresholds. LtL was proposed by David Griffeath in the early 1990s to explore whether Life might be a clue to a critical phase point in the threshold-range scaling limit. The LtL family of rules includes Life as well as a rich set of two-dimensional rules, some of which exhibit dynamics vastly different from Life. In this chapter we present rigorous results and conjectures about the ergodic classifications of several sets of "simplified" LtL rules, each of which has a property that makes the rule easier to analyze. For example, these include symmetric rules such as the threshold voter automaton and the anti-voter automaton, monotone rules such as the threshold growth models, and others. We also provide qualitative results and speculation about LtL rules on various phase boundaries and summarize results and open questions about our favorite "Life-like" LtL rules.

  18. Canalization and Control in Automata Networks: Body Segmentation in Drosophila melanogaster

    PubMed Central

    Marques-Pita, Manuel; Rocha, Luis M.

    2013-01-01

    We present schema redescription as a methodology to characterize canalization in automata networks used to model biochemical regulation and signalling. In our formulation, canalization becomes synonymous with redundancy present in the logic of automata. This results in straightforward measures to quantify canalization in an automaton (micro-level), which is in turn integrated into a highly scalable framework to characterize the collective dynamics of large-scale automata networks (macro-level). This way, our approach provides a method to link micro- to macro-level dynamics – a crux of complexity. Several new results ensue from this methodology: uncovering of dynamical modularity (modules in the dynamics rather than in the structure of networks), identification of minimal conditions and critical nodes to control the convergence to attractors, simulation of dynamical behaviour from incomplete information about initial conditions, and measures of macro-level canalization and robustness to perturbations. We exemplify our methodology with a well-known model of the intra- and inter-cellular genetic regulation of body segmentation in Drosophila melanogaster. We use this model to show that our analysis does not contradict any previous findings. But we also obtain new knowledge about its behaviour: a better understanding of the size of its wild-type attractor basin (larger than previously thought), the identification of novel minimal conditions and critical nodes that control wild-type behaviour, and the resilience of these to stochastic interventions. Our methodology is applicable to any complex network that can be modelled using automata, but we focus on biochemical regulation and signalling, towards a better understanding of the (decentralized) control that orchestrates cellular activity – with the ultimate goal of explaining how cells and tissues ‘compute’. PMID:23520449

  19. Canalization and control in automata networks: body segmentation in Drosophila melanogaster.

    PubMed

    Marques-Pita, Manuel; Rocha, Luis M

    2013-01-01

    We present schema redescription as a methodology to characterize canalization in automata networks used to model biochemical regulation and signalling. In our formulation, canalization becomes synonymous with redundancy present in the logic of automata. This results in straightforward measures to quantify canalization in an automaton (micro-level), which is in turn integrated into a highly scalable framework to characterize the collective dynamics of large-scale automata networks (macro-level). This way, our approach provides a method to link micro- to macro-level dynamics--a crux of complexity. Several new results ensue from this methodology: uncovering of dynamical modularity (modules in the dynamics rather than in the structure of networks), identification of minimal conditions and critical nodes to control the convergence to attractors, simulation of dynamical behaviour from incomplete information about initial conditions, and measures of macro-level canalization and robustness to perturbations. We exemplify our methodology with a well-known model of the intra- and inter-cellular genetic regulation of body segmentation in Drosophila melanogaster. We use this model to show that our analysis does not contradict any previous findings. But we also obtain new knowledge about its behaviour: a better understanding of the size of its wild-type attractor basin (larger than previously thought), the identification of novel minimal conditions and critical nodes that control wild-type behaviour, and the resilience of these to stochastic interventions. Our methodology is applicable to any complex network that can be modelled using automata, but we focus on biochemical regulation and signalling, towards a better understanding of the (decentralized) control that orchestrates cellular activity--with the ultimate goal of explaining how cells and tissues 'compute'.

  20. Cellular automata and its applications in protein bioinformatics.

    PubMed

    Xiao, Xuan; Wang, Pu; Chou, Kuo-Chen

    2011-09-01

    With the explosion of protein sequences generated in the postgenomic era, it is highly desirable to develop high-throughput tools for rapidly and reliably identifying various attributes of uncharacterized proteins based on their sequence information alone. The knowledge thus obtained can help us make timely use of these newly found protein sequences for both basic research and drug discovery. Many bioinformatics tools have been developed by means of machine learning methods. This review is focused on the applications of a new kind of science (cellular automata) in protein bioinformatics. A cellular automaton (CA) is an open, flexible and discrete dynamic model that holds enormous potential for modeling complex systems, in spite of the simplicity of the model itself. Researchers, scientists and practitioners from different fields have utilized cellular automata for visualizing protein sequences, investigating their evolution processes, and predicting their various attributes. Owing to its impressive power, intuitiveness and relative simplicity, the CA approach has great potential for use as a tool for bioinformatics.

  1. Creating Clinical Fuzzy Automata with Fuzzy Arden Syntax.

    PubMed

    de Bruin, Jeroen S; Steltzer, Heinz; Rappelsberger, Andrea; Adlassnig, Klaus-Peter

    2017-01-01

    Formal constructs for fuzzy sets and fuzzy logic are incorporated into Arden Syntax version 2.9 (Fuzzy Arden Syntax). With fuzzy sets, the relationships between measured or observed data and linguistic terms are expressed as degrees of compatibility that model the unsharpness of the boundaries of linguistic terms. Propositional uncertainty due to incomplete knowledge of relationships between clinical linguistic concepts is modeled with fuzzy logic. Fuzzy Arden Syntax also supports the construction of fuzzy state monitors. The latter are defined as monitors that employ fuzzy automata to observe gradual transitions between different stages of disease. As a use case, we re-implemented FuzzyARDS, a previously published clinical monitoring system for patients suffering from acute respiratory distress syndrome (ARDS). Using the re-implementation as an example, we show how key concepts of fuzzy automata, i.e., fuzzy states and parallel fuzzy state transitions, can be implemented in Fuzzy Arden Syntax. The results showed that fuzzy state monitors can be implemented in a straightforward manner.

  2. Strategies for using cellular automata to locate constrained layer damping on vibrating structures

    NASA Astrophysics Data System (ADS)

    Chia, C. M.; Rongong, J. A.; Worden, K.

    2009-01-01

    It is often hard to optimise constrained layer damping (CLD) for structures more complicated than simple beams and plates, as its performance depends on its location, the shape of the applied patch, the mode shapes of the structure and the material properties. This paper considers the use of cellular automata (CA) in conjunction with finite element analysis to obtain an efficient coverage of CLD on structures. The effectiveness of several different sets of local rules governing the CA is compared for a structure with known optimum coverage—namely a plate. The algorithm that most closely replicates known optimal configurations is considered the most successful. This algorithm is then used to generate an efficient CLD treatment that targets several modes of a curved composite panel. To validate the modelling approaches used, results are also presented of a comparison between theoretical and experimentally obtained modal properties of the damped curved panel.

  3. Cellular automata in photonic cavity arrays.

    PubMed

    Li, Jing; Liew, T C H

    2016-10-31

    We propose theoretically a photonic Turing machine based on cellular automata in arrays of nonlinear cavities coupled with artificial gauge fields. The state of the system is recorded making use of the bistability of driven cavities, in which losses are fully compensated by an external continuous drive. The sequential update of the automaton layers is achieved automatically, by the local switching of bistable states, without requiring any additional synchronization or temporal control.

  4. Universal map for cellular automata

    NASA Astrophysics Data System (ADS)

    García-Morales, V.

    2012-08-01

    A universal map is derived for all deterministic 1D cellular automata (CAs) containing no freely adjustable parameters and valid for any alphabet size and any neighborhood range (including non-symmetrical neighborhoods). The map can be extended to an arbitrary number of dimensions and topologies and to arbitrary order in time. Specific CA maps for the famous Conway's Game of Life and Wolfram's 256 elementary CAs are given. An induction method for CAs, based on the universal map, allows mathematical expressions for the orbits of a wide variety of elementary CAs to be systematically derived.
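
    For concreteness, the update of a Wolfram elementary CA (the 256 rules mentioned above) can be written as a plain lookup over the neighborhood code. The sketch below is not the closed-form universal map derived in the paper, just the standard rule-table evaluation that such a map reproduces.

```python
import numpy as np

def eca_step(cells, rule):
    """One synchronous update of a 1D elementary cellular automaton with
    periodic boundaries, via the Wolfram rule table."""
    left, right = np.roll(cells, 1), np.roll(cells, -1)
    index = 4 * left + 2 * cells + right          # neighbourhood code 0..7
    table = (rule >> np.arange(8)) & 1            # bits of the Wolfram rule number
    return table[index]

state = np.zeros(101, dtype=int)
state[50] = 1                                     # single seed in the middle
for _ in range(5):
    state = eca_step(state, rule=90)              # rule 90: XOR of the two neighbours
```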

  5. Cellular Automata Ideas in Digital Circuits and Switching Theory.

    ERIC Educational Resources Information Center

    Siwak, Pawel P.

    1985-01-01

    Presents two examples which illustrate the usefulness of ideas from cellular automata. First, Lee's algorithm is recalled and its cellular nature shown. Then a problem from digraphs, which has arisen from analyzing predecessor configurations in the famous Conway's "game of life," is considered. (Author/JN)

  6. Cellular-automata-based learning network for pattern recognition

    NASA Astrophysics Data System (ADS)

    Tzionas, Panagiotis G.; Tsalides, Phillippos G.; Thanailakis, Adonios

    1991-11-01

    Most classification techniques either adopt an approach based directly on the statistical characteristics of the pattern classes involved, or they transform the patterns into a feature space and try to separate the point clusters in this space. An alternative approach based on memory networks has been presented, its novelty being that it can be implemented in parallel and it utilizes direct features of the patterns rather than statistical characteristics. This study presents a new approach for pattern classification using pseudo 2-D binary cellular automata (CA). This approach resembles the memory network classifier in the sense that it is based on an adaptive knowledge base formed during a training phase, and also in the fact that both methods utilize pattern features that are directly available. The main advantage of this approach is that the sensitivity of the pattern classifier can be controlled. The proposed pattern classifier has been designed using 1.5-micrometer design rules for an N-well CMOS process. Layout has been achieved using SOLO 1400. Binary pseudo 2-D hybrid additive CA (HACA) is described in the second section of this paper. The third section describes the operation of the pattern classifier and the fourth section presents some possible applications. The VLSI implementation of the pattern classifier is presented in the fifth section and, finally, the sixth section draws conclusions from the results obtained.

  7. Simulation of the 1992 Tessina landslide by a cellular automata model and future hazard scenarios

    NASA Astrophysics Data System (ADS)

    Avolio, MV; Di Gregorio, Salvatore; Mantovani, Franco; Pasuto, Alessandro; Rongo, Rocco; Silvano, Sandro; Spataro, William

    Cellular Automata are a powerful tool for modelling natural and artificial systems, which can be described in terms of local interactions of their constituent parts. Some types of landslides, such as debris/mud flows, match these requirements. The 1992 Tessina landslide has characteristics (slow mud flows) which make it appropriate for modelling by means of Cellular Automata, except for the initial phase of detachment, which is caused by a rotational movement that has no effect on the mud flow path. This paper presents the Cellular Automata approach for modelling slow mud/debris flows, the results of simulation of the 1992 Tessina landslide and future hazard scenarios based on the volumes of masses that could be mobilised in the future. They were obtained by adapting the Cellular Automata Model called SCIDDICA, which has been validated for very fast landslides. SCIDDICA was applied by modifying the general model to the peculiarities of the Tessina landslide. The simulations obtained by this initial model were satisfactory for forecasting the surface covered by mud. Calibration of the model, which was obtained from simulation of the 1992 event, was used for forecasting flow expansion during possible future reactivation. For this purpose two simulations concerning the collapse of about 1 million m³ of material were tested. In one of these, the presence of a containment wall built in 1992 for the protection of the Tarcogna hamlet was inserted. The results obtained identified the conditions of high risk affecting the villages of Funes and Lamosano and show that this Cellular Automata approach can have a wide range of applications for different types of mud/debris flows.

  8. Quantifying a cellular automata simulation of electric vehicles

    NASA Astrophysics Data System (ADS)

    Hill, Graeme; Bell, Margaret; Blythe, Phil

    2014-12-01

    Within this work the Nagel-Schreckenberg (NS) cellular automaton is used to simulate a basic cyclic road network. Results from SwitchEV, a real-world electric vehicle trial which has collected more than two years of detailed electric vehicle data, are used to quantify the results of the NS automaton, demonstrating power consumption behavior similar to that observed in the experimental results. In particular, the efficiency of the electric vehicles reduces as the vehicle density increases, due in part to the reduced efficiency of EVs at low speeds, but also due to the energy consumption inherent in changing speeds. Further work shows the results of introducing a spatially restricted speed limit. In general it can be seen that induced congestion from spatially transient events propagates back through the road network and alters the energy and efficiency profile of the simulated vehicles, both before and after the speed restriction. Vehicles upstream from the restriction show a reduced energy usage and an increased efficiency, and vehicles downstream show an initial large increase in energy usage as they accelerate away from the speed restriction.
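
    The four NS update rules (acceleration, collision avoidance, random slow-down, movement) are standard; a minimal sketch on a circular single-lane road follows, with v_max and the slow-down probability as illustrative parameters and without the per-vehicle energy bookkeeping the study adds on top.

```python
import random

def ns_step(positions, velocities, road_length, v_max=5, p_slow=0.3):
    """One Nagel-Schreckenberg update on a circular single-lane road."""
    order = sorted(range(len(positions)), key=positions.__getitem__)
    new_v = list(velocities)
    for k, i in enumerate(order):
        ahead = order[(k + 1) % len(order)]
        gap = (positions[ahead] - positions[i] - 1) % road_length
        v = min(velocities[i] + 1, v_max)          # 1. accelerate
        v = min(v, gap)                            # 2. brake to avoid collision
        if v > 0 and random.random() < p_slow:     # 3. random slow-down
            v -= 1
        new_v[i] = v
    for i in order:                                # 4. advance all vehicles
        positions[i] = (positions[i] + new_v[i]) % road_length
    return positions, new_v
```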

  9. Cellular automata model for urban road traffic flow considering pedestrian crossing street

    NASA Astrophysics Data System (ADS)

    Zhao, Han-Tao; Yang, Shuo; Chen, Xiao-Xu

    2016-11-01

    In order to analyze the effect of pedestrians crossing the street on vehicle flows, we investigated the traffic characteristics of vehicles and pedestrians. Based on that, rules for lane changing, acceleration, deceleration, randomization and update are modified. Then we established two urban two-lane cellular automata models of traffic flow, one for sections with a non-signalized crosswalk and the other for uncontrolled sections where pedestrians cross the street at random. MATLAB is used for numerical simulation of the different traffic conditions; meanwhile, space-time diagrams and relational graphs of traffic flow parameters are generated and then comparatively analyzed. Simulation results indicate that when vehicle density is lower than around 25 vehs/(km lane), pedestrians have a modest impact on traffic flow, whereas when vehicle density is higher than about 60 vehs/(km lane), traffic speed and volume decrease significantly, especially on sections with a non-signal-controlled crosswalk. The results illustrate that the proposed models reproduce the characteristics of traffic flow in situations with pedestrians crossing and can provide a practical reference for urban traffic management.

  10. Reasoning about real-time systems with temporal interval logic constraints on multi-state automata

    NASA Technical Reports Server (NTRS)

    Gabrielian, Armen

    1991-01-01

    Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to model formally a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises instead of merely depending on what is true in a system.

  11. Hyper-heuristic Evolution of Dispatching Rules: A Comparison of Rule Representations.

    PubMed

    Branke, Jürgen; Hildebrandt, Torsten; Scholz-Reiter, Bernd

    2015-01-01

    Dispatching rules are frequently used for real-time, online scheduling in complex manufacturing systems. Design of such rules is usually done by experts in a time consuming trial-and-error process. Recently, evolutionary algorithms have been proposed to automate the design process. There are several possibilities to represent rules for this hyper-heuristic search. Because the representation determines the search neighborhood and the complexity of the rules that can be evolved, a suitable choice of representation is key for a successful evolutionary algorithm. In this paper we empirically compare three different representations, both numeric and symbolic, for automated rule design: A linear combination of attributes, a representation based on artificial neural networks, and a tree representation. Using appropriate evolutionary algorithms (CMA-ES for the neural network and linear representations, genetic programming for the tree representation), we empirically investigate the suitability of each representation in a dynamic stochastic job shop scenario. We also examine the robustness of the evolved dispatching rules against variations in the underlying job shop scenario, and visualize what the rules do, in order to get an intuitive understanding of their inner workings. Results indicate that the tree representation using an improved version of genetic programming gives the best results if many candidate rules can be evaluated, closely followed by the neural network representation that already leads to good results for small to moderate computational budgets. The linear representation is found to be competitive only for extremely small computational budgets.

  12. Synchronization, TIGoRS, and Information Flow in Complex Systems: Dispositional Cellular Automata.

    PubMed

    Sulis, William H

    2016-04-01

    Synchronization has a long history in physics where it refers to the phase matching of two identical oscillators. This notion has been extensively studied in physics as well as in biology, where it has been applied to such widely varying phenomena as the flashing of fireflies and firing of neurons in the brain. Human behavior, however, may be recurrent but it is not oscillatory even though many physiological systems do exhibit oscillatory tendencies. Moreover, much of human behaviour is collaborative and cooperative, where the individual behaviours may be distinct yet contemporaneous (if not simultaneous) and taken collectively express some functionality. In the context of behaviour, the important aspect is the repeated co-occurrence in time of behaviours that facilitate the propagation of information or of functionality, regardless of whether or not these behaviours are similar or identical. An example of this weaker notion of synchronization is transient induced global response synchronization (TIGoRS). Previous work has shown that TIGoRS is a ubiquitous phenomenon among complex systems, enabling them to stably parse environmental transients into salient units to which they stably respond. This leads to the notion of Sulis machines, which emergently generate a primitive linguistic structure through their dynamics. This article reviews the notion of TIGoRS and its expression in several complex systems models including tempered neural networks, driven cellular automata and cocktail party automata. The emergent linguistics of Sulis machines are discussed. A new class of complex systems model, the dispositional cellular automaton is introduced. A new metric for TIGoRS, the excess synchronization, is introduced and applied to the study of TIGoRS in dispositional cellular automata. It is shown that these automata exhibit a nonlinear synchronization response to certain perturbing transients.

  13. A Programmable Cellular-Automata Polarized Dirac Vacuum

    NASA Astrophysics Data System (ADS)

    Osoroma, Drahcir S.

    2013-09-01

    We explore properties of a `Least Cosmological Unit' (LCU) as an inherent spacetime raster tiling or tessellating the unique backcloth of Holographic Anthropic Multiverse (HAM) cosmology as an array of programmable cellular automata. The HAM vacuum is a scale-invariant HD extension of a covariant polarized Dirac vacuum with `bumps' and `holes' typically described by extended electromagnetic theory corresponding to an Einstein energy-dependent spacetime metric admitting a periodic photon mass. The new cosmology incorporates a unique form of M-Theoretic Calabi-Yau-Poincaré Dodecahedral-AdS5-dS5 space (PDS) with mirror symmetry best described by an HD extension of Cramer's Transactional Interpretation when integrated also with an HD extension of the de Broglie-Bohm-Vigier causal interpretation of quantum theory. We incorporate a unique form of large-scale additional dimensionality (LSXD) bearing some similarity to that conceived by Randall and Sundrum; and extend the fundamental basis of our model to the Unified Field, UF. A Sagnac Effect rf-pulsed incursive resonance hierarchy is utilized to manipulate and ballistically program the geometric-topological properties of this putative LSXD space-spacetime network. The model is empirically testable; and it is proposed that a variety of new technologies will arise from ballistic programming of tessellated LCU vacuum cellular automata.

  14. Design of Improved Arithmetic Logic Unit in Quantum-Dot Cellular Automata

    NASA Astrophysics Data System (ADS)

    Heikalabad, Saeed Rasouli; Gadim, Mahya Rahimpour

    2018-06-01

    Quantum-dot cellular automata (QCA) have been proposed as a replacement technology to overcome the limitations of CMOS. An arithmetic logic unit (ALU) is a basic structure of any computing device. In this paper, the design of an improved single-bit arithmetic logic unit in quantum-dot cellular automata is presented. The proposed ALU structure provides AND, OR, XOR and ADD operations. A unique 2:1 multiplexer, an ultra-efficient two-input XOR and a low-complexity full adder are used in the proposed structure. Also, an extended design of this structure is provided for a two-bit ALU. The proposed ALU structure is simulated with QCADesigner and the simulation results are evaluated. Evaluation results show that the proposed design has the best performance in terms of area, complexity and delay compared to previous designs.

  15. Design of Improved Arithmetic Logic Unit in Quantum-Dot Cellular Automata

    NASA Astrophysics Data System (ADS)

    Heikalabad, Saeed Rasouli; Gadim, Mahya Rahimpour

    2018-03-01

    Quantum-dot cellular automata (QCA) have been proposed as a replacement technology to overcome the limitations of CMOS. An arithmetic logic unit (ALU) is a basic structure of any computing device. In this paper, the design of an improved single-bit arithmetic logic unit in quantum-dot cellular automata is presented. The proposed ALU structure provides AND, OR, XOR and ADD operations. A unique 2:1 multiplexer, an ultra-efficient two-input XOR and a low-complexity full adder are used in the proposed structure. Also, an extended design of this structure is provided for a two-bit ALU. The proposed ALU structure is simulated with QCADesigner and the simulation results are evaluated. Evaluation results show that the proposed design has the best performance in terms of area, complexity and delay compared to previous designs.

  16. Probabilistic arithmetic automata and their applications.

    PubMed

    Marschall, Tobias; Herms, Inke; Kaltenbach, Hans-Michael; Rahmann, Sven

    2012-01-01

    We present a comprehensive review on probabilistic arithmetic automata (PAAs), a general model to describe chains of operations whose operands depend on chance, along with two algorithms to numerically compute the distribution of the results of such probabilistic calculations. PAAs provide a unifying framework to approach many problems arising in computational biology and elsewhere. We present five different applications, namely 1) pattern matching statistics on random texts, including the computation of the distribution of occurrence counts, waiting times, and clump sizes under hidden Markov background models; 2) exact analysis of window-based pattern matching algorithms; 3) sensitivity of filtration seeds used to detect candidate sequence alignments; 4) length and mass statistics of peptide fragments resulting from enzymatic cleavage reactions; and 5) read length statistics of 454 and IonTorrent sequencing reads. The diversity of these applications indicates the flexibility and unifying character of the presented framework. While the construction of a PAA depends on the particular application, we single out a frequently applicable construction method: We introduce deterministic arithmetic automata (DAAs) to model deterministic calculations on sequences, and demonstrate how to construct a PAA from a given DAA and a finite-memory random text model. This procedure is used for all five discussed applications and greatly simplifies the construction of PAAs. Implementations are available as part of the MoSDi package. Its application programming interface facilitates the rapid development of new applications based on the PAA framework.

  17. Regulation Effects by Programmed Molecules for Transcription-Based Diagnostic Automata towards Therapeutic Use

    NASA Astrophysics Data System (ADS)

    Hirabayashi, Miki; Ohashi, Hirotada; Kubo, Tai

    We present an experimental analysis of the controllability of our transcription-based diagnostic biomolecular automata by programmed molecules. Focusing on noninvasive transcriptome diagnosis using salivary mRNAs, we previously proposed a novel concept for a diagnostic device based on DNA computation. This system consists of a main computational element which has a stem-shaped promoter region and a pseudo-loop-shaped read-only memory region for transcription regulation through the conformation change caused by the recognition of disease-related biomarkers. We utilize the transcription of a malachite green aptamer sequence, triggered by target recognition, to observe detection. Unlike the previously proposed systems based on digestion by a restriction enzyme, this algorithm makes it possible to release RNA-aptamer drugs repeatedly for in vivo use; however, the controllability of aptamer release was not sufficient at that stage. In this paper, we verify the regulation effect of programmed molecules on aptamer transcription under basic conditions towards the development of therapeutic automata. These results bring us one step closer to the realization of new intelligent diagnostic and therapeutic automata based on molecular circuits.

  18. Sampling from complex networks using distributed learning automata

    NASA Astrophysics Data System (ADS)

    Rezvanian, Alireza; Rahmati, Mohammad; Meybodi, Mohammad Reza

    2014-02-01

    A complex network provides a framework for modeling many real-world phenomena in the form of a network. In general, a complex network is considered as a graph of real-world phenomena such as biological networks, ecological networks, technological networks, information networks and particularly social networks. Recently, major studies have been reported on the characterization of social networks, due to a growing trend in the analysis of online social networks as dynamic, complex, large-scale graphs. Because of the large scale of real networks and the limited access to them, the network model is characterized using an appropriate part of the network obtained by sampling approaches. In this paper, a new sampling algorithm based on distributed learning automata is proposed for sampling from complex networks. In the proposed algorithm, a set of distributed learning automata cooperate with each other in order to take appropriate samples from the given network. To investigate the performance of the proposed algorithm, several simulation experiments are conducted on well-known complex networks. Experimental results are compared with several sampling methods in terms of different measures. The experimental results demonstrate the superiority of the proposed algorithm over the others.

  19. On the derivation of approximations to cellular automata models and the assumption of independence.

    PubMed

    Davies, K J; Green, J E F; Bean, N G; Binder, B J; Ross, J V

    2014-07-01

    Cellular automata are discrete agent-based models, generally used in cell-based applications. There is much interest in obtaining continuum models that describe the mean behaviour of the agents in these models. Previously, continuum models have been derived for agents undergoing motility and proliferation processes; however, these models only hold under restricted conditions. In order to narrow down the reason for these restrictions, we explore three possible sources of error in deriving the model. These sources are the choice of limiting arguments, the use of a discrete-time model as opposed to a continuous-time model, and the assumption of independence between the states of sites. We present a rigorous analysis in order to gain a greater understanding of the significance of these three issues. By finding a limiting regime that accurately approximates the conservation equation for the cellular automata, we are able to conclude that the inaccuracy between our approximation and the cellular automata is based entirely on the assumption of independence. Copyright © 2014 Elsevier Inc. All rights reserved.
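
    To make the independence assumption concrete, the sketch below contrasts a toy one-dimensional proliferation CA (daughters placed on a randomly chosen neighbouring site, with attempts onto occupied sites failing) against its mean-field counterpart dC/dt ≈ p·C·(1−C). The model, parameters and update order are illustrative assumptions, not the processes analysed in the paper; the discrepancy between the two returned densities reflects correlations that the independence assumption ignores.

```python
import random

def ca_vs_meanfield(sites=10_000, steps=200, p=0.01, c0=0.05, seed=0):
    """Return (CA occupancy, mean-field occupancy) after `steps` updates."""
    rng = random.Random(seed)
    occ = [rng.random() < c0 for _ in range(sites)]
    c_mf = c0
    for _ in range(steps):
        for i in range(sites):
            if occ[i] and rng.random() < p:
                # daughter goes to one random neighbour; no effect if occupied
                occ[(i + rng.choice((-1, 1))) % sites] = True
        c_mf += p * c_mf * (1.0 - c_mf)        # logistic mean-field update
    return sum(occ) / sites, c_mf
```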

  20. Stochastic computing with biomolecular automata

    PubMed Central

    Adar, Rivka; Benenson, Yaakov; Linshiz, Gregory; Rosner, Amit; Tishby, Naftali; Shapiro, Ehud

    2004-01-01

    Stochastic computing has a broad range of applications, yet electronic computers realize its basic step, stochastic choice between alternative computation paths, in a cumbersome way. Biomolecular computers use a different computational paradigm and hence afford novel designs. We constructed a stochastic molecular automaton in which stochastic choice is realized by means of competition between alternative biochemical pathways, and choice probabilities are programmed by the relative molar concentrations of the software molecules coding for the alternatives. Programmable and autonomous stochastic molecular automata have been shown to perform direct analysis of disease-related molecular indicators in vitro and may have the potential to provide in situ medical diagnosis and cure. PMID:15215499

  1. Mining Distance Based Outliers in Near Linear Time with Randomization and a Simple Pruning Rule

    NASA Technical Reports Server (NTRS)

    Bay, Stephen D.; Schwabacher, Mark

    2003-01-01

    Defining outliers by their distance to neighboring examples is a popular approach to finding unusual examples in a data set. Recently, much work has been conducted with the goal of finding fast algorithms for this task. We show that a simple nested loop algorithm that in the worst case is quadratic can give near linear time performance when the data is in random order and a simple pruning rule is used. We test our algorithm on real high-dimensional data sets with millions of examples and show that the near linear scaling holds over several orders of magnitude. Our average case analysis suggests that much of the efficiency is because the time to process non-outliers, which are the majority of examples, does not depend on the size of the data set.
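
    The algorithm's core, a nested loop over examples with a cutoff-based pruning rule applied to randomly ordered data, can be sketched as follows. The outlier score used here (distance to the k-th nearest neighbour), the heap bookkeeping and the parameter defaults are illustrative choices, not the authors' exact implementation.

```python
import heapq
import math
import random

def distance_outliers(data, k=5, n_outliers=10):
    """Nested-loop distance-based outlier search with a simple pruning rule:
    stop scoring a point once its k-NN distance (so far) drops below the score
    of the weakest outlier currently kept."""
    random.shuffle(data)             # random order is what makes pruning effective
    top = []                         # min-heap of (score, point) for current outliers
    cutoff = 0.0
    for x in data:
        neigh = []                   # max-heap (negated) of the k nearest distances so far
        pruned = False
        for y in data:
            if y is x:
                continue
            d = math.dist(x, y)
            if len(neigh) < k:
                heapq.heappush(neigh, -d)
            elif d < -neigh[0]:
                heapq.heapreplace(neigh, -d)
            if len(neigh) == k and -neigh[0] < cutoff:
                pruned = True        # cannot be a top outlier any more
                break
        if not pruned:
            score = -neigh[0]        # distance to the k-th nearest neighbour
            heapq.heappush(top, (score, x))
            if len(top) > n_outliers:
                heapq.heappop(top)
            if len(top) == n_outliers:
                cutoff = top[0][0]
    return sorted(top, reverse=True)
```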

  2. Query Monitoring and Analysis for Database Privacy - A Security Automata Model Approach.

    PubMed

    Kumar, Anand; Ligatti, Jay; Tu, Yi-Cheng

    2015-11-01

    Privacy and usage restriction issues are important when valuable data are exchanged or acquired by different organizations. Standard access control mechanisms either restrict or completely grant access to valuable data. On the other hand, data obfuscation limits the overall usability and may result in loss of total value. There are no standard policy enforcement mechanisms for data acquired through mutual and copyright agreements. In practice, many different types of policies can be enforced in protecting data privacy. Hence there is the need for a unified framework that encapsulates multiple suites of policies to protect the data. We present our vision of an architecture named security automata model (SAM) to enforce privacy-preserving policies and usage restrictions. SAM analyzes the input queries and their outputs to enforce various policies, liberating data owners from the burden of monitoring data access. SAM allows administrators to specify various policies and enforces them to monitor queries and control the data access. Our goal is to address the problems of data usage control and protection through privacy policies that can be defined, enforced, and integrated with the existing access control mechanisms using SAM. In this paper, we lay out the theoretical foundation of SAM, which is based on an automata named Mandatory Result Automata. We also discuss the major challenges of implementing SAM in a real-world database environment as well as ideas to meet such challenges.

  3. A cardiac electrical activity model based on a cellular automata system in comparison with neural network model.

    PubMed

    Khan, Muhammad Sadiq Ali; Yousuf, Sidrah

    2016-03-01

    Cardiac electrical activity is distributed through the three dimensions of cardiac tissue (myocardium) and evolves over time. Indicators of heart disease can occur randomly at any time of day, so heart rate, conduction and each electrical event during the cardiac cycle should be monitored non-invasively for the assessment of "action potential" (regular) and "arrhythmia" (irregular) rhythms. Many heart diseases can readily be examined through automata models such as cellular automata. This paper deals with the different states of cardiac rhythm using cellular automata, in comparison with a neural network model, and also describes the fast and highly effective stimulation for the contraction of the cardiac muscles of the atria resulting from the genesis of an electrical spark or wave. The formulated model, named "States of Automaton Proposed Model for CEA (Cardiac Electrical Activity)" and built using cellular automata methodology, captures the three conduction states of cardiac tissue: (i) resting (relaxed and excitable), (ii) ARP (excited but in the absolutely refractory phase, i.e. excited but not able to excite neighboring cells) and (iii) RRP (excited but in the relatively refractory phase, i.e. excited and able to excite neighboring cells). The results indicate efficient modeling of the action potential during the pumping of blood in the cardiac cycle, with a low computational burden.

  4. Social interactions of eating behaviour among high school students: a cellular automata approach

    PubMed Central

    2012-01-01

    Background. Overweight and obesity in children and adolescents is a global epidemic posing problems for both developed and developing nations. The prevalence is particularly alarming in developed nations, such as the United States, where approximately one in three school-aged adolescents (ages 12-19) is overweight or obese. Evidence suggests that weight gain in school-aged adolescents is related to energy imbalance exacerbated by the negative aspects of the school food environment, such as the presence of unhealthy food choices. While a well-established connection exists between the food environment and weight gain, presently there is a lack of studies investigating the impact of the social environment and the associated interactions of school-age adolescents. This paper uses a mathematical modelling approach to explore how social interactions among high school adolescents can affect their eating behaviour and food choice. Methods. In this paper we use a Cellular Automata (CA) modelling approach to explore how social interactions among school-age adolescents can affect eating behaviour and food choice. Our CA model integrates social influences and transition rules to simulate the way individuals would interact in a social community (e.g., a school cafeteria). To replicate these social interactions, we chose the Moore neighbourhood, which allows all neighbours (eight cells in a two-dimensional square lattice) to influence the central cell. Our assumption is that individuals belong to one of four states: Bring Healthy, Bring Unhealthy, Purchase Healthy, and Purchase Unhealthy, and will influence each other according to parameter settings and transition rules. Simulations were run to explore how the different states interact under varying parameter settings. Results. This study, through simulations, illustrates that students will change their eating behaviour from unhealthy to healthy as a result of positive social and environmental influences. In general, there is one common characteristic of
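
    A toy version of such a Moore-neighbourhood influence rule is sketched below; the four state labels follow the abstract, while the conformity probability and the majority-adoption rule are hypothetical stand-ins for the paper's parameterised transition rules.

```python
import random

STATES = ("Bring Healthy", "Bring Unhealthy", "Purchase Healthy", "Purchase Unhealthy")

def moore_neighbours(grid, r, c):
    """The eight surrounding cells on a square lattice (toroidal wrap)."""
    rows, cols = len(grid), len(grid[0])
    return [grid[(r + dr) % rows][(c + dc) % cols]
            for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]

def step(grid, p_conform=0.3, rng=random):
    """One synchronous update: with probability p_conform a student adopts the
    most common state among its Moore neighbours."""
    new = [row[:] for row in grid]
    for r, row in enumerate(grid):
        for c, state in enumerate(row):
            neigh = moore_neighbours(grid, r, c)
            majority = max(set(neigh), key=neigh.count)
            if majority != state and rng.random() < p_conform:
                new[r][c] = majority
    return new
```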

  5. Optimal placement of fast cut back units based on the theory of cellular automata and agent

    NASA Astrophysics Data System (ADS)

    Yan, Jun; Yan, Feng

    2017-06-01

    Thermal power generation units with the fast-cut-back (FCB) function can supply power to their auxiliary systems and maintain island operation after a major blackout, so they are an excellent substitute for traditional black-start power sources. Different placement schemes for FCB units have different influences on the subsequent restoration process. Considering the locality of the emergency dispatching rules, the unpredictability of specific dispatching instructions and unexpected situations such as failure of transmission line energization, a novel deduction model for network reconfiguration based on the theory of cellular automata and agents is established. Several indexes are then defined for evaluating the placement schemes for FCB units. An attribute weight determination method based on the integration of subjective and objective weights, together with grey relational analysis, is used to determine the optimal placement scheme for FCB units. The effectiveness of the proposed method is validated by test results on the New England 10-unit 39-bus power system.

  6. Three-dimensional cellular automata as a model of a seismic fault

    NASA Astrophysics Data System (ADS)

    Gálvez, G.; Muñoz, A.

    2017-01-01

    The Earth's crust is broken into a series of plates, whose borders are the seismic fault lines, and it is where most earthquakes occur. This plate system can in principle be described by a set of nonlinear coupled equations describing the motion of the plates, their stresses, strains and other characteristics. Such a system of equations is very difficult to solve, and the nonlinear terms lead to chaotic behavior, which is not predictable. In 1989, Bak and Tang presented an earthquake model based on the sandpile cellular automaton. The model, though simple, provides results similar to those observed in actual earthquakes. In this work a three-dimensional cellular automaton is proposed as a better model to approximate a seismic fault. It is noted that the three-dimensional model reproduces properties similar to those observed in real seismicity, in particular the Gutenberg-Richter law.
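
    A minimal sketch of a Bak-Tang-style sandpile relaxation in three dimensions is given below, with open boundaries and a toppling threshold of six (one grain per face neighbour); the threshold, the driving scheme and the use of the toppling count as an event-size proxy are illustrative assumptions rather than the calibrated fault model of the paper.

```python
import numpy as np

def sandpile_avalanche(grid, threshold=6):
    """Relax a 3D sandpile: any site holding at least `threshold` grains topples,
    sending one grain to each of its six face neighbours; grains leaving the open
    boundary are lost.  Returns the number of topplings (the avalanche size)."""
    topplings = 0
    while True:
        unstable = np.argwhere(grid >= threshold)
        if len(unstable) == 0:
            return topplings
        for x, y, z in unstable:
            grid[x, y, z] -= threshold
            topplings += 1
            for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                               (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                nx, ny, nz = x + dx, y + dy, z + dz
                if (0 <= nx < grid.shape[0] and 0 <= ny < grid.shape[1]
                        and 0 <= nz < grid.shape[2]):
                    grid[nx, ny, nz] += 1
```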

  7. A class of cellular automata modeling winnerless competition

    NASA Astrophysics Data System (ADS)

    Afraimovich, V.; Ordaz, F. C.; Urías, J.

    2002-06-01

    Neural units introduced by Rabinovich et al. ("Sensory coding with dynamically competitive networks," UCSD and CIT, February 1999) motivate a class of cellular automata (CA) where spatio-temporal encoding is feasible. The spatio-temporal information capacity of a CA is estimated by the information capacity of the attractor set, which happens to be finitely specified. Two-dimensional CA are studied in detail. An example is given for which the attractor is not a subshift.

  8. Finding and defining the natural automata acting in living plants: Toward the synthetic biology for robotics and informatics in vivo.

    PubMed

    Kawano, Tomonori; Bouteau, François; Mancuso, Stefano

    2012-11-01

    Automata theory is the mathematical study of abstract machines, pursued in theoretical computer science and in highly interdisciplinary fields that combine the natural sciences with theoretical computer science. In the present review article, as a chemical and biological basis for natural computing or informatics, some plants, plant cells or plant-derived molecules involved in signaling are listed and classified as natural sequential machines (namely, Mealy machines or Moore machines) or finite state automata. By defining the actions (states and transition functions) of these natural automata, the similarity between computational data processing and plant decision-making processes becomes obvious. Finally, their putative roles as parts for plant-based computing or robotic systems are discussed.

  9. Query Monitoring and Analysis for Database Privacy - A Security Automata Model Approach

    PubMed Central

    Kumar, Anand; Ligatti, Jay; Tu, Yi-Cheng

    2015-01-01

    Privacy and usage restriction issues are important when valuable data are exchanged or acquired by different organizations. Standard access control mechanisms either restrict or completely grant access to valuable data. On the other hand, data obfuscation limits the overall usability and may result in loss of total value. There are no standard policy enforcement mechanisms for data acquired through mutual and copyright agreements. In practice, many different types of policies can be enforced in protecting data privacy. Hence there is the need for a unified framework that encapsulates multiple suites of policies to protect the data. We present our vision of an architecture named security automata model (SAM) to enforce privacy-preserving policies and usage restrictions. SAM analyzes the input queries and their outputs to enforce various policies, liberating data owners from the burden of monitoring data access. SAM allows administrators to specify various policies and enforces them to monitor queries and control the data access. Our goal is to address the problems of data usage control and protection through privacy policies that can be defined, enforced, and integrated with the existing access control mechanisms using SAM. In this paper, we lay out the theoretical foundation of SAM, which is based on an automata named Mandatory Result Automata. We also discuss the major challenges of implementing SAM in a real-world database environment as well as ideas to meet such challenges. PMID:26997936

  10. Distributed learning automata-based algorithm for community detection in complex networks

    NASA Astrophysics Data System (ADS)

    Khomami, Mohammad Mehdi Daliri; Rezvanian, Alireza; Meybodi, Mohammad Reza

    2016-03-01

    Community structure is an important and universal topological property of many complex networks such as social and information networks. The detection of the communities of a network is a significant technique for understanding the structure and function of networks. In this paper, we propose an algorithm based on distributed learning automata for community detection (DLACD) in complex networks. In the proposed algorithm, each vertex of the network is equipped with a learning automaton. Through cooperation among the network of learning automata and the updating of each automaton's action probabilities, the algorithm iteratively tries to identify high-density local communities. The performance of the proposed algorithm is investigated through a number of simulations on popular synthetic and real networks. Experimental results, compared with popular community detection algorithms such as walktrap, Danon greedy optimization, fuzzy community detection, multi-resolution community detection and label propagation, demonstrate the superiority of DLACD in terms of modularity, NMI, performance, min-max-cut and coverage.
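
    The core ingredient, updating an automaton's action-probability vector after feedback from the environment, is commonly a linear reward-inaction scheme; a generic sketch is shown below (this is the standard scheme, offered as an assumption about the kind of update DLACD relies on, not necessarily its exact rule).

```python
def reward_update(probs, chosen, beta=0.1):
    """Linear reward-inaction (L_RI) step: after a favourable environment
    response, move probability mass towards the chosen action; after an
    unfavourable response nothing is updated."""
    return [p + beta * (1.0 - p) if i == chosen else (1.0 - beta) * p
            for i, p in enumerate(probs)]

# example: three actions, action 1 was rewarded
print(reward_update([0.3, 0.3, 0.4], chosen=1))   # probabilities still sum to 1
```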

  11. Supervisory control of (max,+) automata: extensions towards applications

    NASA Astrophysics Data System (ADS)

    Lahaye, Sébastien; Komenda, Jan; Boimond, Jean-Louis

    2015-12-01

    In this paper, supervisory control of (max,+) automata is studied. The synthesis of maximally permissive and just-in-time supervisor, as well as the synthesis of minimally permissive and just-after-time supervisor, are proposed. Results are also specialised to non-decreasing solutions, because only such supervisors can be realised in practice. The inherent issue of rationality raised recently is discussed. An illustration of concepts and results is presented through an example of a flexible manufacturing system.

  12. Stimulus-Response Theory of Finite Automata, Technical Report No. 133.

    ERIC Educational Resources Information Center

    Suppes, Patrick

    The central aim of this paper and its projected successors is to prove in detail that stimulus-response theory, or at least a mathematically precise version, can give an account of the learning of many phrase-structure grammars. Section 2 is concerned with standard notions of finite and probabilistic automata. An automaton is defined as a device…

  13. Finding and defining the natural automata acting in living plants: Toward the synthetic biology for robotics and informatics in vivo

    PubMed Central

    Kawano, Tomonori; Bouteau, François; Mancuso, Stefano

    2012-01-01

    The automata theory is the mathematical study of abstract machines commonly studied in the theoretical computer science and highly interdisciplinary fields that combine the natural sciences and the theoretical computer science. In the present review article, as the chemical and biological basis for natural computing or informatics, some plants, plant cells or plant-derived molecules involved in signaling are listed and classified as natural sequential machines (namely, the Mealy machines or Moore machines) or finite state automata. By defining the actions (states and transition functions) of these natural automata, the similarity between the computational data processing and plant decision-making processes became obvious. Finally, their putative roles as the parts for plant-based computing or robotic systems are discussed. PMID:23336016

  14. A Cellular Automata Model of Bone Formation

    PubMed Central

    Van Scoy, Gabrielle K.; George, Estee L.; Asantewaa, Flora Opoku; Kerns, Lucy; Saunders, Marnie M.; Prieto-Langarica, Alicia

    2017-01-01

    Bone remodeling is an elegantly orchestrated process by which osteocytes, osteoblasts and osteoclasts function as a syncytium to maintain or modify bone. On the microscopic level, bone consists of cells that create, destroy and monitor the bone matrix. These cells interact in a coordinated manner to maintain a tightly regulated homeostasis. It is this regulation that is responsible for the observed increase in bone gain in the dominant arm of a tennis player and the observed increase in bone loss associated with spaceflight and osteoporosis. The manner in which these cells interact to bring about a change in bone quality and quantity has yet to be fully elucidated. But efforts to understand the multicellular complexity can ultimately lead to eradication of metabolic bone diseases such as osteoporosis and improved implant longevity. Experimentally validated mathematical models that simulate functional activity and offer eventual predictive capabilities offer tremendous potential in understanding multicellular bone remodeling. Here we undertake the initial challenge to develop a mathematical model of bone formation validated with in vitro data obtained from osteoblastic bone cells induced to mineralize and quantified at 26 days of culture. A cellular automata model was constructed to simulate the in vitro characterization. Permutation tests were performed to compare the distribution of the mineralization in the cultures and the distribution of the mineralization in the mathematical models. The results of the permutation test show the distribution of mineralization from the characterization and mathematical model come from the same probability distribution, therefore validating the cellular automata model. PMID:28189632

  15. A cellular automata model of bone formation.

    PubMed

    Van Scoy, Gabrielle K; George, Estee L; Opoku Asantewaa, Flora; Kerns, Lucy; Saunders, Marnie M; Prieto-Langarica, Alicia

    2017-04-01

    Bone remodeling is an elegantly orchestrated process by which osteocytes, osteoblasts and osteoclasts function as a syncytium to maintain or modify bone. On the microscopic level, bone consists of cells that create, destroy and monitor the bone matrix. These cells interact in a coordinated manner to maintain a tightly regulated homeostasis. It is this regulation that is responsible for the observed increase in bone gain in the dominant arm of a tennis player and the observed increase in bone loss associated with spaceflight and osteoporosis. The manner in which these cells interact to bring about a change in bone quality and quantity has yet to be fully elucidated. But efforts to understand the multicellular complexity can ultimately lead to eradication of metabolic bone diseases such as osteoporosis and improved implant longevity. Experimentally validated mathematical models that simulate functional activity and offer eventual predictive capabilities offer tremendous potential in understanding multicellular bone remodeling. Here we undertake the initial challenge to develop a mathematical model of bone formation validated with in vitro data obtained from osteoblastic bone cells induced to mineralize and quantified at 26 days of culture. A cellular automata model was constructed to simulate the in vitro characterization. Permutation tests were performed to compare the distribution of the mineralization in the cultures and the distribution of the mineralization in the mathematical models. The results of the permutation test show the distribution of mineralization from the characterization and mathematical model come from the same probability distribution, therefore validating the cellular automata model. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Last-position elimination-based learning automata.

    PubMed

    Zhang, Junqi; Wang, Cheng; Zhou, MengChu

    2014-12-01

    An update scheme of the state probability vector of actions is critical for learning automata (LA). The most popular is the pursuit scheme that pursues the estimated optimal action and penalizes others. This paper proposes a reverse philosophy that leads to last-position elimination-based learning automata (LELA). The action graded last in terms of the estimated performance is penalized by decreasing its state probability and is eliminated when its state probability becomes zero. All active actions, that is, actions with nonzero state probability, equally share the penalized state probability from the last-position action at each iteration. The proposed LELA is characterized by the relaxed convergence condition for the optimal action, the accelerated step size of the state probability update scheme for the estimated optimal action, and the enriched sampling for the estimated nonoptimal actions. The proof of the ϵ-optimal property for the proposed algorithm is presented. Last-position elimination is a widespread philosophy in the real world and has proved to be also helpful for the update scheme of the learning automaton via the simulations of well-known benchmark environments. In the simulations, two versions of the LELA, using different selection strategies of the last action, are compared with the classical pursuit algorithms Discretized Pursuit Reward-Inaction (DP(RI)) and Discretized Generalized Pursuit Algorithm (DGPA). Simulation results show that the proposed schemes achieve significantly faster convergence and higher accuracy than the classical ones. Specifically, the proposed schemes reduce the interval to find the best parameter for a specific environment in the classical pursuit algorithms. Thus, they can have their parameter tuning easier to perform and can save much more time when applied to a practical case. Furthermore, the convergence curves and the corresponding variance coefficient curves of the contenders are illustrated to characterize their
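
    Reading the abstract literally, a LELA-style update can be sketched as follows: the active action whose performance estimate ranks last loses a fixed amount of probability (and is eliminated once it reaches zero), and that amount is shared equally by the remaining active actions. The step size, the handling of the final surviving action and the tie-breaking are assumptions of this sketch, not the authors' exact scheme.

```python
def lela_update(probs, estimates, step=0.01):
    """One last-position-elimination update over the action probability vector."""
    active = [i for i, p in enumerate(probs) if p > 0.0]
    if len(active) < 2:
        return probs[:]                          # converged: a single action survives
    worst = min(active, key=lambda i: estimates[i])   # last-ranked active action
    penalty = min(step, probs[worst])
    new = probs[:]
    new[worst] -= penalty                        # eliminated once it reaches zero
    others = [i for i in active if i != worst]
    for i in others:
        new[i] += penalty / len(others)          # others share the penalized mass equally
    return new
```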

  17. The detection and stabilisation of limit cycle for deterministic finite automata

    NASA Astrophysics Data System (ADS)

    Han, Xiaoguang; Chen, Zengqiang; Liu, Zhongxin; Zhang, Qing

    2018-04-01

    In this paper, the topological structure properties of deterministic finite automata (DFA), under the framework of the semi-tensor product of matrices, are investigated. First, the dynamics of DFA are converted into a new algebraic form, as a discrete-time linear system, by means of Boolean algebra. Using this algebraic description, an approach for calculating limit cycles of different lengths is given. Second, we present two fundamental concepts, namely, the domain of attraction of a limit cycle and the prereachability set. Based on the prereachability set, an explicit solution for calculating the domain of attraction of a limit cycle is completely characterised. Third, we define the globally attractive limit cycle, and then a necessary and sufficient condition for verifying whether all state trajectories of a DFA enter a given limit cycle in a finite number of transitions is given. Fourth, the problem of whether a DFA can be stabilised to a limit cycle by a state feedback controller is discussed, and criteria for limit-cycle stabilisation are established. All state feedback controllers which implement the minimal-length trajectories from each state to the limit cycle are obtained using the proposed algorithm. Finally, an illustrative example is presented to show the theoretical results.
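
    The paper works within the semi-tensor-product framework; as a much simpler illustration of the object being computed, the Python sketch below (with a hypothetical transition map) finds the limit cycle that a deterministic finite transition map eventually enters from a given initial state:

        def limit_cycle(step, x0):
            # Iterate a deterministic transition map over a finite state set until a
            # state repeats; return (transient_prefix, limit_cycle).
            seen = {}                              # state -> time of first visit
            trajectory = []
            x, t = x0, 0
            while x not in seen:
                seen[x] = t
                trajectory.append(x)
                x = step(x)
                t += 1
            start = seen[x]                        # first index of the repeated state
            return trajectory[:start], trajectory[start:]

        # Hypothetical 6-state map for illustration only.
        f = {0: 1, 1: 2, 2: 3, 3: 1, 4: 2, 5: 0}
        print(limit_cycle(lambda s: f[s], 5))      # transient [5, 0], cycle [1, 2, 3]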

  18. Modelling robot's behaviour using finite automata

    NASA Astrophysics Data System (ADS)

    Janošek, Michal; Žáček, Jaroslav

    2017-07-01

    This paper proposes a model of a robot's behaviour described by finite automata. We split the robot's knowledge into several knowledge bases, which are used by the inference mechanism of the robot's expert system to make logical deductions. Each knowledge base is dedicated to a particular behaviour domain, and the finite automaton helps us switch among these knowledge bases with respect to the actual situation. Our goal is to simplify one big knowledge base and reduce its complexity by splitting it into several pieces. The advantage of this model is that we can easily add new behaviour by adding a new knowledge base, adding this behaviour to the finite automaton, and defining the necessary states and transitions.
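
    A minimal Python sketch of the idea, assuming hypothetical states, events and knowledge-base files (none of these names come from the paper): the finite automaton only decides which knowledge base the expert system should currently reason with.

        # Hypothetical states/events; the mapping below is illustrative only.
        TRANSITIONS = {
            ("explore", "obstacle_detected"): "avoid",
            ("avoid",   "path_clear"):        "explore",
            ("explore", "low_battery"):       "dock",
            ("dock",    "charged"):           "explore",
        }

        KNOWLEDGE_BASES = {
            "explore": "kb_navigation.rules",
            "avoid":   "kb_obstacle.rules",
            "dock":    "kb_charging.rules",
        }

        class BehaviourAutomaton:
            def __init__(self, start="explore"):
                self.state = start

            def on_event(self, event):
                # Switch state (and thus the active knowledge base) if a transition exists.
                self.state = TRANSITIONS.get((self.state, event), self.state)
                return KNOWLEDGE_BASES[self.state]   # KB handed to the expert system

        robot = BehaviourAutomaton()
        print(robot.on_event("obstacle_detected"))   # -> kb_obstacle.rules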

  19. Non-Condon equilibrium Fermi’s golden rule electronic transition rate constants via the linearized semiclassical method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Xiang; Geva, Eitan

    2016-06-28

    In this paper, we test the accuracy of the linearized semiclassical (LSC) expression for the equilibrium Fermi’s golden rule rate constant for electronic transitions in the presence of non-Condon effects. We do so by performing a comparison with the exact quantum-mechanical result for a model where the donor and acceptor potential energy surfaces are parabolic and identical except for shifts in the equilibrium energy and geometry, and the coupling between them is linear in the nuclear coordinates. Since non-Condon effects may or may not give rise to conical intersections, both possibilities are examined by considering: (1) a modified Garg-Onuchic-Ambegaokar model for charge transfer in the condensed phase, where the donor-acceptor coupling is linear in the primary mode coordinate, and for which non-Condon effects do not give rise to a conical intersection; (2) the linear vibronic coupling model for electronic transitions in gas phase molecules, where non-Condon effects give rise to conical intersections. We also present a comprehensive comparison between the linearized semiclassical expression and a progression of more approximate expressions. The comparison is performed over a wide range of frictions and temperatures for model (1) and over a wide range of temperatures for model (2). The linearized semiclassical method is found to reproduce the exact quantum-mechanical result remarkably well for both models over the entire range of parameters under consideration. In contrast, more approximate expressions are observed to deviate considerably from the exact result in some regions of parameter space.
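
    For orientation only (textbook background, not the paper's working LSC expression): in the Condon limit the equilibrium golden-rule rate reduces to

        k_{D \to A} \;=\; \frac{2\pi}{\hbar}\,\bigl|V_{DA}\bigr|^{2}\,\mathrm{FCWD},

    where FCWD denotes the Franck-Condon-weighted density of states. In the non-Condon case examined above, the constant coupling V_{DA} is replaced by a coupling that depends on the nuclear coordinates, which is what the linearized semiclassical expression must accommodate.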

  20. Stair evacuation simulation based on cellular automata considering evacuees’ walk preferences

    NASA Astrophysics Data System (ADS)

    Ding, Ning; Zhang, Hui; Chen, Tao; Peter, B. Luh

    2015-06-01

    As a physical model, the cellular automata (CA) model is widely used in many areas, such as stair evacuation. However, existing CA models consider neither evacuees’ walk preferences nor their psychological status, and the structure of the basic model is not applicable to stairs. This paper aims to improve stair evacuation simulation by addressing these issues, and a new cellular automata model is established. Several walk preferences of evacuees and the influence of evacuees’ psychology on their behaviors are introduced into this model, and evacuees’ speeds are affected by these features. To validate the simulation, two fire drills held in two high-rise buildings were video-recorded, and the simulation results are found to be similar to the fire drill results. The structure of this model is simple, and it is easy to further develop and apply to different buildings with various kinds of occupants. Project supported by the National Basic Research Program of China (Grant No. 2012CB719705) and the National Natural Science Foundation of China (Grant Nos. 91224008, 91024032, and 71373139).

  1. A cellular automata approach for modeling surface water runoff

    NASA Astrophysics Data System (ADS)

    Jozefik, Zoltan; Nanu Frechen, Tobias; Hinz, Christoph; Schmidt, Heiko

    2015-04-01

    This abstract reports the development and application of a two-dimensional cellular automata-based model, which couples the dynamics of overland flow, infiltration processes and surface evolution through sediment transport. The natural hill slopes are represented by their topographic elevation and spatially varying soil properties (infiltration rates and surface roughness coefficients). This model allows modeling of Hortonian overland flow and infiltration during complex rainfall events. An advantage of the cellular automata approach over the kinematic wave equations is that wet/dry interfaces that often appear with rainfall overland flows can be accurately captured and are not a source of numerical instabilities. An adaptive explicit time stepping scheme allows for rainfall events to be adequately resolved in time, while large time steps are taken during dry periods to provide for simulation run time efficiency. The time step is constrained by the CFL condition and mass conservation considerations. The spatial discretization is shown to be first-order accurate. For validation purposes, hydrographs for non-infiltrating and infiltrating plates are compared to the kinematic wave analytic solutions and data taken from the literature [1,2]. Results show that our cellular automata model reproduces hydrograph patterns with quantitative accuracy. However, recent works have shown that even though the hydrograph is satisfactorily reproduced, the flow field within the plot might be inaccurate [3]. For a more stringent validation, we compare steady state velocity, water flux, and water depth fields to rainfall simulation experiments conducted in Thies, Senegal [3]. Comparisons show that our model is able to accurately capture these flow properties. Currently, a sediment transport and deposition module is being implemented and tested. [1] M. Rousseau, O. Cerdan, O. Delestre, F. Dupros, F. James, S. Cordier. Overland flow modeling with the Shallow Water Equation using a well balanced
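
    A minimal Python sketch of the adaptive time-stepping idea mentioned above (the CFL number and the dry-period cap are illustrative values, and the additional mass-conservation constraint from the abstract is omitted):

        def adaptive_dt(velocities, dx, cfl=0.7, dt_max=60.0):
            # CFL-limited adaptive time step: resolve the flow finely while it is
            # active and fall back to a large step during dry periods.
            vmax = max((abs(v) for v in velocities), default=0.0)
            if vmax == 0.0:
                return dt_max                     # dry period: take the maximum step
            return min(dt_max, cfl * dx / vmax)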

  2. Stochastic cellular automata model of cell migration, proliferation and differentiation: validation with in vitro cultures of muscle satellite cells.

    PubMed

    Garijo, N; Manzano, R; Osta, R; Perez, M A

    2012-12-07

    Cell migration and proliferation have been modelled in the literature as processes similar to diffusion. However, using diffusion models to simulate the proliferation and migration of cells tends to create a homogeneous distribution in the cell density that does not correlate with empirical observations. In fact, the mechanism of cell dispersal is not diffusion: cells disperse by crawling or proliferation, or are transported in a moving fluid. The use of cellular automata, particle models or cell-based models can overcome this limitation. This paper presents a stochastic cellular automata model to simulate the proliferation, migration and differentiation of cells. These processes are treated as completely stochastic as well as discrete. The model developed was applied to predict the behaviour of in vitro cell cultures performed with adult muscle satellite cells. Moreover, a non-homogeneous distribution of cells has been observed inside the culture well and, using the above-mentioned stochastic cellular automata model, we have been able to predict this heterogeneous cell distribution and compute accurate quantitative results. Differentiation was also incorporated into the computational simulation. The results predicted the myotube formation that typically occurs with adult muscle satellite cells. In conclusion, we have shown how a stochastic cellular automata model can be implemented and is capable of reproducing the in vitro behaviour of adult muscle satellite cells. Copyright © 2012 Elsevier Ltd. All rights reserved.
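
    A minimal Python sketch of such a stochastic, discrete update (asynchronous update in random order on a square lattice; the migration and proliferation probabilities are illustrative, not the calibrated values):

        import random

        def step(grid, p_move=0.5, p_prolif=0.05):
            # One stochastic update: each occupied site may proliferate into, or
            # migrate to, a free neighbouring site (contact inhibition otherwise).
            n = len(grid)
            cells = [(i, j) for i in range(n) for j in range(n) if grid[i][j]]
            random.shuffle(cells)                       # random update order
            for i, j in cells:
                free = [(i + di, j + dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
                        if (di or dj) and 0 <= i + di < n and 0 <= j + dj < n
                        and not grid[i + di][j + dj]]
                if not free:
                    continue                            # no free neighbour: blocked
                ti, tj = random.choice(free)
                if random.random() < p_prolif:
                    grid[ti][tj] = 1                    # daughter cell
                elif random.random() < p_move:
                    grid[ti][tj], grid[i][j] = 1, 0     # migration
            return grid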

  3. Cellular Automata with Anticipation: Examples and Presumable Applications

    NASA Astrophysics Data System (ADS)

    Krushinsky, Dmitry; Makarenko, Alexander

    2010-11-01

    One of the most promising new methodologies for modelling is the so-called cellular automata (CA) approach. According to this paradigm, models are built from simple elements connected into regular structures with local interaction between neighbours. The patterns of connections usually have a simple geometry (lattices). As one of the classical examples of CA we mention the game `Life' by J. Conway. This paper presents two examples of CA with the anticipation property. These examples include a modification of the game `Life' and a cellular model of crowd movement.
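
    For reference, one synchronous step of the classical game `Life' (B3/S23) on a periodic grid is sketched below in Python; the anticipatory modification discussed in the paper is not shown.

        def life_step(grid):
            # Synchronous update of Conway's Life with periodic boundaries:
            # a cell is alive next step if it has 3 live neighbours, or if it is
            # alive and has 2 live neighbours.
            n, m = len(grid), len(grid[0])
            new = [[0] * m for _ in range(n)]
            for i in range(n):
                for j in range(m):
                    alive = sum(grid[(i + di) % n][(j + dj) % m]
                                for di in (-1, 0, 1) for dj in (-1, 0, 1)
                                if di or dj)
                    new[i][j] = 1 if alive == 3 or (grid[i][j] and alive == 2) else 0
            return new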

  4. Cellular automata model for drug release from binary matrix and reservoir polymeric devices.

    PubMed

    Johannes Laaksonen, Timo; Mikael Laaksonen, Hannu; Tapio Hirvonen, Jouni; Murtomäki, Lasse

    2009-04-01

    Kinetics of drug release from polymeric tablets, inserts and implants is an important and widely studied area. Here we present a new and widely applicable cellular automata model for the diffusion and erosion processes occurring during drug release from polymeric drug release devices. The model divides a 2D representation of the release device into an array of cells; each cell contains information about the material, drug, polymer or solvent that the domain contains. Cells are then allowed to rearrange according to statistical rules designed to match realistic drug release. Diffusion is modeled by a random walk of mobile cells, and the kinetics of chemical or physical processes by probabilities of conversion from one state to another; this is justified because diffusion coefficients and kinetic rate constants are, at a fundamental level, just probabilities for certain occurrences. The model is applied to three kinds of devices with different release mechanisms: erodible matrices, diffusion through channels or pores, and membrane-controlled release. The dissolution curves obtained are compared to analytical models from the literature and the validity of the model is considered. The model is shown to be compatible with all three release devices, highlighting the easy adaptability of the model to virtually any release system and geometry. Further extensions and applications of the model are envisioned.

  5. Modeling of urban growth using cellular automata (CA) optimized by Particle Swarm Optimization (PSO)

    NASA Astrophysics Data System (ADS)

    Khalilnia, M. H.; Ghaemirad, T.; Abbaspour, R. A.

    2013-09-01

    In this paper, two satellite images of Tehran, the capital city of Iran, acquired by the TM and ETM+ sensors in 1988 and 2010, are used as the base information layers to study the changes in urban patterns of this metropolis. The patterns of urban growth for the city of Tehran are extracted in a period of twelve years using cellular automata, with logistic regression functions as the transition functions. Furthermore, the weighting coefficients of the parameters affecting urban growth, i.e. distance from urban centers, distance from rural centers, distance from agricultural centers, and neighborhood effects, were selected using PSO. In order to evaluate the results of the prediction, the percent correct match index is calculated. According to the results, by combining optimization techniques with the cellular automata model, urban growth patterns can be predicted with accuracy up to 75%.
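
    A minimal Python sketch of a logistic-regression transition rule of this kind (the coefficient vector beta stands in for the PSO-calibrated weights; all names and factors are illustrative):

        import math

        def urban_transition_prob(d_urban, d_rural, d_agri, neigh_urban_frac, beta):
            # Probability that a non-urban cell becomes urban in the next step.
            z = (beta[0]
                 + beta[1] * d_urban            # distance to urban centers
                 + beta[2] * d_rural            # distance to rural centers
                 + beta[3] * d_agri             # distance to agricultural centers
                 + beta[4] * neigh_urban_frac)  # neighborhood effect
            return 1.0 / (1.0 + math.exp(-z))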

  6. A dynamically reconfigurable logic cell: from artificial neural networks to quantum-dot cellular automata

    NASA Astrophysics Data System (ADS)

    Naqvi, Syed Rameez; Akram, Tallha; Iqbal, Saba; Haider, Sajjad Ali; Kamran, Muhammad; Muhammad, Nazeer

    2018-02-01

    Considering the lack of optimization support for Quantum-dot Cellular Automata, we propose a dynamically reconfigurable logic cell capable of implementing various logic operations by means of artificial neural networks. The cell can be reconfigured to any 2-input combinational logic gate by altering the strength of connections, called weights and biases. We demonstrate how these cells may suitably be organized to perform multi-bit arithmetic and logic operations. The proposed work is important in that it gives a standard implementation of an 8-bit arithmetic and logic unit for quantum-dot cellular automata with minimal area and latency overhead. We also compare the proposed design with a few existing arithmetic and logic units, and show that it is more area efficient than any equivalent available in the literature. Furthermore, the design is adaptable to 16-, 32-, and 64-bit architectures.
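
    A minimal software analogue of such a weight-reconfigurable 2-input cell (Python; the weight and bias values are illustrative): a single threshold unit covers the linearly separable gates, while XOR would need a small two-layer network, omitted here.

        def neuron(x1, x2, w1, w2, bias):
            # Threshold unit: reconfiguring (w1, w2, bias) selects the logic gate.
            return int(w1 * x1 + w2 * x2 + bias > 0)

        GATES = {            # illustrative weight/bias choices, one per gate
            "AND":  (1, 1, -1.5),
            "OR":   (1, 1, -0.5),
            "NAND": (-1, -1, 1.5),
            "NOR":  (-1, -1, 0.5),
        }

        for name, (w1, w2, b) in GATES.items():
            table = [neuron(a, c, w1, w2, b) for a in (0, 1) for c in (0, 1)]
            print(name, table)   # AND -> [0, 0, 0, 1], OR -> [0, 1, 1, 1], ...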

  7. Simulation of elastic wave propagation using cellular automata and peridynamics, and comparison with experiments

    DOE PAGES

    Nishawala, Vinesh V.; Ostoja-Starzewski, Martin; Leamy, Michael J.; ...

    2015-09-10

    Peridynamics is a non-local continuum mechanics formulation that can handle spatial discontinuities as the governing equations are integro-differential equations which do not involve gradients such as strains and deformation rates. This paper employs bond-based peridynamics. Cellular Automata is a local computational method which, in its rectangular variant on interior domains, is mathematically equivalent to the central difference finite difference method. However, cellular automata does not require the derivation of the governing partial differential equations and provides for common boundary conditions based on physical reasoning. Both methodologies are used to solve a half-space subjected to a normal load, known as Lamb's Problem. The results are compared with theoretical solution from classical elasticity and experimental results. Furthermore, this paper is used to validate our implementation of these methods.

  8. A verification strategy for web services composition using enhanced stacked automata model.

    PubMed

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different service providers, a composite service describing the intact business process of an enterprise can be made. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides initial groundwork for an Extensible Markup Language (XML) specification language for defining and implementing business process workflows for web services. The problem with most realistic approaches to service composition is the verification of the composed web services; formal verification methods are needed to ensure the correctness of composed services. A few research works have been carried out in the literature on the verification of web services for deterministic systems. Moreover, the existing models did not address verification properties like dead transition, deadlock, reachability and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system are evaluated based on properties like dead transition, deadlock, safety, liveness and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS); the composition is converted into an ESAM (a combination of Muller Automata (MA) and Push Down Automata (PDA)) and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool and the results revealed better performance in terms of finding dead transitions and deadlocks in contrast to the

  9. Dengue fever spreading based on probabilistic cellular automata with two lattices

    NASA Astrophysics Data System (ADS)

    Pereira, F. M. M.; Schimit, P. H. T.

    2018-06-01

    Modeling and simulation of mosquito-borne diseases have gained attention due to a growing incidence in tropical countries in the past few years. Here, we study dengue spreading in a population modeled by cellular automata, with two lattices used to model the human-mosquito interaction: one lattice for human individuals and one lattice for mosquitoes, in order to enable different dynamics in the two populations. The disease considered is dengue fever with one, two or three different serotypes coexisting in the population. Although many regions exhibit the incidence of only one serotype, here we set up a complete framework to also study the occurrence of two and three serotypes at the same time in a population. Furthermore, the flexibility of the model allows its use for other mosquito-borne diseases, like chikungunya, yellow fever and malaria. An approximation of the cellular automata is proposed in terms of ordinary differential equations; the spreading of mosquitoes is studied and the influence of some model parameters is analyzed with numerical simulations. Finally, a method to combat dengue spreading is simulated based on a reduction of mosquito births and mosquito bites in the population.

  10. From QCA (Quantum Cellular Automata) to Organocatalytic Reactions with Stabilized Carbenium Ions.

    PubMed

    Gualandi, Andrea; Mengozzi, Luca; Manoni, Elisabetta; Giorgio Cozzi, Pier

    2016-06-01

    What do quantum cellular automata (QCA), "on water" reactions, and SN1-type organocatalytic transformations have in common? The link between these distant topics is the practical access to useful intermediates and key products through the use of stabilized carbenium ions. Over 10 years, moving from a carbenium ion bearing a ferrocenyl group to the 1,3-benzodithiolylium carbenium ion, our group has exploited the use of these intermediates in useful and practical synthetic transformations. In particular, we have applied the use of carbenium ions to stereoselective organocatalytic alkylation reactions, showing a possible solution for the "holy grail of organocatalysis". Examples of the use of these quite stabilized intermediates are now also considered in organometallic chemistry. On the other hand, the stable carbenium ions are also applied to tailored molecules adapted to quantum cellular automata, a new possible paradigm for computation. Carbenium ions are not a problem, they can be a/the solution! © 2016 The Chemical Society of Japan & Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. All-DNA finite-state automata with finite memory

    PubMed Central

    Wang, Zhen-Gang; Elbaz, Johann; Remacle, F.; Levine, R. D.; Willner, Itamar

    2010-01-01

    Biomolecular logic devices can be applied for sensing and nano-medicine. We built three DNA tweezers that are activated by the inputs H+/OH-; nucleic acid linker/complementary antilinker, to yield a 16-state finite-state automaton. The outputs of the automata are the configurations of the respective tweezers (opened or closed), determined by observing fluorescence from a fluorophore/quencher pair at the ends of the arms of the tweezers. The system exhibits memory because each current state and output depend not only on the source configuration but also on past states and inputs. PMID:21135212

  12. [Simulation of urban ecological security pattern based on cellular automata: a case of Dongguan City, Guangdong Province of South China].

    PubMed

    Yang, Qing-Sheng; Qiao, Ji-Gang; Ai, Bin

    2013-09-01

    Taking Dongguan City, which is undergoing rapid urbanization, as a case study and selecting the landscape ecological security level as the evaluation criterion, the urbanization cell number for 1 km x 1 km ecological security cells was obtained and embedded into the transition rules of cellular automata (CA) as a restraint term to control urban development, establish an ecological security urban CA, and simulate an ecological security urban development pattern. The results showed that the integrated landscape ecological security index of the City decreased from 0.497 in 1998 to 0.395 in 2005, indicating that ecological security at the landscape scale had decreased. The CA-simulated integrated ecological security index of the City in 2005 increased from the measured 0.395 to 0.479, showing that in the simulation the ecological pressure from humans on the urban landscape was lower, ecological security was better, and integrated landscape ecological security was higher. CA can therefore be used as an effective tool for researching urban ecological security.

  13. Fuzzy automata and pattern matching

    NASA Technical Reports Server (NTRS)

    Setzer, C. B.; Warsi, N. A.

    1986-01-01

    A wide-ranging search for articles and books concerned with fuzzy automata and syntactic pattern recognition is presented. A number of survey articles on image processing and feature detection were included. Hough's algorithm is presented to illustrate the way in which knowledge about an image can be used to interpret the details of the image. It was found that in hand-generated pictures the algorithm worked well at following straight lines but had great difficulty turning corners. An algorithm was developed which produces a minimal finite automaton recognizing a given finite set of strings. One difficulty of the construction is that, in some cases, this minimal automaton is not unique for a given set of strings and a given maximum length. This algorithm compares favorably with other inference algorithms. More importantly, the algorithm produces an automaton with a rigorously described relationship to the original set of strings that does not depend on the algorithm itself.

  14. Using cellular automata to simulate forest fire propagation in Portugal

    NASA Astrophysics Data System (ADS)

    Freire, Joana; daCamara, Carlos

    2017-04-01

    Wildfires in the Mediterranean region have severe damaging effects mainly due to large fire events [1, 2]. Restricting to Portugal, wildfires have burned over 1.4 million ha in the last decade. Considering the increasing tendency in the extent and severity of wildfires [1, 2], the availability of modeling tools for fire episodes is of crucial importance. Two main types of mathematical models are generally available, namely deterministic and stochastic models. Deterministic models attempt a description of fires, fuel and atmosphere as multiphase continua prescribing mass, momentum and energy conservation, which typically leads to systems of coupled PDEs to be solved numerically on a grid. Simpler descriptions, such as FARSITE, neglect the interaction with the atmosphere and propagate the fire front using wave techniques. One of the most important classes of stochastic models is Cellular Automata (CA), in which space is discretized into cells and physical quantities take on a finite set of values at each cell. The cells evolve in discrete time according to a set of transition rules and the states of the neighboring cells. In the present work, we implement and then improve a simple and fast CA model designed to operationally simulate wildfires in Portugal. The reference CA model chosen [3] has the advantage of having been applied successfully in other Mediterranean ecosystems, namely to historical fires in Greece. The model is defined on a square grid with propagation to the 8 nearest and next-nearest neighbors, where each cell is characterized by 4 possible discrete states, corresponding to burning, not-yet-burned, fuel-free and completely burned cells, with 4 possible rules of evolution which take into account fuel properties, meteorological conditions, and topography. As a CA model, it offers the possibility to run a very large number of simulations in order to verify and apply the model, and it is easily modified by implementing additional variables and different rules for the
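
    A stripped-down Python sketch of such a fire-spread CA (a single constant spread probability stands in for the fuel-, weather- and slope-dependent rules of the reference model):

        import random

        # States: 0 = fuel-free, 1 = unburned fuel, 2 = burning, 3 = completely burned.
        def fire_step(grid, p_spread=0.35):
            # Synchronous update on a Moore (8-cell) neighbourhood.
            n, m = len(grid), len(grid[0])
            new = [row[:] for row in grid]
            for i in range(n):
                for j in range(m):
                    if grid[i][j] == 2:
                        new[i][j] = 3                       # burns out in one step
                        for di in (-1, 0, 1):
                            for dj in (-1, 0, 1):
                                a, b = i + di, j + dj
                                if (di or dj) and 0 <= a < n and 0 <= b < m \
                                   and grid[a][b] == 1 and random.random() < p_spread:
                                    new[a][b] = 2           # ignite neighbour
            return new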

  15. Energy dissipation dataset for reversible logic gates in quantum dot-cellular automata.

    PubMed

    Bahar, Ali Newaz; Rahman, Mohammad Maksudur; Nahid, Nur Mohammad; Hassan, Md Kamrul

    2017-02-01

    This paper presents an energy dissipation dataset for different reversible logic gates in quantum-dot cellular automata. The proposed circuits have been designed and verified using the QCADesigner simulator. In addition, the energy dissipation has been calculated under three different tunneling energy levels at temperature T = 2 K. For estimating the energy dissipation of the proposed gates, the QCAPro tool has been employed.

  16. Characterizing emergent properties of immunological systems with multi-cellular rule-based computational modeling.

    PubMed

    Chavali, Arvind K; Gianchandani, Erwin P; Tung, Kenneth S; Lawrence, Michael B; Peirce, Shayn M; Papin, Jason A

    2008-12-01

    The immune system is comprised of numerous components that interact with one another to give rise to phenotypic behaviors that are sometimes unexpected. Agent-based modeling (ABM) and cellular automata (CA) belong to a class of discrete mathematical approaches in which autonomous entities detect local information and act over time according to logical rules. The power of this approach lies in the emergence of behavior that arises from interactions between agents, which would otherwise be impossible to know a priori. Recent work exploring the immune system with ABM and CA has revealed novel insights into immunological processes. Here, we summarize these applications to immunology and, particularly, how ABM can help formulate hypotheses that might drive further experimental investigations of disease mechanisms.

  17. Multilane Traffic Flow Modeling Using Cellular Automata Theory

    NASA Astrophysics Data System (ADS)

    Chechina, Antonina; Churbanova, Natalia; Trapeznikova, Marina

    2018-02-01

    The paper deals with the mathematical modeling of traffic flows on urban road networks using a microscopic approach. The model is based on cellular automata theory and presents a generalization of the Nagel-Schreckenberg model to the multilane case. The program package created allows traffic to be simulated on various types of road fragments (T- or X-type intersections, straight road elements, etc.) and on road networks that consist of these elements. Besides that, it allows the consequences of various decisions regarding road infrastructure changes to be predicted, such as increasing or decreasing the number of lanes, putting new traffic lights into operation, and building new roads, entrances/exits, and road junctions.

  18. Reversible Flip-Flops in Quantum-Dot Cellular Automata

    NASA Astrophysics Data System (ADS)

    Rad, Samaneh Kazemi; Heikalabad, Saeed Rasouli

    2017-09-01

    Quantum-dot cellular automata is a new technology for designing efficient combinational and sequential circuits at the nano-scale. This technology has many desirable advantages compared to CMOS technology, such as low power consumption, smaller occupied area and low latency. These features make it suitable for use in flip-flop design. In this paper, using the characteristics of reversible logic, we design new structures for flip-flops. The operation of these structures is evaluated with the QCADesigner Version 2.0.3 simulator. In addition, we calculate the power dissipation of these structures with the QCAPro tool. The results illustrate that the proposed structures are more efficient than previous ones.

  19. Wolfram's class IV automata and a good life

    NASA Astrophysics Data System (ADS)

    McIntosh, Harold V.

    1990-09-01

    A comprehensive discussion of Wolfram's four classes of cellular automata is given, with the intention of relating them to Conway's criteria for a good game of Life. Although it is known that such classifications cannot be entirely rigorous, much information about the behavior of an automaton can be gleaned from the statistical properties of its transition table. Still more information can be deduced from the mean field approximation to its state densities, in particular, from the distribution of horizontal and diagonal tangents of the latter. In turn these characteristics can be related to the presence or absence of certain loops in the de Bruijn diagram of the automaton.
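
    As an example of the mean-field approximation mentioned above, the density map for Conway's Life under the usual independence assumption can be written directly (Python sketch):

        from math import comb

        def life_mean_field(p):
            # Probability that a site is alive at the next step if all sites are
            # treated as independent with density p (B3/S23 rules).
            n3 = comb(8, 3) * p**3 * (1 - p)**5      # exactly 3 live neighbours
            n2 = comb(8, 2) * p**2 * (1 - p)**6      # exactly 2 live neighbours
            return (1 - p) * n3 + p * (n2 + n3)      # birth + survival

    Iterating this map and inspecting its fixed points and tangents gives the kind of density information the entry refers to.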

  20. Towards Time Automata and Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Hutzler, G.; Klaudel, H.; Wang, D. Y.

    2004-01-01

    The design of reactive systems must comply with logical correctness (the system does what it is supposed to do) and timeliness (the system has to satisfy a set of temporal constraints) criteria. In this paper, we propose a global approach for the design of adaptive reactive systems, i.e., systems that dynamically adapt their architecture depending on the context. We use the timed automata formalism for the design of the agents' behavior. This allows the properties of the system (regarding logical correctness and timeliness) to be evaluated beforehand, thanks to model-checking and simulation techniques. This model is enhanced with tools that we developed for the automatic generation of code, allowing a running multi-agent prototype satisfying the properties of the model to be produced very quickly.

  1. Traffic jam dynamics in stochastic cellular automata

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagel, K.; Schreckenberg, M.

    1995-09-01

    Simple models for particles hopping on a grid (cellular automata) are used to simulate (single lane) traffic flow. Despite their simplicity, these models are astonishingly realistic in reproducing start-stop-waves and realistic fundamental diagrams. One can use these models to investigate traffic phenomena near maximum flow. A so-called phase transition at average maximum flow is visible in the life-times of jams. The resulting dynamic picture is consistent with recent fluid-dynamical results by Kuehne/Kerner/Konhaeuser, and with Treiterer's hysteresis description. This places CA models between car-following models and fluid-dynamical models for traffic flow. CA models are tested in projects in Los Alamos (USA) and in NRW (Germany) for large-scale microsimulations of network traffic.
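
    For reference, the four textbook update rules of the single-lane model can be sketched as follows (Python, cars on a ring; vmax and the slowdown probability are the usual illustrative choices):

        import random

        def nasch_step(pos, vel, road_len, vmax=5, p_slow=0.3):
            # One parallel update of the Nagel-Schreckenberg model:
            # accelerate, brake to the gap, randomize, then move.
            order = sorted(range(len(pos)), key=lambda k: pos[k])
            for idx, k in enumerate(order):
                ahead = order[(idx + 1) % len(order)]
                gap = (pos[ahead] - pos[k] - 1) % road_len
                vel[k] = min(vel[k] + 1, vmax)              # 1. accelerate
                vel[k] = min(vel[k], gap)                   # 2. avoid collision
                if vel[k] > 0 and random.random() < p_slow:
                    vel[k] -= 1                             # 3. random slowdown
            for k in range(len(pos)):
                pos[k] = (pos[k] + vel[k]) % road_len       # 4. move
            return pos, vel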

  2. Quantum cellular automata

    NASA Astrophysics Data System (ADS)

    Porod, Wolfgang; Lent, Craig S.; Bernstein, Gary H.

    1994-06-01

    The Notre Dame group has developed a new paradigm for ultra-dense and ultra-fast information processing in nanoelectronic systems. These Quantum Cellular Automata (QCA's) are the first concrete proposal for a technology based on arrays of coupled quantum dots. The basic building block of these cellular arrays is the Notre Dame Logic Cell, as it has been called in the literature. The phenomenon of Coulomb exclusion, which is a synergistic interplay of quantum confinement and Coulomb interaction, leads to a bistable behavior of each cell which makes possible their use in large-scale cellular arrays. The physical interaction between neighboring cells has been exploited to implement logic functions. New functionality may be achieved in this fashion, and the Notre Dame group invented a versatile majority logic gate. In a series of papers, the feasibility of QCA wires, wire crossing, inverters, and Boolean logic gates was demonstrated. A major finding is that all logic functions may be integrated in a hierarchical fashion which allows the design of complicated QCA structures. The most complicated system simulated to date is a one-bit full adder consisting of some 200 cells. In addition to exploring these new concepts, efforts are under way to physically realize such structures both in semiconductor and metal systems. Extensive modeling work of semiconductor quantum dot structures has helped identify optimum design parameters for QCA experimental implementations.
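
    The majority gate mentioned above is easy to state explicitly; fixing one input to 0 or 1 programs it as AND or OR, which is the standard way Boolean logic is built from QCA majority cells (Python sketch):

        def majority(a, b, c):
            # Three-input majority vote, the native QCA logic primitive.
            return int(a + b + c >= 2)

        # Fixing one input programs the gate: M(a, b, 0) = AND, M(a, b, 1) = OR.
        assert [majority(a, b, 0) for a in (0, 1) for b in (0, 1)] == [0, 0, 0, 1]
        assert [majority(a, b, 1) for a in (0, 1) for b in (0, 1)] == [0, 1, 1, 1]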

  3. Achieving microaggregation for secure statistical databases using fixed-structure partitioning-based learning automata.

    PubMed

    Fayyoumi, Ebaa; Oommen, B John

    2009-10-01

    We consider the microaggregation problem (MAP) that involves partitioning a set of individual records in a microdata file into a number of mutually exclusive and exhaustive groups. This problem, which seeks the best partition of the microdata file, is known to be NP-hard and has been tackled using many heuristic solutions. In this paper, we present the first reported fixed-structure-stochastic-automata-based solution to this problem. The newly proposed method leads to a lower value of the information loss (IL), obtains a better tradeoff between the IL and the disclosure risk (DR) when compared with state-of-the-art methods, and leads to a superior value of the scoring index, which is a criterion involving a combination of the IL and the DR. The scheme has been implemented, tested, and evaluated for different real-life and simulated data sets. The results clearly demonstrate the applicability of learning automata to the MAP and their ability to yield a solution that obtains the best tradeoff between IL and DR when compared with the state of the art.

  4. Sequential and Mixed Genetic Algorithm and Learning Automata (SGALA, MGALA) for Feature Selection in QSAR

    PubMed Central

    MotieGhader, Habib; Gharaghani, Sajjad; Masoudi-Sobhanzadeh, Yosef; Masoudi-Nejad, Ali

    2017-01-01

    Feature selection is of great importance in Quantitative Structure-Activity Relationship (QSAR) analysis. This problem has been solved using meta-heuristic algorithms such as GA, PSO, ACO and so on. In this work two novel hybrid meta-heuristic algorithms, i.e. Sequential GA and LA (SGALA) and Mixed GA and LA (MGALA), which are based on genetic algorithms and learning automata, are proposed for QSAR feature selection. The SGALA algorithm uses the advantages of genetic algorithms and learning automata sequentially, while the MGALA algorithm uses their advantages simultaneously. We applied our proposed algorithms to select the minimum possible number of features from three different datasets and observed that the MGALA and SGALA algorithms had the best outcome both individually and on average compared to other feature selection algorithms. Through comparison of our proposed algorithms, we deduced that the rate of convergence to the optimal result of the MGALA and SGALA algorithms was better than that of the GA, ACO, PSO and LA algorithms. In the end, the results of the GA, ACO, PSO, LA, SGALA, and MGALA algorithms were applied as the input of an LS-SVR model, and the results from the LS-SVR models showed that the LS-SVR model had more predictive ability with the input from the SGALA and MGALA algorithms than with the input from all other mentioned algorithms. Therefore, the results corroborate that not only is the predictive efficiency of the proposed algorithms better, but their rate of convergence is also superior to that of all the other mentioned algorithms. PMID:28979308

  5. Sequential and Mixed Genetic Algorithm and Learning Automata (SGALA, MGALA) for Feature Selection in QSAR.

    PubMed

    MotieGhader, Habib; Gharaghani, Sajjad; Masoudi-Sobhanzadeh, Yosef; Masoudi-Nejad, Ali

    2017-01-01

    Feature selection is of great importance in Quantitative Structure-Activity Relationship (QSAR) analysis. This problem has been solved using meta-heuristic algorithms such as GA, PSO, ACO and so on. In this work two novel hybrid meta-heuristic algorithms, i.e. Sequential GA and LA (SGALA) and Mixed GA and LA (MGALA), which are based on genetic algorithms and learning automata, are proposed for QSAR feature selection. The SGALA algorithm uses the advantages of genetic algorithms and learning automata sequentially, while the MGALA algorithm uses their advantages simultaneously. We applied our proposed algorithms to select the minimum possible number of features from three different datasets and observed that the MGALA and SGALA algorithms had the best outcome both individually and on average compared to other feature selection algorithms. Through comparison of our proposed algorithms, we deduced that the rate of convergence to the optimal result of the MGALA and SGALA algorithms was better than that of the GA, ACO, PSO and LA algorithms. In the end, the results of the GA, ACO, PSO, LA, SGALA, and MGALA algorithms were applied as the input of an LS-SVR model, and the results from the LS-SVR models showed that the LS-SVR model had more predictive ability with the input from the SGALA and MGALA algorithms than with the input from all other mentioned algorithms. Therefore, the results corroborate that not only is the predictive efficiency of the proposed algorithms better, but their rate of convergence is also superior to that of all the other mentioned algorithms.

  6. Urban Growth Modeling Using Cellular Automata with Multi-Temporal Remote Sensing Images Calibrated by the Artificial Bee Colony Optimization Algorithm.

    PubMed

    Naghibi, Fereydoun; Delavar, Mahmoud Reza; Pijanowski, Bryan

    2016-12-14

    Cellular Automata (CA) is one of the most common techniques used to simulate the urbanization process. CA-based urban models use transition rules to deliver spatial patterns of urban growth and urban dynamics over time. Determining the optimum transition rules of the CA is a critical step because of the heterogeneity and nonlinearities existing among urban growth driving forces. Recently, new CA models integrated with optimization methods based on swarm intelligence algorithms were proposed to overcome this drawback. The Artificial Bee Colony (ABC) algorithm is an advanced meta-heuristic swarm intelligence-based algorithm. Here, we propose a novel CA-based urban change model that uses the ABC algorithm to extract optimum transition rules. We applied the proposed ABC-CA model to simulate future urban growth in Urmia (Iran) with multi-temporal Landsat images from 1997, 2006 and 2015. Validation of the simulation results was made through statistical methods such as overall accuracy, the figure of merit and total operating characteristics (TOC). Additionally, we calibrated the CA model by ant colony optimization (ACO) to assess the performance of our proposed model versus similar swarm intelligence algorithm methods. We showed that the overall accuracy and the figure of merit of the ABC-CA model are 90.1% and 51.7%, which are 2.9% and 8.8% higher than those of the ACO-CA model, respectively. Moreover, the allocation disagreement of the simulation results for the ABC-CA model is 9.9%, which is 2.9% less than that of the ACO-CA model. Finally, the ABC-CA model also outperforms the ACO-CA model with fewer quantity and allocation errors and slightly more hits.

  7. Urban Growth Modeling Using Cellular Automata with Multi-Temporal Remote Sensing Images Calibrated by the Artificial Bee Colony Optimization Algorithm

    PubMed Central

    Naghibi, Fereydoun; Delavar, Mahmoud Reza; Pijanowski, Bryan

    2016-01-01

    Cellular Automata (CA) is one of the most common techniques used to simulate the urbanization process. CA-based urban models use transition rules to deliver spatial patterns of urban growth and urban dynamics over time. Determining the optimum transition rules of the CA is a critical step because of the heterogeneity and nonlinearities existing among urban growth driving forces. Recently, new CA models integrated with optimization methods based on swarm intelligence algorithms were proposed to overcome this drawback. The Artificial Bee Colony (ABC) algorithm is an advanced meta-heuristic swarm intelligence-based algorithm. Here, we propose a novel CA-based urban change model that uses the ABC algorithm to extract optimum transition rules. We applied the proposed ABC-CA model to simulate future urban growth in Urmia (Iran) with multi-temporal Landsat images from 1997, 2006 and 2015. Validation of the simulation results was made through statistical methods such as overall accuracy, the figure of merit and total operating characteristics (TOC). Additionally, we calibrated the CA model by ant colony optimization (ACO) to assess the performance of our proposed model versus similar swarm intelligence algorithm methods. We showed that the overall accuracy and the figure of merit of the ABC-CA model are 90.1% and 51.7%, which are 2.9% and 8.8% higher than those of the ACO-CA model, respectively. Moreover, the allocation disagreement of the simulation results for the ABC-CA model is 9.9%, which is 2.9% less than that of the ACO-CA model. Finally, the ABC-CA model also outperforms the ACO-CA model with fewer quantity and allocation errors and slightly more hits. PMID:27983633

  8. Simulating the conversion of rural settlements to town land based on multi-agent systems and cellular automata.

    PubMed

    Liu, Yaolin; Kong, Xuesong; Liu, Yanfang; Chen, Yiyun

    2013-01-01

    Rapid urbanization in China has triggered the conversion of land from rural to urban use, particularly the conversion of rural settlements to town land. This conversion is the result of the joint effects of the geographic environment and agents involving the government, investors, and farmers. To understand the dynamic interaction dominated by agents and to predict the future landscape of town expansion, a small town land-planning model is proposed based on the integration of multi-agent systems (MAS) and cellular automata (CA). The MAS-CA model links the decision-making behaviors of agents with the neighbor effect of CA. The interaction rules are projected by analyzing the preference conflicts among agents. To better illustrate the effects of the geographic environment, neighborhood, and agent behavior, a comparative analysis between the CA and MAS-CA models in three different towns is presented, revealing interesting patterns in terms of quantity, spatial characteristics, and the coordinating process. The simulation of rural settlements conversion to town land through modeling agent decision and human-environment interaction is very useful for understanding the mechanisms of rural-urban land-use change in developing countries. This process can assist town planners in formulating appropriate development plans.

  9. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy because of differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needs to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to O18/O16-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Mosquito population dynamics from cellular automata-based simulation

    NASA Astrophysics Data System (ADS)

    Syafarina, Inna; Sadikin, Rifki; Nuraini, Nuning

    2016-02-01

    In this paper we present an innovative model for simulating mosquito-vector population dynamics. The simulation consists of two stages: demography and dispersal dynamics. For the demography simulation, we follow an existing model of the mosquito life cycle. Moreover, we use a cellular automata-based model for simulating dispersal of the vector. In the simulation, each individual vector is able to move to other grid cells via a random walk. Our model is also capable of representing an immunity factor for each grid cell. We simulate the model to evaluate its correctness. Based on the simulations, we can conclude that our model is correct. However, the model needs to be improved by finding realistic parameters that match real data.

  11. A cellular automata model of SARS epidemic spreading

    NASA Astrophysics Data System (ADS)

    Xu, Tian; Zhang, Peipei; Su, Beibei; Jiang, Yumai; He, Da-Ren

    2004-03-01

    We suggest a cellular automata model for simulating the process of SARS spreading in Beijing. Suppose a number of people are located on a two-dimensional lattice, a certain portion of whom are immune while the others are acceptive (susceptible). In every time step each acceptive person may become ill with a certain probability if one of his 8 neighbors is a SARS patient. At the same time, all the people may change their positions with another probability. Each patient recovers or dies after a different number of days, and a recovered patient becomes immune. The numerical simulation of this model leads to results that are in good agreement with the actual statistical data.
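
    A stripped-down Python sketch of the infection/recovery part of such a lattice model (probabilities and illness durations are illustrative, the random relocation of people is omitted, and days_ill maps infected sites to their remaining illness days, seeded by the caller):

        import random

        # States: 'S' acceptive, 'I' infected, 'R' recovered/immune.
        def epidemic_step(grid, days_ill, p_infect=0.2, duration=(5, 15)):
            # Acceptive sites may catch the disease from any of the 8 neighbours
            # (periodic boundaries for brevity); infected sites recover after
            # their individual illness length.
            n, m = len(grid), len(grid[0])
            new = [row[:] for row in grid]
            for i in range(n):
                for j in range(m):
                    if grid[i][j] == 'S':
                        sick = any(grid[(i + di) % n][(j + dj) % m] == 'I'
                                   for di in (-1, 0, 1) for dj in (-1, 0, 1)
                                   if di or dj)
                        if sick and random.random() < p_infect:
                            new[i][j] = 'I'
                            days_ill[(i, j)] = random.randint(*duration)
                    elif grid[i][j] == 'I':
                        days_ill[(i, j)] -= 1
                        if days_ill[(i, j)] <= 0:
                            new[i][j] = 'R'
            return new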

  12. A Cellular Automata Model for the Study of Landslides

    NASA Astrophysics Data System (ADS)

    Liucci, Luisa; Suteanu, Cristian; Melelli, Laura

    2016-04-01

    Power-law scaling has been observed in the frequency distribution of landslide sizes in many regions of the world, for landslides triggered by different factors, and in both multi-temporal and post-event datasets, thus indicating the universal character of this property of landslides and suggesting that the same mechanisms drive the dynamics of mass wasting processes. The reasons for the scaling behavior of landslide sizes are widely debated, since their understanding would improve our knowledge of the spatial and temporal evolution of this phenomenon. Self-Organized Critical (SOC) dynamics and the key role of topography have been suggested as possible explanations. The scaling exponent of the landslide size-frequency distribution defines the probability of landslide magnitudes and it thus represents an important parameter for hazard assessment. Therefore, another - still unanswered - important question concerns the factors on which its value depends. This paper investigates these issues using a Cellular Automata (CA) model. The CA uses a real topographic surface acquired from a Digital Elevation Model to represent the initial state of the system, where the states of cells are defined in terms of altitude. The stability criterion is based on the slope gradient. The system is driven to instability through a temporal decrease of the stability condition of cells, which may be thought of as representing the temporal weakening of soil caused by factors like rainfall. A transition rule defines the way in which instabilities lead to discharge from unstable cells to the neighboring cells, deciding upon the landslide direction and the quantity of mass involved. Both the direction and the transferred mass depend on the local topographic features. The scaling properties of the area-frequency distributions of the resulting landslide series are investigated for several rates of weakening and for different time windows, in order to explore the response of the system to model

  13. Agent-based modeling of the immune system: NetLogo, a promising framework.

    PubMed

    Chiacchio, Ferdinando; Pennisi, Marzio; Russo, Giulia; Motta, Santo; Pappalardo, Francesco

    2014-01-01

    The interaction of several components with one another to produce complex and, in some cases, unexpected behavior is one of the main and most fascinating features of the mammalian immune system. Agent-based modeling and cellular automata belong to a class of discrete mathematical approaches in which entities (agents) sense local information and undertake actions over time according to predefined rules. The strength of this approach is characterized by the appearance of a global behavior that emerges from interactions among agents. This behavior is unpredictable, as it does not follow linear rules. There are many works that investigate the immune system with agent-based modeling and cellular automata; they have shown the ability of these approaches to give clear and intuitive insight into the nature of immunological processes. NetLogo is a multiagent programming language and modeling environment for simulating complex phenomena. It is designed for both research and education and is used across a wide range of disciplines and education levels. In this paper, we summarize NetLogo applications to immunology and, particularly, how this framework can help in the development and formulation of hypotheses that might drive further experimental investigations of disease mechanisms.

  14. Cellular automata approach for the dynamics of HIV infection under antiretroviral therapies: The role of the virus diffusion

    NASA Astrophysics Data System (ADS)

    González, Ramón E. R.; de Figueirêdo, Pedro Hugo; Coutinho, Sérgio

    2013-10-01

    We study a cellular automata model to test the timing of antiretroviral therapy strategies for the dynamics of infection with human immunodeficiency virus (HIV). We focus on the role of virus diffusion when its population is included in previous cellular automata model that describes the dynamics of the lymphocytes cells population during infection. This inclusion allows us to consider the spread of infection by the virus-cell interaction, beyond that which occurs by cell-cell contagion. The results show an acceleration of the infectious process in the absence of treatment, but show better efficiency in reducing the risk of the onset of AIDS when combined antiretroviral therapies are used even with drugs of low effectiveness. Comparison of results with clinical data supports the conclusions of this study.

  15. Exploring Spatio-temporal Dynamics of Cellular Automata for Pattern Recognition in Networks.

    PubMed

    Miranda, Gisele Helena Barboni; Machicao, Jeaneth; Bruno, Odemir Martinez

    2016-11-22

    Network science is an interdisciplinary field which provides an integrative approach for the study of complex systems. In recent years, network modeling has been used for the study of emergent phenomena in many real-world applications. Pattern recognition in networks has been drawing attention to the importance of network characterization, which may lead to understanding the topological properties that are related to the network model. In this paper, the Life-Like Network Automata (LLNA) method is introduced, which was designed for pattern recognition in networks. LLNA uses the network topology as a tessellation of Cellular Automata (CA), whose dynamics produces a spatio-temporal pattern used to extract the feature vector for network characterization. The method was evaluated using synthetic and real-world networks. In the latter, three pattern recognition applications were used: (i) identifying organisms from distinct domains of life through their metabolic networks, (ii) identifying online social networks and (iii) classifying stomata distribution patterns varying according to different lighting conditions. LLNA was compared to structural measurements and surpasses them in real-world applications, achieving improvement in the classification rate as high as 23%, 4% and 7% respectively. Therefore, the proposed method is a good choice for pattern recognition applications using networks and demonstrates potential for general applicability.

  16. Exploring Spatio-temporal Dynamics of Cellular Automata for Pattern Recognition in Networks

    PubMed Central

    Miranda, Gisele Helena Barboni; Machicao, Jeaneth; Bruno, Odemir Martinez

    2016-01-01

    Network science is an interdisciplinary field which provides an integrative approach for the study of complex systems. In recent years, network modeling has been used for the study of emergent phenomena in many real-world applications. Pattern recognition in networks has been drawing attention to the importance of network characterization, which may lead to understanding the topological properties that are related to the network model. In this paper, the Life-Like Network Automata (LLNA) method is introduced, which was designed for pattern recognition in networks. LLNA uses the network topology as a tessellation of Cellular Automata (CA), whose dynamics produces a spatio-temporal pattern used to extract the feature vector for network characterization. The method was evaluated using synthetic and real-world networks. In the latter, three pattern recognition applications were used: (i) identifying organisms from distinct domains of life through their metabolic networks, (ii) identifying online social networks and (iii) classifying stomata distribution patterns varying according to different lighting conditions. LLNA was compared to structural measurements and surpasses them in real-world applications, achieving improvement in the classification rate as high as 23%, 4% and 7% respectively. Therefore, the proposed method is a good choice for pattern recognition applications using networks and demonstrates potential for general applicability. PMID:27874024

  17. Subgrouping Automata: automatic sequence subgrouping using phylogenetic tree-based optimum subgrouping algorithm.

    PubMed

    Seo, Joo-Hyun; Park, Jihyang; Kim, Eun-Mi; Kim, Juhan; Joo, Keehyoung; Lee, Jooyoung; Kim, Byung-Gee

    2014-02-01

    Sequence subgrouping for a given sequence set can enable various informative tasks such as the functional discrimination of sequence subsets and the functional inference of unknown sequences. Because the identity threshold for sequence subgrouping may vary according to the given sequence set, it is highly desirable to construct a robust subgrouping algorithm that automatically identifies an optimal identity threshold and generates subgroups for a given sequence set. To this end, an automatic sequence subgrouping method named 'Subgrouping Automata' (SA) was constructed. First, the tree analysis module analyzes the structure of the tree and calculates all possible subgroups at each node. The sequence similarity analysis module calculates the average sequence similarity for all subgroups at each node. The representative sequence generation module finds a representative sequence for each subgroup using profile analysis and self-scoring. For all nodes, average sequence similarities are calculated, and 'Subgrouping Automata' searches for the node showing the statistically maximal increase in sequence similarity using Student's t-value. The node showing the maximum t-value, which gives the most significant difference in average sequence similarity between two adjacent nodes, is determined as the optimum subgrouping node in the phylogenetic tree. Further analysis showed that the optimum subgrouping node from SA prevents under-subgrouping and over-subgrouping. Copyright © 2013. Published by Elsevier Ltd.

  18. Exploring Spatio-temporal Dynamics of Cellular Automata for Pattern Recognition in Networks

    NASA Astrophysics Data System (ADS)

    Miranda, Gisele Helena Barboni; Machicao, Jeaneth; Bruno, Odemir Martinez

    2016-11-01

    Network science is an interdisciplinary field which provides an integrative approach for the study of complex systems. In recent years, network modeling has been used for the study of emergent phenomena in many real-world applications. Pattern recognition in networks has been drawing attention to the importance of network characterization, which may lead to understanding the topological properties that are related to the network model. In this paper, the Life-Like Network Automata (LLNA) method is introduced, which was designed for pattern recognition in networks. LLNA uses the network topology as a tessellation of Cellular Automata (CA), whose dynamics produces a spatio-temporal pattern used to extract the feature vector for network characterization. The method was evaluated using synthetic and real-world networks. In the latter, three pattern recognition applications were used: (i) identifying organisms from distinct domains of life through their metabolic networks, (ii) identifying online social networks and (iii) classifying stomata distribution patterns varying according to different lighting conditions. LLNA was compared to structural measurements and surpasses them in real-world applications, achieving improvement in the classification rate as high as 23%, 4% and 7% respectively. Therefore, the proposed method is a good choice for pattern recognition applications using networks and demonstrates potential for general applicability.
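
    A minimal sketch of the LLNA idea, as read from the abstract: run a binary, life-like cellular automaton whose tessellation is the network itself (each node updates from the density of live neighbors), record the spatio-temporal pattern, and summarize it into a feature vector. The birth/survival intervals, the number of steps and the summary feature below are illustrative assumptions, not the parameters used in the paper.

```python
import random

def llna_features(adj, steps=50, birth=(0.3, 0.7), survive=(0.2, 0.8), seed=0):
    """Life-like network automaton sketch.  adj: {node: set of neighbors}.
    A dead node becomes alive if its fraction of live neighbors falls inside
    `birth`; a live node stays alive if the fraction falls inside `survive`.
    Returns the per-step mean activity, a simple spatio-temporal summary that
    could feed a classifier."""
    rng = random.Random(seed)
    state = {v: rng.randint(0, 1) for v in adj}
    activity = []
    for _ in range(steps):
        nxt = {}
        for v, nbrs in adj.items():
            if not nbrs:
                nxt[v] = state[v]
                continue
            rho = sum(state[u] for u in nbrs) / len(nbrs)
            lo, hi = survive if state[v] else birth
            nxt[v] = 1 if lo <= rho <= hi else 0
        state = nxt
        activity.append(sum(state.values()) / len(state))
    return activity
```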

  19. Cells as strain-cued automata

    NASA Astrophysics Data System (ADS)

    Cox, Brian N.; Snead, Malcolm L.

    2016-02-01

    We argue in favor of representing living cells as automata and review demonstrations that autonomous cells can form patterns by responding to local variations in the strain fields that arise from their individual or collective motions. An autonomous cell's response to strain stimuli is assumed to be effected by internally-generated, internally-powered forces, which generally move the cell in directions other than those implied by external energy gradients. Evidence of cells acting as strain-cued automata has been inferred from patterns observed in nature and from experiments conducted in vitro. Simulations that mimic particular cases of pattern forming share the idealization that cells are assumed to pass information among themselves solely via mechanical boundary conditions, i.e., the tractions and displacements present at their membranes. This assumption opens three mechanisms for pattern formation in large cell populations: wavelike behavior, kinematic feedback in cell motility that can lead to sliding and rotational patterns, and directed migration during invasions. Wavelike behavior among ameloblast cells during amelogenesis (the formation of dental enamel) has been inferred from enamel microstructure, while strain waves in populations of epithelial cells have been observed in vitro. One hypothesized kinematic feedback mechanism, "enhanced shear motility", accounts successfully for the spontaneous formation of layered patterns during amelogenesis in the mouse incisor. Directed migration is exemplified by a theory of invader cells that sense and respond to the strains they themselves create in the host population as they invade it: analysis shows that the strain fields contain positional information that could aid the formation of cell network structures, stabilizing the slender geometry of branches and helping govern the frequency of branch bifurcation and branch coalescence (the formation of closed networks). In simulations of pattern formation in

  20. Integrating GIS, cellular automata, and genetic algorithm in urban spatial optimization: a case study of Lanzhou

    NASA Astrophysics Data System (ADS)

    Xu, Xibao; Zhang, Jianming; Zhou, Xiaojian

    2006-10-01

    This paper presents a model integrating GIS, cellular automata (CA) and a genetic algorithm (GA) for urban spatial optimization. The model involves three objectives: maximizing land-use efficiency, maximizing urban spatial harmony, and maintaining an appropriate proportion of each land-use type. The CA submodel is designed with a standard Moore neighborhood and three transition rules to maximize land-use efficiency and urban spatial harmony, according to the land-use suitability and a spatial harmony index. The GA submodel is designed with four constraints and seven steps (encoding, initializing, calculating fitness, selection, crossover, mutation and elitism) to maximize urban spatial harmony and maintain an appropriate proportion of each land-use type. GIS is used to prepare the input data sets for the model and to perform spatial analysis on the results, while CA and GA are integrated to optimize the urban spatial structure, programmed in Matlab 7 and loosely coupled with GIS. Lanzhou, a typical valley-basin city with fast urban development, is chosen as the case study. Finally, a detailed analysis and evaluation of the spatial optimization with the model are presented; the model proves to be a powerful tool for optimizing urban spatial structure and a useful supplement for urban planners and policy-makers.
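
    As a hedged illustration of the kind of CA transition rule the abstract refers to (not the authors' exact rules), the sketch below converts a cell to a target land-use class when a weighted sum of its suitability and the fraction of Moore neighbors already in that class exceeds a threshold; the weights and threshold are assumptions.

```python
import numpy as np

def ca_transition(land, suitability, target_use, w_suit=0.6, w_nbr=0.4,
                  threshold=0.7):
    """One illustrative CA step on a land-use grid with a Moore neighborhood."""
    rows, cols = land.shape
    new_land = land.copy()
    for i in range(rows):
        for j in range(cols):
            block = land[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
            same = (block == target_use).sum() - (land[i, j] == target_use)
            frac = same / (block.size - 1)          # fraction of Moore neighbors
            score = w_suit * suitability[i, j] + w_nbr * frac
            if score > threshold:
                new_land[i, j] = target_use
    return new_land
```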

  1. Cellular automata model for human articular chondrocytes migration, proliferation and cell death: An in vitro validation.

    PubMed

    Vaca-González, J J; Gutiérrez, M L; Guevara, J M; Garzón-Alvarado, D A

    2017-01-01

    Articular cartilage is characterized by a low cell density of only one cell type, chondrocytes, and has limited self-healing properties. When articular cartilage is affected by traumatic injuries, a therapeutic strategy such as autologous chondrocyte implantation is usually proposed for its treatment. This approach requires in vitro chondrocyte expansion to yield a high cell number for cell transplantation. To improve the efficiency of this procedure, it is necessary to assess cell dynamics such as migration, proliferation and cell death during culture. Computational models such as cellular automata can be used to simulate cell dynamics in order to enhance the results of cell culture procedures. This methodology has been implemented for several cell types; however, an experimental validation is required for each one. For this reason, in this research a cellular automata model, based on random-walk theory, was devised in order to predict articular chondrocyte behavior in monolayer culture during cell expansion. Results demonstrated that the cellular automata model corresponded to the observed cell dynamics and computed accurate quantitative results. Moreover, cell dynamics depend on weighted probabilities derived from experimental data, and cell behavior varies according to the cell culture period. Thus, the probabilities in the CA model differed depending on whether cells had just been seeded or were proliferating exponentially. Furthermore, in the experimental assessment a decrease in chondrocyte proliferation was observed with increasing passage number. This approach is expected to have other uses, such as enhancing articular cartilage therapies based on tissue engineering and regenerative medicine.
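
    The random-walk CA described above can be sketched as follows. This is a hedged illustration only: occupied lattice sites migrate to, proliferate into, or die toward random empty Moore neighbors with weighted probabilities, which in the paper are derived from experimental data and vary with culture time; the values below are placeholders.

```python
import random

MOORE = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]

def ca_step(cells, p_die=0.05, p_migrate=0.50, p_proliferate=0.30, rng=random):
    """One update of a random-walk CA.  cells: set of occupied (i, j) sites."""
    new_cells = set(cells)
    for cell in list(cells):
        empty = [(cell[0] + di, cell[1] + dj) for di, dj in MOORE
                 if (cell[0] + di, cell[1] + dj) not in new_cells]
        r = rng.random()
        if r < p_die:                                           # cell death
            new_cells.discard(cell)
        elif r < p_die + p_migrate and empty:                   # migration
            new_cells.discard(cell)
            new_cells.add(rng.choice(empty))
        elif r < p_die + p_migrate + p_proliferate and empty:   # proliferation
            new_cells.add(rng.choice(empty))
    return new_cells
```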

  2. A cellular automata model of land cover change to integrate urban growth with open space conservation

    EPA Science Inventory

    The preservation of riparian zones and other environmentally sensitive areas has long been recognized as one of the most cost-effective methods of managing stormwater and providing a broad range of ecosystem services. In this research, a cellular automata (CA)—Markov chain model ...

  3. Index Theory of One Dimensional Quantum Walks and Cellular Automata

    NASA Astrophysics Data System (ADS)

    Gross, D.; Nesme, V.; Vogts, H.; Werner, R. F.

    2012-03-01

    If a one-dimensional quantum lattice system is subject to one step of a reversible discrete-time dynamics, it is intuitive that as much "quantum information" as moves into any given block of cells from the left, has to exit that block to the right. For two types of such systems — namely quantum walks and cellular automata — we make this intuition precise by defining an index, a quantity that measures the "net flow of quantum information" through the system. The index supplies a complete characterization of two properties of the discrete dynamics. First, two systems S1, S2 can be "pieced together", in the sense that there is a system S which acts like S1 in one region and like S2 in some other region, if and only if S1 and S2 have the same index. Second, the index labels connected components of such systems: equality of the index is necessary and sufficient for the existence of a continuous deformation of S1 into S2. In the case of quantum walks, the index is integer-valued, whereas for cellular automata, it takes values in the group of positive rationals. In both cases, the map S ↦ ind S is a group homomorphism if composition of the discrete dynamics is taken as the group law of the quantum systems. Systems with trivial index are precisely those which can be realized by partitioned unitaries, and the prototypes of systems with non-trivial index are shifts.

  4. Coupling Cellular Automata Land Use Change with Distributed Hydrologic Models

    NASA Astrophysics Data System (ADS)

    Shu, L.; Duffy, C.

    2017-12-01

    There has been extensive research on land use change (LUC) modeling, with broad applications to simulating urban growth and changing demographic patterns across multiple scales. Land conversion is a critical issue in watershed-scale studies and is generally not treated in most watershed modeling approaches. In this study we apply spatially explicit hydrologic and land-use change models to the Conestoga Watershed in Lancaster County, Pennsylvania. The Penn State Integrated Hydrologic Model (PIHM) partitions the water balance in space and time over the urban catchment, while the coupled Cellular Automata Land Use Change model (CALUC) dynamically simulates the evolution of land use classes based on physical measures associated with population change and land use demand factors. The CALUC model is based on iteratively applying discrete rules to each individual spatial cell. The essence of the CA modeling is the calculation of the Transition Potential (TP) for conversion of a grid cell from one land use class to another. This potential combines random perturbation, suitability, accessibility, neighborhood effect, inertia effects and zonal factors. In spite of its simplicity, this CALUC model has been shown to be very effective for simulating LUC leading to the emergence of complex spatial patterns. The components of TP are derived from present land use data for land-use reanalysis and for realistic future land use scenarios. For the CALUC we use early-settlement (circa 1790) initial land class values and final, present-day (2010) land classes to calibrate the model. CALUC-PIHM dynamically simulates the hydrologic response of the conversion from pre-settlement to present land use. The simulations highlight the capability and value of dynamically coupling catchment hydrology with land use change over long time periods. Analysis of the simulation uses various metrics such as the distributed water balance, flow duration curves, etc. to show how deforestation, urbanization and
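
    A hedged sketch of the Transition Potential calculation described above: the multiplicative combination of the factors and the stochastic perturbation term are common conventions in CA land-use models and are assumptions here, not the exact formulation used for CALUC.

```python
import math
import random

def transition_potential(suitability, accessibility, neighborhood, inertia,
                         zoning, alpha=1.0, rng=random):
    """Illustrative TP for converting one grid cell to one land-use class.
    All factor inputs are assumed to lie in [0, 1]; the random perturbation
    follows the common 1 + (-ln u)^alpha form (an assumption)."""
    u = 1.0 - rng.random()                       # uniform in (0, 1]
    perturbation = 1.0 + (-math.log(u)) ** alpha
    return (perturbation * suitability * accessibility *
            neighborhood * inertia * zoning)
```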

  5. A Cellular Automata Model of Infection Control on Medical Implants

    PubMed Central

    Prieto-Langarica, Alicia; Kojouharov, Hristo; Chen-Charpentier, Benito; Tang, Liping

    2011-01-01

    S. epidermidis infections on medically implanted devices are a common problem in modern medicine due to the abundance of the bacteria. Once inside the body, S. epidermidis gather in communities called biofilms and can become extremely hard to eradicate, causing the patient serious complications. We simulate the complex S. epidermidis-Neutrophils interactions in order to determine the optimum conditions for the immune system to be able to contain the infection and avoid implant rejection. Our cellular automata model can also be used as a tool for determining the optimal amount of antibiotics for combating biofilm formation on medical implants. PMID:23543851

  6. Computing aggregate properties of preimages for 2D cellular automata.

    PubMed

    Beer, Randall D

    2017-11-01

    Computing properties of the set of precursors of a given configuration is a common problem underlying many important questions about cellular automata. Unfortunately, such computations quickly become intractable in dimension greater than one. This paper presents an algorithm, incremental aggregation, that can compute aggregate properties of the set of precursors exponentially faster than naïve approaches. The incremental aggregation algorithm is demonstrated on two problems from the two-dimensional binary Game of Life cellular automaton: precursor count distributions and higher-order mean field theory coefficients. In both cases, incremental aggregation allows us to obtain new results that were previously beyond reach.

  7. Computing aggregate properties of preimages for 2D cellular automata

    NASA Astrophysics Data System (ADS)

    Beer, Randall D.

    2017-11-01

    Computing properties of the set of precursors of a given configuration is a common problem underlying many important questions about cellular automata. Unfortunately, such computations quickly become intractable in dimension greater than one. This paper presents an algorithm—incremental aggregation—that can compute aggregate properties of the set of precursors exponentially faster than naïve approaches. The incremental aggregation algorithm is demonstrated on two problems from the two-dimensional binary Game of Life cellular automaton: precursor count distributions and higher-order mean field theory coefficients. In both cases, incremental aggregation allows us to obtain new results that were previously beyond reach.
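
    The incremental aggregation algorithm itself is not spelled out in the abstract; the sketch below only shows the naïve enumeration it improves upon, counting how many of the 512 possible 3x3 neighborhoods are precursors of a live or dead center cell under the Game of Life rule. For larger target configurations this brute force grows exponentially, which is exactly the blow-up the paper's method avoids.

```python
from itertools import product

def life_center(neigh):
    """Game of Life update for the center of a flattened 3x3 binary patch."""
    center = neigh[4]
    live = sum(neigh) - center
    return 1 if live == 3 or (center == 1 and live == 2) else 0

def count_single_cell_preimages():
    """Naive enumeration over all 2**9 neighborhoods."""
    counts = {0: 0, 1: 0}
    for neigh in product((0, 1), repeat=9):
        counts[life_center(neigh)] += 1
    return counts   # {0: 372, 1: 140}
```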

  8. Mixed-valence molecular four-dot unit for quantum cellular automata: Vibronic self-trapping and cell-cell response.

    PubMed

    Tsukerblat, Boris; Palii, Andrew; Clemente-Juan, Juan Modesto; Coronado, Eugenio

    2015-10-07

    Our interest in this article is prompted by the vibronic problem of charge polarized states in the four-dot molecular quantum cellular automata (mQCA), a paradigm for nanoelectronics, in which binary information is encoded in the charge configuration of the mQCA cell. Here, we report the evaluation of the electronic levels and adiabatic potentials of mixed-valence (MV) tetra-ruthenium (2Ru(II) + 2Ru(III)) derivatives (assembled as two coupled Creutz-Taube complexes) for which molecular implementations of quantum cellular automata (QCA) were proposed. The cell based on this molecule includes two holes shared among four spinless sites and correspondingly we employ the model which takes into account the two relevant electron transfer processes (through the side and through the diagonal of the square) as well as the difference in Coulomb energies for different instant positions of localization of the hole pair. The combined Jahn-Teller (JT) and pseudo JT vibronic coupling is treated within the conventional Piepho-Krausz-Schatz model adapted to a bi-electronic MV species with the square-planar topology. The adiabatic potentials are evaluated for the low lying Coulomb levels in which the antipodal sites are occupied, the case relevant for utilization in mQCA. The conditions for vibronic self-trapping in spin-singlet and spin-triplet states are revealed in terms of the two transfer pathway parameters and the strength of the vibronic coupling. Spin-related effects in the degree of localization found for spin-singlet and spin-triplet states are discussed. The polarization of the cell is evaluated and we demonstrate how the partial delocalization caused by the joint action of the vibronic coupling and electron transfer processes influences the polarization of a four-dot cell. The results obtained within the adiabatic approach are compared with those based on the numerical solution of the dynamic vibronic problem. Finally, the Coulomb interaction between

  9. An image encryption algorithm based on 3D cellular automata and chaotic maps

    NASA Astrophysics Data System (ADS)

    Del Rey, A. Martín; Sánchez, G. Rodríguez

    2015-05-01

    A novel encryption algorithm to cipher digital images is presented in this work. The digital image is rendered into a three-dimensional (3D) lattice and the protocol consists of two phases: a confusion phase where 24 chaotic Cat maps are applied and a diffusion phase where a 3D cellular automaton is evolved. The encryption method is shown to be secure against the most important cryptanalytic attacks.
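
    The paper's confusion phase applies chaotic Cat maps on a 3D lattice; as a simplified 2D stand-in (an illustration, not the paper's construction), the classical Arnold cat map permutes the pixels of a square image as follows.

```python
import numpy as np

def arnold_cat(image, iterations=1):
    """Apply the 2D Arnold cat map (x, y) -> (x + y, x + 2y) mod N to a
    square image.  The map is area-preserving and invertible, so it scrambles
    pixel positions without losing information."""
    n = image.shape[0]
    assert image.shape[0] == image.shape[1], "square image required"
    out = image
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out
```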

  10. Algebraic properties of automata associated to Petri nets and applications to computation in biological systems.

    PubMed

    Egri-Nagy, Attila; Nehaniv, Chrystopher L

    2008-01-01

    Biochemical and genetic regulatory networks are often modeled by Petri nets. We study the algebraic structure of the computations carried out by Petri nets from the viewpoint of algebraic automata theory. Petri nets comprise a formalized graphical modeling language, often used to describe computation occurring within biochemical and genetic regulatory networks, but the semantics may be interpreted in different ways in the realm of automata. Therefore, there are several different ways to turn a Petri net into a state-transition automaton. Here, we systematically investigate different conversion methods and describe cases where they may yield radically different algebraic structures. We focus on the existence of group components of the corresponding transformation semigroups, as these reflect symmetries of the computation occurring within the biological system under study. Results are illustrated by applications to the Petri net modelling of intermediary metabolism. Petri nets with inhibition are shown to be computationally rich, regardless of the particular interpretation method. Along these lines we provide a mathematical argument suggesting a reason for the apparent all-pervasiveness of inhibitory connections in living systems.
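
    One generic way to turn a Petri net into a state-transition automaton, of the kind the abstract compares, is the reachability construction sketched below: automaton states are markings and each enabled transition induces a labeled edge. This is a standard textbook construction offered for orientation, not necessarily one of the specific conversion methods analyzed in the paper, and the `capacity` bound is only there to keep the example finite.

```python
from collections import deque

def petri_to_automaton(places, transitions, initial, capacity=None):
    """places: list of place names.  transitions: {name: (consume, produce)}
    with consume/produce given as {place: tokens}.  initial: {place: tokens}.
    Returns (states, edges) of the reachability automaton."""
    idx = {p: i for i, p in enumerate(places)}
    start = tuple(initial.get(p, 0) for p in places)
    states, edges, queue = {start}, [], deque([start])
    while queue:
        m = queue.popleft()
        for name, (consume, produce) in transitions.items():
            if all(m[idx[p]] >= k for p, k in consume.items()):
                nxt = list(m)
                for p, k in consume.items():
                    nxt[idx[p]] -= k
                for p, k in produce.items():
                    nxt[idx[p]] += k
                if capacity is not None and any(v > capacity for v in nxt):
                    continue                    # keep the state space finite
                nxt = tuple(nxt)
                edges.append((m, name, nxt))
                if nxt not in states:
                    states.add(nxt)
                    queue.append(nxt)
    return states, edges
```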

  11. The Design of Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, C. Duane; Humphreys, William M.; Fijany, Amir

    2002-01-01

    As transistor geometries are reduced, quantum effects begin to dominate device performance. At some point, transistors cease to have the properties that make them useful computational components. New computing elements must be developed in order to keep pace with Moore's Law. Quantum dot cellular automata (QCA) represent an alternative paradigm to transistor-based logic. QCA architectures that are robust to manufacturing tolerances and defects must be developed. We are developing software that allows the exploration of fault tolerant QCA gate architectures by automating the specification, simulation, analysis and documentation processes.

  12. Collective dynamics in heterogeneous networks of neuronal cellular automata

    NASA Astrophysics Data System (ADS)

    Manchanda, Kaustubh; Bose, Amitabha; Ramaswamy, Ramakrishna

    2017-12-01

    We examine the collective dynamics of heterogeneous random networks of model neuronal cellular automata. Each automaton has b active states, a single silent state and r - b - 1 refractory states, and can show 'spiking' or 'bursting' behavior, depending on the values of b. We show that phase transitions that occur in the dynamical activity can be related to phase transitions in the structure of Erdős-Rényi graphs as a function of edge probability. Different forms of heterogeneity allow distinct structural phase transitions to become relevant. We also show that the dynamics on the network can be described by a semi-annealed process and, as a result, can be related to the Boolean Lyapunov exponent.

  13. A Study of Chaos in Cellular Automata

    NASA Astrophysics Data System (ADS)

    Kamilya, Supreeti; Das, Sukanta

    This paper presents a study of chaos in one-dimensional cellular automata (CAs). The communication of information from one part of the system to another has been taken into consideration in this study. This communication is formalized as a binary relation over the set of cells. It is shown that this relation is an equivalence relation and all the cells form a single equivalence class when the cellular automaton (CA) is chaotic. However, the communication between two cells is sometimes blocked in some CAs by a subconfiguration which appears in between the cells during evolution. This blocking of communication by a subconfiguration has been analyzed in this paper with the help of de Bruijn graph. We identify two types of blocking — full and partial. Finally a parameter has been developed for the CAs. We show that the proposed parameter performs better than the existing parameters.

  14. An energy and cost efficient majority-based RAM cell in quantum-dot cellular automata

    NASA Astrophysics Data System (ADS)

    Khosroshahy, Milad Bagherian; Moaiyeri, Mohammad Hossein; Navi, Keivan; Bagherzadeh, Nader

    Nanotechnologies, notably quantum-dot cellular automata, have attracted major attention for their prominent features compared to conventional CMOS circuitry. Quantum-dot cellular automata, particularly owing to its considerable reduction in size, high switching speed and ultra-low energy consumption, is considered a potential alternative to CMOS technology. As the memory unit is one of the most essential components of a digital system, designing a well-optimized QCA random access memory (RAM) cell is an important area of research. In this paper, a new five-input majority gate is presented which is suitable for implementing efficient single-layer QCA circuits. In addition, a new RAM cell with set and reset capabilities is designed based on the proposed majority gate, which has an efficient and low-energy structure. The functionality, performance and energy consumption of the proposed designs are evaluated with the QCADesigner and QCAPro tools. According to the simulation results, the proposed RAM design leads to on average 38% lower total energy dissipation, 25% smaller area, 20% lower cell count, 28% lower delay and 60% lower QCA cost compared to its previous counterparts.
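
    For orientation, the Boolean function computed by a five-input majority gate, together with the standard way a three-input majority gate realizes AND and OR by pinning one input to 0 or 1, is shown below; this reflects general QCA majority logic, not the specific cell layout proposed in the paper.

```python
def maj(*bits):
    """Majority vote over an odd number of binary inputs."""
    assert len(bits) % 2 == 1
    return int(sum(bits) > len(bits) // 2)

def AND(a, b):
    return maj(a, b, 0)        # pin one input of a 3-input majority to 0

def OR(a, b):
    return maj(a, b, 1)        # pin one input of a 3-input majority to 1

assert maj(1, 0, 1, 1, 0) == 1     # five-input majority gate
assert AND(1, 1) == 1 and OR(0, 0) == 0
```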

  15. Simulating the Conversion of Rural Settlements to Town Land Based on Multi-Agent Systems and Cellular Automata

    PubMed Central

    Liu, Yaolin; Kong, Xuesong; Liu, Yanfang; Chen, Yiyun

    2013-01-01

    Rapid urbanization in China has triggered the conversion of land from rural to urban use, particularly the conversion of rural settlements to town land. This conversion is the result of the joint effects of the geographic environment and agents involving the government, investors, and farmers. To understand the dynamic interaction dominated by agents and to predict the future landscape of town expansion, a small town land-planning model is proposed based on the integration of multi-agent systems (MAS) and cellular automata (CA). The MAS-CA model links the decision-making behaviors of agents with the neighbor effect of CA. The interaction rules are projected by analyzing the preference conflicts among agents. To better illustrate the effects of the geographic environment, neighborhood, and agent behavior, a comparative analysis between the CA and MAS-CA models in three different towns is presented, revealing interesting patterns in terms of quantity, spatial characteristics, and the coordinating process. The simulation of rural settlements conversion to town land through modeling agent decision and human-environment interaction is very useful for understanding the mechanisms of rural-urban land-use change in developing countries. This process can assist town planners in formulating appropriate development plans. PMID:24244472

  16. Extending Linear Models to Non-Linear Contexts: An In-Depth Study about Two University Students' Mathematical Productions

    ERIC Educational Resources Information Center

    Esteley, Cristina; Villarreal, Monica; Alagia, Humberto

    2004-01-01

    This research report presents a study of the work of agronomy majors in which an extension of linear models to non-linear contexts can be observed. By linear models we mean the model y=a.x+b, some particular representations of direct proportionality and the diagram for the rule of three. Its presence and persistence in different types of problems…

  17. Learning and Tuning of Fuzzy Rules

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1997-01-01

    In this chapter, we review some of the current techniques for learning and tuning fuzzy rules. For clarity, we refer to the process of generating rules from data as the learning problem and distinguish it from tuning an already existing set of fuzzy rules. For learning, we touch on unsupervised learning techniques such as fuzzy c-means, fuzzy decision tree systems, fuzzy genetic algorithms, and linear fuzzy rules generation methods. For tuning, we discuss Jang's ANFIS architecture, Berenji-Khedkar's GARIC architecture and its extensions in GARIC-Q. We show that the hybrid techniques capable of learning and tuning fuzzy rules, such as CART-ANFIS, RNN-FLCS, and GARIC-RB, are desirable in development of a number of future intelligent systems.
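
    Of the learning techniques listed above, fuzzy c-means is compact enough to sketch. The implementation below follows the standard alternating update of cluster centers and fuzzy memberships; it is a generic illustration, not code from the chapter.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, tol=1e-5, seed=0):
    """Standard fuzzy c-means.  X: (n, d) data array.
    Returns (cluster centers, membership matrix)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                  # memberships sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        ratio = dist[:, :, None] / dist[:, None, :]    # d_ik / d_ij
        new_U = 1.0 / (ratio ** (2.0 / (m - 1.0))).sum(axis=2)
        if np.abs(new_U - U).max() < tol:
            return centers, new_U
        U = new_U
    return centers, U
```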

  18. Simulations of Living Cell Origins Using a Cellular Automata Model

    NASA Astrophysics Data System (ADS)

    Ishida, Takeshi

    2014-04-01

    Understanding the generalized mechanisms of cell self-assembly is fundamental for applications in various fields, such as mass producing molecular machines in nanotechnology. Thus, the details of real cellular reaction networks and the necessary conditions for self-organized cells must be elucidated. We constructed a 2-dimensional cellular automata model to investigate the emergence of biological cell formation, which incorporated a looped membrane and a membrane-bound information system (akin to a genetic code and gene expression system). In particular, with an artificial reaction system coupled with a thermal system, the simultaneous formation of a looped membrane and an inner reaction process resulted in a more stable structure. These double structures suggest how a primitive biological cell formation process could have emerged from the chemical evolution stage. With this 2-dimensional cellular automata model of cellular self-organization, 3 phenomena could be realized: (1) an inner reaction system developed as an information carrier precursor (akin to DNA); (2) a cell border emerged (akin to a cell membrane); and (3) these cell structures could divide into 2. This double-structured cell was considered to be a primary biological cell. The outer loop evolved toward a lipid bilayer membrane, and inner polymeric particles evolved toward precursor information carriers (evolved toward DNA). This model did not completely clarify all the necessary and sufficient conditions for biological cell self-organization. Further, our virtual cells remained unstable and fragile. However, the "garbage bag model" of Dyson proposed that the first living cells were deficient; thus, it would be reasonable that the earliest cells were more unstable and fragile than the simplest current unicellular organisms.

  19. Simulations of living cell origins using a cellular automata model.

    PubMed

    Ishida, Takeshi

    2014-04-01

    Understanding the generalized mechanisms of cell self-assembly is fundamental for applications in various fields, such as mass producing molecular machines in nanotechnology. Thus, the details of real cellular reaction networks and the necessary conditions for self-organized cells must be elucidated. We constructed a 2-dimensional cellular automata model to investigate the emergence of biological cell formation, which incorporated a looped membrane and a membrane-bound information system (akin to a genetic code and gene expression system). In particular, with an artificial reaction system coupled with a thermal system, the simultaneous formation of a looped membrane and an inner reaction process resulted in a more stable structure. These double structures suggest how a primitive biological cell formation process could have emerged from the chemical evolution stage. With this 2-dimensional cellular automata model of cellular self-organization, 3 phenomena could be realized: (1) an inner reaction system developed as an information carrier precursor (akin to DNA); (2) a cell border emerged (akin to a cell membrane); and (3) these cell structures could divide into 2. This double-structured cell was considered to be a primary biological cell. The outer loop evolved toward a lipid bilayer membrane, and inner polymeric particles evolved toward precursor information carriers (evolved toward DNA). This model did not completely clarify all the necessary and sufficient conditions for biological cell self-organization. Further, our virtual cells remained unstable and fragile. However, the "garbage bag model" of Dyson proposed that the first living cells were deficient; thus, it would be reasonable that the earliest cells were more unstable and fragile than the simplest current unicellular organisms.

  20. The quasi-optimality criterion in the linear functional strategy

    NASA Astrophysics Data System (ADS)

    Kindermann, Stefan; Pereverzyev, Sergiy, Jr.; Pilipenko, Andrey

    2018-07-01

    The linear functional strategy for the regularization of inverse problems is considered. For selecting the regularization parameter therein, we propose the heuristic quasi-optimality principle and some modifications including the smoothness of the linear functionals. We prove convergence rates for the linear functional strategy with these heuristic rules taking into account the smoothness of the solution and the functionals and imposing a structural condition on the noise. Furthermore, we study these noise conditions in both a deterministic and stochastic setup and verify that for mildly-ill-posed problems and Gaussian noise, these conditions are satisfied almost surely, where on the contrary, in the severely-ill-posed case and in a similar setup, the corresponding noise condition fails to hold. Moreover, we propose an aggregation method for adaptively optimizing the parameter choice rule by making use of improved rates for linear functionals. Numerical results indicate that this method yields better results than the standard heuristic rule.
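
    For readers unfamiliar with the rule, the basic (unmodified) quasi-optimality principle selects the regularization parameter on a geometric grid by minimizing the difference between consecutive regularized solutions; in its standard form (the paper's smoothness-adapted modifications for linear functionals are not reproduced here):

```latex
\alpha_k = \alpha_0\, q^{k}, \quad 0 < q < 1, \qquad
k_{*} \in \arg\min_{k}\, \bigl\| x^{\delta}_{\alpha_{k+1}} - x^{\delta}_{\alpha_{k}} \bigr\| ,
```

    where $x^{\delta}_{\alpha}$ denotes the regularized solution computed from the noisy data.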

  1. Cramer's Rule Revisited

    ERIC Educational Resources Information Center

    Ayoub, Ayoub B.

    2005-01-01

    In 1750, the Swiss mathematician Gabriel Cramer published a well-written algebra book entitled "Introduction a l'Analyse des Lignes Courbes Algebriques." In the appendix to this book, Cramer gave, without proof, the rule named after him for solving a linear system of equations using determinants (Kosinki, 2001). Since then several derivations of…
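
    For reference, the rule Cramer stated: for a linear system $Ax = b$ with a square coefficient matrix $A$ and $\det A \neq 0$,

```latex
x_i = \frac{\det A_i}{\det A}, \qquad i = 1, \dots, n,
```

    where $A_i$ is the matrix $A$ with its $i$-th column replaced by $b$.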

  2. Amplitudes for multiphoton quantum processes in linear optics

    NASA Astrophysics Data System (ADS)

    Urías, Jesús

    2011-07-01

    The prominent role that linear optical networks have acquired in the engineering of photon states calls for physically intuitive and automatic methods to compute the probability amplitudes for the multiphoton quantum processes occurring in linear optics. A version of Wick's theorem for the expectation value, on any vector state, of products of linear operators, in general, is proved. We use it to extract the combinatorics of any multiphoton quantum processes in linear optics. The result is presented as a concise rule to write down directly explicit formulae for the probability amplitude of any multiphoton process in linear optics. The rule achieves a considerable simplification and provides an intuitive physical insight about quantum multiphoton processes. The methodology is applied to the generation of high-photon-number entangled states by interferometrically mixing coherent light with spontaneously down-converted light.

  3. Decomposition of timed automata for solving scheduling problems

    NASA Astrophysics Data System (ADS)

    Nishi, Tatsushi; Wakatake, Masato

    2014-03-01

    A decomposition algorithm for scheduling problems based on a timed automata (TA) model is proposed. The problem is represented as an optimal state transition problem for TA. The model comprises the parallel composition of submodels such as jobs and resources. The procedure of the proposed methodology can be divided into two steps. The first step is to decompose the TA model into several submodels by using a decomposability condition. The second step is to combine the individual solutions of the subproblems for the decomposed submodels by the penalty function method. A feasible solution for the entire model is derived through the iterated computation of solving the subproblem for each submodel. The proposed methodology is applied to solve flowshop and jobshop scheduling problems. Computational experiments demonstrate the effectiveness of the proposed algorithm compared with a conventional TA scheduling algorithm without decomposition.

  4. Simple and Flexible Self-Reproducing Structures in Asynchronous Cellular Automata and Their Dynamics

    NASA Astrophysics Data System (ADS)

    Huang, Xin; Lee, Jia; Yang, Rui-Long; Zhu, Qing-Sheng

    2013-03-01

    Self-reproduction on asynchronous cellular automata (ACAs) has attracted wide attention due to the evident artifacts induced by synchronous updating. Asynchronous updating, which allows cells to undergo transitions independently at random times, might be more compatible with the natural processes occurring at micro-scale, but the downside is the increase in the complexity of an ACA required to accomplish stable self-reproduction. This paper proposes a novel model of self-timed cellular automata (STCAs), a special type of ACAs, where unsheathed loops are able to duplicate themselves reliably in parallel. The removal of the sheath not only allows various loops with more flexible and compact structures to replicate themselves, but also reduces the number of cell states of the STCA compared to the previous model adopting sheathed loops [Y. Takada, T. Isokawa, F. Peper and N. Matsui, Physica D 227, 26 (2007)]. The lack of a sheath, on the other hand, often tends to cause much more complicated interactions among loops when all of them struggle independently to stretch out their constructing arms at the same time. In particular, such intense collisions may even cause the emergence of a mess of twisted constructing arms in the cellular space. By using a simple and natural method, our self-reproducing loops (SRLs) are able to retract their arms successively, thereby disentangling from the mess successfully.

  5. Estimation of daily Snow Cover Area combining MODIS and LANDSAT information by using cellular automata

    NASA Astrophysics Data System (ADS)

    Pardo-Iguzquiza, Eulogio; Juan Collados Lara, Antonio; Pulido-Velazquez, David

    2016-04-01

    Snow availability in Alpine catchments is essential for the economy of these areas. It plays an important role not only in tourist development but also in the management of water resources. Snow is an important water resource in many river basins with mountains in the catchment area. The determination of the snow water equivalent requires the estimation of the evolution of the snow pack (cover area, thickness and snow density) over time. Although there are complex physical models of the dynamics of the snow pack, the available data are sometimes scarce, and a stochastic model like a cellular automaton (CA) can be of great practical interest. A CA can be used to model the dynamics of growth and wane of the snow pack. The CA is calibrated with historical data. This requires the determination of transition rules that are capable of modeling the evolution of the spatial pattern of the snow cover area. Furthermore, a CA requires the definition of states and neighborhoods. We have included topographical and climatological variables in order to define the state of each pixel. The evolution of snow cover in a pixel depends on its state, the state of the neighboring pixels and the transition rules. The calibration of the CA is done using daily MODIS data, available from 24/02/2002 to the present with a spatial resolution of 500 m, and LANDSAT information, available with a sixteen-day periodicity from 1984 to the present and with a spatial resolution of 30 m. The methodology has been applied to the estimation of the snow cover area of the Sierra Nevada mountain range in southern Spain to obtain daily snow cover area information with 500 m spatial resolution for the period 1980-2014. Acknowledgments: This research has been partially supported by the GESINHIMPADAPT project (CGL2013-48424-C2-2-R) with Spanish MINECO funds. We would also like to thank the NASA DAAC and the LANDSAT project for the data provided for this study.

  6. Topometry optimization of sheet metal structures for crashworthiness design using hybrid cellular automata

    NASA Astrophysics Data System (ADS)

    Mozumder, Chandan K.

    The objective in crashworthiness design is to generate plastically deformable energy absorbing structures which can satisfy the prescribed force-displacement (FD) response. The FD behavior determines the reaction force, displacement and the internal energy that the structure should withstand. However, attempts to include this requirement in structural optimization problems remain scarce. The existing commercial optimization tools utilize models under static loading conditions because of the complexities associated with dynamic/impact loading. Due to the complexity of a crash event and the consequent time required to numerically analyze the dynamic response of the structure, classical methods (i.e., gradient-based and direct) are not well developed for solving this problem. This work presents an approach under the framework of the hybrid cellular automaton (HCA) method to solve the above challenge. The HCA method has been successfully applied to nonlinear transient topology optimization for crashworthiness design. In this work, the HCA algorithm has been utilized to develop an efficient methodology for synthesizing shell-based sheet metal structures with optimal material thickness distribution under a dynamic loading event using topometry optimization. This method utilizes the cellular automata (CA) computing paradigm and nonlinear transient finite element analysis (FEA) via ls-dyna. In this method, a set of field variables is driven to their target states by changing a convenient set of design variables (e.g., thickness). These rules operate locally in cells within a lattice, each of which knows only local conditions. The field variables associated with the cells are driven to a setpoint to obtain the desired structure. This methodology is used to design structures with controlled energy absorption and specified buckling zones. The peak reaction force and the maximum displacement are also constrained to meet the desired safety level according to passenger safety
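
    The local rule at the heart of the HCA method can be sketched as a simple control law that nudges each cell's design variable (here, thickness) until the field variable averaged over the cell and its lattice neighbors reaches the setpoint. The proportional form, gain and bounds below are illustrative assumptions, not the paper's specific update.

```python
def hca_cell_update(thickness, field, neighbor_fields, setpoint,
                    gain=0.05, t_min=0.5, t_max=5.0):
    """One illustrative HCA cell rule: average the field variable (e.g. an
    internal energy density) over the cell and its neighbors, then move the
    thickness toward the setpoint proportionally and clip to bounds."""
    local = (field + sum(neighbor_fields)) / (1 + len(neighbor_fields))
    new_thickness = thickness + gain * (local - setpoint)
    return min(max(new_thickness, t_min), t_max)
```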

  7. History dependent quantum random walks as quantum lattice gas automata

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shakeel, Asif; Love, Peter J.; Meyer, David A.

    Quantum Random Walks (QRW) were first defined as one-particle sectors of Quantum Lattice Gas Automata (QLGA). Recently, they have been generalized to include history dependence, either on previous coin (internal, i.e., spin or velocity) states or on previous position states. These models have the goal of studying the transition to classicality, or more generally, changes in the performance of quantum walks in algorithmic applications. We show that several history dependent QRW can be identified as one-particle sectors of QLGA. This provides a unifying conceptual framework for these models in which the extra degrees of freedom required to store the history information arise naturally as geometrical degrees of freedom on the lattice.

  8. Discovering Sentinel Rules for Business Intelligence

    NASA Astrophysics Data System (ADS)

    Middelfart, Morten; Pedersen, Torben Bach

    This paper proposes the concept of sentinel rules for multi-dimensional data that warns users when measure data concerning the external environment changes. For instance, a surge in negative blogging about a company could trigger a sentinel rule warning that revenue will decrease within two months, so a new course of action can be taken. Hereby, we expand the window of opportunity for organizations and facilitate successful navigation even though the world behaves chaotically. Since sentinel rules are at the schema level as opposed to the data level, and operate on data changes as opposed to absolute data values, we are able to discover strong and useful sentinel rules that would otherwise be hidden when using sequential pattern mining or correlation techniques. We present a method for sentinel rule discovery and an implementation of this method that scales linearly on large data volumes.

  9. Simulating Space Radiation-Induced Breast Tumor Incidence Using Automata.

    PubMed

    Heuskin, A C; Osseiran, A I; Tang, J; Costes, S V

    2016-07-01

    Estimating cancer risk from space radiation has been an ongoing challenge for decades primarily because most of the reported epidemiological data on radiation-induced risks are derived from studies of atomic bomb survivors who were exposed to an acute dose of gamma rays instead of chronic high-LET cosmic radiation. In this study, we introduce a formalism using cellular automata to model the long-term effects of ionizing radiation in human breast for different radiation qualities. We first validated and tuned parameters for an automata-based two-stage clonal expansion model simulating the age dependence of spontaneous breast cancer incidence in an unexposed U.S. population. We then tested the impact of radiation perturbation in the model by modifying parameters to reflect both targeted and nontargeted radiation effects. Targeted effects (TE) reflect the immediate impact of radiation on a cell's DNA with classic end points being gene mutations and cell death. They are well known and are directly derived from experimental data. In contrast, nontargeted effects (NTE) are persistent and affect both damaged and undamaged cells, are nonlinear with dose and are not well characterized in the literature. In this study, we introduced TE in our model and compared predictions against epidemiologic data of the atomic bomb survivor cohort. TE alone are not sufficient for inducing enough cancer. NTE independent of dose and lasting ∼100 days postirradiation need to be added to accurately predict dose dependence of breast cancer induced by gamma rays. Finally, by integrating experimental relative biological effectiveness (RBE) for TE and keeping NTE (i.e., radiation-induced genomic instability) constant with dose and LET, the model predicts that RBE for breast cancer induced by cosmic radiation would be maximum at 220 keV/μm. This approach lays the groundwork for further investigation into the impact of chronic low-dose exposure, inter-individual variation and more complex space radiation

  10. Dynamics of HIV infection on 2D cellular automata

    NASA Astrophysics Data System (ADS)

    Benyoussef, A.; HafidAllah, N. El; ElKenz, A.; Ez-Zahraouy, H.; Loulidi, M.

    2003-05-01

    We use a cellular automata approach to describe the interactions of the immune system with the human immunodeficiency virus (HIV). We study the evolution of HIV infection, particularly in the clinical latency period. The results we have obtained show the existence of four different behaviours in the plane spanned by the death rate of the virus and the death rate of infected T cells. These regions meet at a critical point, where the virus density and the infected T cell density remain invariant during the evolution of the disease. We have introduced two kinds of treatments, the protease inhibitors and the RT inhibitors, in order to study their effects on the evolution of HIV infection. These treatments are effective in decreasing the density of the virus in the blood and in delaying the onset of AIDS.

  11. Exact results of 1D traffic cellular automata: The low-density behavior of the Fukui-Ishibashi model

    NASA Astrophysics Data System (ADS)

    Salcido, Alejandro; Hernández-Zapata, Ernesto; Carreón-Sierra, Susana

    2018-03-01

    The maximum entropy states of the cellular automata models for traffic flow in a single-lane with no anticipation are presented and discussed. The exact analytical solutions for the low-density behavior of the stochastic Fukui-Ishibashi traffic model were obtained and compared with computer simulations of the model. An excellent agreement was found.
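
    For context, the deterministic Fukui-Ishibashi update advances every car by the minimum of its headway and v_max in a single step; the stochastic variant is commonly taken to slow down, with some probability, only the cars able to move at full speed. The ring-road sketch below follows that common form and is an illustration, not the exact model variant solved in the paper.

```python
import random

def fi_step(positions, road_length, v_max=5, p_slow=0.1, rng=random):
    """One parallel update of a Fukui-Ishibashi-type model on a ring.
    positions: car positions listed in driving order around the ring."""
    n = len(positions)
    updated = []
    for i, x in enumerate(positions):
        ahead = positions[(i + 1) % n]
        gap = (ahead - x - 1) % road_length        # empty sites in front
        v = min(gap, v_max)
        if v == v_max and rng.random() < p_slow:   # stochastic delay at full speed
            v -= 1
        updated.append((x + v) % road_length)
    return updated
```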

  12. Mixed-valence molecular four-dot unit for quantum cellular automata: Vibronic self-trapping and cell-cell response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsukerblat, Boris; Palii, Andrew; Clemente-Juan, Juan Modesto

    2015-10-07

    Our interest in this article is prompted by the vibronic problem of charge polarized states in the four-dot molecular quantum cellular automata (mQCA), a paradigm for nanoelectronics, in which binary information is encoded in the charge configuration of the mQCA cell. Here, we report the evaluation of the electronic levels and adiabatic potentials of mixed-valence (MV) tetra-ruthenium (2Ru(II) + 2Ru(III)) derivatives (assembled as two coupled Creutz-Taube complexes) for which molecular implementations of quantum cellular automata (QCA) were proposed. The cell based on this molecule includes two holes shared among four spinless sites and correspondingly we employ the model which takes into account the two relevant electron transfer processes (through the side and through the diagonal of the square) as well as the difference in Coulomb energies for different instant positions of localization of the hole pair. The combined Jahn-Teller (JT) and pseudo JT vibronic coupling is treated within the conventional Piepho-Krausz-Schatz model adapted to a bi-electronic MV species with the square-planar topology. The adiabatic potentials are evaluated for the low lying Coulomb levels in which the antipodal sites are occupied, the case relevant for utilization in mQCA. The conditions for vibronic self-trapping in spin-singlet and spin-triplet states are revealed in terms of the two transfer pathway parameters and the strength of the vibronic coupling. Spin-related effects in the degree of localization found for spin-singlet and spin-triplet states are discussed. The polarization of the cell is evaluated and we demonstrate how the partial delocalization caused by the joint action of the vibronic coupling and electron transfer processes influences the polarization of a four-dot cell. The results obtained within the adiabatic approach are compared with those based on the numerical solution of the dynamic vibronic problem. Finally, the Coulomb interaction

  13. Linear discriminant analysis with misallocation in training samples

    NASA Technical Reports Server (NTRS)

    Chhikara, R. (Principal Investigator); Mckeon, J.

    1982-01-01

    Linear discriminant analysis for a two-class case is studied in the presence of misallocation in training samples. A general approach to modeling misallocation is formulated, and the mean vectors and covariance matrices of the mixture distributions are derived. The asymptotic distribution of the discriminant boundary is obtained, and the asymptotic first two moments of the two types of error rate are given. Certain numerical results for the error rates are presented by considering the random and two non-random misallocation models. It is shown that when the allocation procedure for training samples is objectively formulated, the effect of misallocation on the error rates of the Bayes linear discriminant rule can almost be eliminated. If, however, this is not possible, the Fisher rule may be preferred over the Bayes rule.

  14. Field validation of a free-agent cellular automata model of fire spread with fire–atmosphere coupling

    Treesearch

    Gary Achtemeier

    2012-01-01

    A cellular automata fire model represents ‘elements’ of fire by autonomous agents. A few simple algebraic expressions substituted for complex physical and meteorological processes and solved iteratively yield simulations for ‘super-diffusive’ fire spread and coupled surface-layer (2-m) fire–atmosphere processes. Pressure anomalies, which are integrals of the thermal...

  15. Fast and Epsilon-Optimal Discretized Pursuit Learning Automata.

    PubMed

    Zhang, JunQi; Wang, Cheng; Zhou, MengChu

    2015-10-01

    Learning automata (LA) are powerful tools for reinforcement learning. A discretized pursuit LA is the most popular one among them. During an iteration its operation consists of three basic phases: 1) selecting the next action; 2) finding the optimal estimated action; and 3) updating the state probability. However, when the number of actions is large, the learning becomes extremely slow because there are too many updates to be made at each iteration. The increased updates are mostly from phases 1 and 3. A new fast discretized pursuit LA with assured ε-optimality is proposed to perform both phases 1 and 3 with the computational complexity independent of the number of actions. Apart from its low computational complexity, it achieves faster convergence speed than the classical one when operating in stationary environments. This paper can promote the applications of LA toward the large-scale-action oriented area that requires efficient reinforcement learning tools with assured ε-optimality, fast convergence speed, and low computational complexity for each iteration.
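
    A minimal sketch of the three phases named above, in the spirit of the classical discretized pursuit scheme (this is not the accelerated algorithm proposed in the paper, and the resolution parameter and reward bookkeeping are assumptions):

```python
import random

class DiscretizedPursuitLA:
    def __init__(self, n_actions, resolution=100, rng=random):
        self.n = n_actions
        self.delta = 1.0 / (n_actions * resolution)     # smallest probability step
        self.p = [1.0 / n_actions] * n_actions          # action probabilities
        self.reward_sum = [0.0] * n_actions
        self.count = [0] * n_actions
        self.rng = rng

    def select_action(self):                            # phase 1
        r, acc = self.rng.random(), 0.0
        for a, pa in enumerate(self.p):
            acc += pa
            if r <= acc:
                return a
        return self.n - 1

    def update(self, action, reward):                   # phases 2 and 3
        self.reward_sum[action] += reward
        self.count[action] += 1
        estimates = [s / c if c else 0.0
                     for s, c in zip(self.reward_sum, self.count)]
        best = max(range(self.n), key=estimates.__getitem__)
        for a in range(self.n):                         # pursue the best estimate
            if a != best:
                moved = min(self.delta, self.p[a])
                self.p[a] -= moved
                self.p[best] += moved
```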

  16. Stochastic cellular automata model for stock market dynamics

    NASA Astrophysics Data System (ADS)

    Bartolozzi, M.; Thomas, A. W.

    2004-04-01

    In the present work we introduce a stochastic cellular automata model in order to simulate the dynamics of the stock market. A direct percolation method is used to create a hierarchy of clusters of active traders on a two-dimensional grid. Active traders are characterized by the decision to buy, σi(t) = +1, or sell, σi(t) = -1, a stock at a certain discrete time step. The remaining cells are inactive, σi(t) = 0. The trading dynamics is then determined by the stochastic interaction between traders belonging to the same cluster. Extreme, intermittent events, such as crashes or bubbles, are triggered by a phase transition in the state of the bigger clusters present on the grid, where almost all the active traders come to share the same spin orientation. Most of the stylized aspects of the financial market time series, including multifractal properties, are reproduced by the model. A direct comparison is made with the daily closures of the S&P500 index.

  17. Local numerical modelling of ultrasonic guided waves in linear and nonlinear media

    NASA Astrophysics Data System (ADS)

    Packo, Pawel; Radecki, Rafal; Kijanka, Piotr; Staszewski, Wieslaw J.; Uhl, Tadeusz; Leamy, Michael J.

    2017-04-01

    Nonlinear ultrasonic techniques provide improved damage sensitivity compared to linear approaches. The combination of attractive properties of guided waves, such as Lamb waves, with unique features of higher harmonic generation provides great potential for characterization of incipient damage, particularly in plate-like structures. Nonlinear ultrasonic structural health monitoring techniques use interrogation signals at frequencies other than the excitation frequency to detect changes in structural integrity. Signal processing techniques used in non-destructive evaluation are frequently supported by modeling and numerical simulations in order to facilitate problem solution. This paper discusses known and newly-developed local computational strategies for simulating elastic waves, and attempts characterization of their numerical properties in the context of linear and nonlinear media. A hybrid numerical approach combining advantages of the Local Interaction Simulation Approach (LISA) and Cellular Automata for Elastodynamics (CAFE) is proposed for unique treatment of arbitrary strain-stress relations. The iteration equations of the method are derived directly from physical principles employing stress and displacement continuity, leading to an accurate description of the propagation in arbitrarily complex media. Numerical analysis of guided wave propagation, based on the newly developed hybrid approach, is presented and discussed in the paper for linear and nonlinear media. Comparisons to Finite Elements (FE) are also discussed.

  18. A note on probabilistic models over strings: the linear algebra approach.

    PubMed

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proving method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems.

  19. Linearly Adjustable International Portfolios

    NASA Astrophysics Data System (ADS)

    Fonseca, R. J.; Kuhn, D.; Rustem, B.

    2010-09-01

    We present an approach to multi-stage international portfolio optimization based on the imposition of a linear structure on the recourse decisions. Multiperiod decision problems are traditionally formulated as stochastic programs. Scenario tree based solutions however can become intractable as the number of stages increases. By restricting the space of decision policies to linear rules, we obtain a conservative tractable approximation to the original problem. Local asset prices and foreign exchange rates are modelled separately, which allows for a direct measure of their impact on the final portfolio value.
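
    The restriction to linear recourse means that every stage-t decision is taken to be an affine function of the random returns observed so far; schematically (the notation here is illustrative, not the paper's):

```latex
x_t(\xi) = x_t^{0} + \sum_{s=1}^{t} X_{t,s}\, \xi_s ,
```

    where $\xi_s$ collects the asset returns and exchange rates revealed at stage $s$, and the coefficients $x_t^{0}$ and $X_{t,s}$ become the finitely many decision variables of the tractable conservative approximation.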

  20. Evolution of cellular automata with memory: The Density Classification Task.

    PubMed

    Stone, Christopher; Bull, Larry

    2009-08-01

    The Density Classification Task is a well-known test problem for two-state discrete dynamical systems. For many years researchers have used a variety of evolutionary computation approaches to evolve solutions to this problem. In this paper, we investigate the evolvability of solutions when the underlying cellular automaton is augmented with a type of memory based on the Least Mean Square algorithm. To obtain high-performance solutions using a simple non-hybrid genetic algorithm, we design a novel representation based on the ternary representation used for Learning Classifier Systems. The new representation is found to produce superior performance compared to the bit string traditionally used for representing cellular automata. Moreover, memory is shown to improve the evolvability of solutions, and appropriate memory settings can be evolved as a component part of these solutions.

  1. Real-Time Extended Interface Automata for Software Testing Cases Generation

    PubMed Central

    Yang, Shunkun; Xu, Jiaqi; Man, Tianlong; Liu, Bin

    2014-01-01

    Testing and verification of the interfaces between software components are particularly important because of the large number of complex interactions, which requires traditional modeling languages to overcome their existing shortcomings in describing temporal information and controlling software testing inputs. This paper presents real-time extended interface automata (RTEIA), which add a clearer and more detailed description of temporal information through the application of time words. We also establish an input interface automaton for every input in order to handle input control and interface coverage flexibly when applied in the software testing field. Detailed definitions of the RTEIA and of the test case generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of a real aircraft braking system. PMID:24892080

  2. Dynamic and quantitative method of analyzing service consistency evolution based on extended hierarchical finite state automata.

    PubMed

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in service evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties for the use of traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA), which formally depicts the overall changing processes of service consistency states. In addition, service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that bad reusability (17.93% on average) is the most influential factor, the noncomposition of atomic services (13.12%) is the second most influential, and service version confusion (1.2%) is the least influential. Compared with previous qualitative analysis, SCEAs present good effectiveness and feasibility. This research can guide the engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA.

  3. Dynamic and Quantitative Method of Analyzing Service Consistency Evolution Based on Extended Hierarchical Finite State Automata

    PubMed Central

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in service evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties for the use of traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA), which formally depicts the overall changing processes of service consistency states. In addition, service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that bad reusability (17.93% on average) is the most influential factor, the noncomposition of atomic services (13.12%) is the second most influential, and service version confusion (1.2%) is the least influential. Compared with previous qualitative analysis, SCEAs present good effectiveness and feasibility. This research can guide the engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA. PMID:24772033

  4. On the topological sensitivity of cellular automata

    NASA Astrophysics Data System (ADS)

    Baetens, Jan M.; De Baets, Bernard

    2011-06-01

    Ever since the conceptualization of cellular automata (CA), much attention has been paid to the dynamical properties of these discrete dynamical systems, and, more in particular, to their sensitivity to the initial condition from which they are evolved. Yet, the sensitivity of CA to the topology upon which they are based has received only minor attention, such that a clear insight in this dependence is still lacking and, furthermore, a quantification of this so-called topological sensitivity has not yet been proposed. The lack of attention for this issue is rather surprising since CA are spatially explicit, which means that their dynamics is directly affected by their topology. To overcome these shortcomings, we propose topological Lyapunov exponents that measure the divergence of two close trajectories in phase space originating from a topological perturbation, and we relate them to a measure grasping the sensitivity of CA to their topology that relies on the concept of topological derivatives, which is introduced in this paper. The validity of the proposed methodology is illustrated for the 256 elementary CA and for a family of two-state irregular totalistic CA.
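
    As a point of reference, the damage-spreading sketch below measures the classic sensitivity of an elementary CA to its initial condition by evolving two configurations that differ in a single cell and tracking their Hamming distance. The topological perturbations and topological Lyapunov exponents introduced in the paper are not reproduced; rule 90 and the lattice size are arbitrary choices.

    ```python
    import numpy as np

    # Damage spreading for an elementary CA: evolve two configurations differing
    # in one cell and track their Hamming distance. This probes sensitivity to
    # the *initial condition*; the paper instead perturbs the *topology*.
    def eca_step(cells, rule_number):
        rule = np.array([(rule_number >> k) & 1 for k in range(8)], dtype=np.uint8)
        left, right = np.roll(cells, 1), np.roll(cells, -1)
        return rule[4 * left + 2 * cells + right]   # Wolfram neighbourhood index

    rng = np.random.default_rng(0)
    n, steps, rule_number = 201, 100, 90            # rule 90 as an example
    a = rng.integers(0, 2, n).astype(np.uint8)
    b = a.copy()
    b[n // 2] ^= 1                                  # single-cell perturbation

    for _ in range(steps):
        a, b = eca_step(a, rule_number), eca_step(b, rule_number)

    print("Hamming distance after", steps, "steps:", int(np.sum(a != b)))
    ```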

  5. A Bayesian model averaging method for the derivation of reservoir operating rules

    NASA Astrophysics Data System (ADS)

    Zhang, Jingwen; Liu, Pan; Wang, Hao; Lei, Xiaohui; Zhou, Yanlai

    2015-09-01

    Because the intrinsic dynamics among optimal decision making, inflow processes and reservoir characteristics are complex, functional forms of reservoir operating rules are always determined subjectively. As a result, the uncertainty of selecting the form and/or model involved in reservoir operating rules must be analyzed and evaluated. In this study, we analyze the uncertainty of reservoir operating rules using the Bayesian model averaging (BMA) model. Three popular operating rules, namely piecewise linear regression, surface fitting and a least-squares support vector machine, are established based on the optimal deterministic reservoir operation. These individual models provide three-member decisions for the BMA combination, enabling the 90% release interval to be estimated by Markov Chain Monte Carlo simulation. A case study of China's Baise reservoir shows that: (1) the optimal deterministic reservoir operation, which is superior to any reservoir operating rule, provides the samples from which the rules are derived; (2) the least-squares support vector machine model is more effective than both piecewise linear regression and surface fitting; (3) BMA outperforms any individual model of operating rules based on the optimal trajectories. It is revealed that the proposed model can reduce the uncertainty of operating rules, which is of great potential benefit in evaluating the confidence interval of decisions.
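
    A minimal sketch of the combination step is given below: three stand-in operating-rule models are weighted by assumed BMA posterior weights to produce a single release decision. The member rules, the weights and the inputs are illustrative placeholders; in the paper the members are fitted to the optimal deterministic operation and the weights come from Markov Chain Monte Carlo simulation.

    ```python
    import numpy as np

    # Combining three operating-rule models with BMA-style weights. The rules and
    # weights are invented placeholders; in the paper the members are piecewise
    # linear regression, surface fitting and an LS-SVM, with weights from MCMC.
    def rule_piecewise(storage, inflow):
        return np.where(storage > 500.0, 0.6 * storage + 0.2 * inflow, 0.3 * storage)

    def rule_surface(storage, inflow):
        return 0.4 * storage + 0.3 * inflow + 1e-4 * storage * inflow

    def rule_svm_like(storage, inflow):
        return 0.5 * storage + 0.25 * inflow        # stand-in for the LS-SVM member

    weights = np.array([0.2, 0.3, 0.5])             # assumed posterior model weights

    def bma_release(storage, inflow):
        members = np.array([rule_piecewise(storage, inflow),
                            rule_surface(storage, inflow),
                            rule_svm_like(storage, inflow)])
        return weights @ members                    # weighted-average decision

    print("combined release decision:", bma_release(620.0, 80.0))
    ```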

  6. Application of local linearization and the transonic equivalence rule to the flow about slender analytic bodies at Mach numbers near 1.0

    NASA Technical Reports Server (NTRS)

    Tyson, R. W.; Muraca, R. J.

    1975-01-01

    The local linearization method for axisymmetric flow is combined with the transonic equivalence rule to calculate pressure distributions on slender bodies at free-stream Mach numbers from 0.8 to 1.2. This is an approximate solution to the transonic flow problem which yields results applicable during the preliminary design stages of a configuration development. The method can be used to determine the aerodynamic loads on parabolic arc bodies having either circular or elliptical cross sections. It is particularly useful in predicting pressure distributions and normal force distributions along the body at small angles of attack. The equations discussed may be extended to include wing-body combinations.

  7. Electoral surveys’ influence on the voting processes: a cellular automata model

    NASA Astrophysics Data System (ADS)

    Alves, S. G.; Oliveira Neto, N. M.; Martins, M. L.

    2002-12-01

    Nowadays, in societies threatened by atomization, selfishness, short-term thinking, and alienation from political life, there is a renewed debate about classical questions concerning the quality of democratic decision making. In this work a cellular automata model for the dynamics of free elections, based on social impact theory, is proposed. By using computer simulations, power-law distributions for the size of electoral clusters and decision time have been obtained. The major role of broadcasted electoral surveys in guiding opinion formation and stabilizing the “status quo” was demonstrated. Furthermore, it was shown that in societies where these surveys are manipulated within the universally accepted statistical error bars, even a majority opposition could be hindered from reaching power through the electoral path.

  8. Investigation of phase diagrams for cylindrical Ising nanotube using cellular automata

    NASA Astrophysics Data System (ADS)

    Astaraki, M.; Ghaemi, M.; Afzali, K.

    2018-05-01

    Recent developments in the field of applied nanoscience and nanotechnology have heightened the need for categorizing various characteristics of nanostructures. In this regard, this paper establishes a novel method to investigate magnetic properties (phase diagram and spontaneous magnetization) of a cylindrical Ising nanotube. Using a two-layer Ising model and the core-shell concept, the interactions within the nanotube have been modelled. In the model, both ferromagnetic and antiferromagnetic cases have been considered. Furthermore, the effect of the nanotube's length on the critical temperature is investigated. The model has been simulated using a cellular automata approach and phase diagrams were constructed for different values of inter- and intra-layer couplings. For the antiferromagnetic case, the possibility of the existence of a compensation point is observed.

  9. Quantum-Dot Cellular Automata

    NASA Astrophysics Data System (ADS)

    Snider, Gregory

    2000-03-01

    Quantum-dot Cellular Automata (QCA) [1] is a promising architecture which employs quantum dots for digital computation. It is a revolutionary approach that holds the promise of high device density and low power dissipation. A basic QCA cell consists of four quantum dots coupled capacitively and by tunnel barriers. The cell is biased to contain two excess electrons within the four dots, which are forced to opposite "corners" of the four-dot cell by mutual Coulomb repulsion. These two possible polarization states of the cell will represent logic "0" and "1". Properly arranged, arrays of these basic cells can implement Boolean logic functions. Experimental results from functional QCA devices built of nanoscale metal dots defined by tunnel barriers will be presented. The experimental devices to be presented consist of Al islands, which we will call quantum dots, interconnected by tunnel junctions and lithographically defined capacitors. Aluminum/ aluminum-oxide/aluminum tunnel junctions were fabricated using a standard e-beam lithography and shadow evaporation technique. The experiments were performed in a dilution refrigerator at a temperature of 70 mK. The operation of a cell is evaluated by direct measurements of the charge state of dots within a cell as the input voltage is changed. The experimental demonstration of a functioning cell will be presented. A line of three cells demonstrates that there are no metastable switching states in a line of cells. A QCA majority gate will also be presented, which is a programmable AND/OR gate and represents the basic building block of QCA systems. The results of recent experiments will be presented. 1. C.S. Lent, P.D. Tougaw, W. Porod, and G.H. Bernstein, Nanotechnology, 4, 49 (1993).

  10. Authorship attribution based on Life-Like Network Automata

    PubMed Central

    Machicao, Jeaneth; Corrêa, Edilson A.; Miranda, Gisele H. B.; Amancio, Diego R.

    2018-01-01

    Authorship attribution is a problem of considerable practical and technical interest. Several methods have been designed to infer the authorship of disputed documents in multiple contexts. While traditional statistical methods based solely on word counts and related measurements have provided a simple yet effective solution in particular cases, they are prone to manipulation. Recently, texts have been successfully modeled as networks, where words are represented by nodes linked according to textual similarity measurements. Such models are useful to identify informative topological patterns for the authorship recognition task. However, there is no consensus on which measurements should be used. Thus, we proposed a novel method to characterize text networks by considering both topological and dynamical aspects of networks. Using concepts and methods from cellular automata theory, we devised a strategy to grasp informative spatio-temporal patterns from this model. Our experiments revealed that this approach outperforms structural analysis relying only on topological measurements, such as the clustering coefficient, betweenness and shortest paths. The optimized results obtained here pave the way for a better characterization of textual networks. PMID:29566100

  11. Authorship attribution based on Life-Like Network Automata.

    PubMed

    Machicao, Jeaneth; Corrêa, Edilson A; Miranda, Gisele H B; Amancio, Diego R; Bruno, Odemir M

    2018-01-01

    Authorship attribution is a problem of considerable practical and technical interest. Several methods have been designed to infer the authorship of disputed documents in multiple contexts. While traditional statistical methods based solely on word counts and related measurements have provided a simple yet effective solution in particular cases, they are prone to manipulation. Recently, texts have been successfully modeled as networks, where words are represented by nodes linked according to textual similarity measurements. Such models are useful to identify informative topological patterns for the authorship recognition task. However, there is no consensus on which measurements should be used. Thus, we proposed a novel method to characterize text networks by considering both topological and dynamical aspects of networks. Using concepts and methods from cellular automata theory, we devised a strategy to grasp informative spatio-temporal patterns from this model. Our experiments revealed that this approach outperforms structural analysis relying only on topological measurements, such as the clustering coefficient, betweenness and shortest paths. The optimized results obtained here pave the way for a better characterization of textual networks.

  12. A cellular automata model for traffic flow based on kinetics theory, vehicles capabilities and driver reactions

    NASA Astrophysics Data System (ADS)

    Guzmán, H. A.; Lárraga, M. E.; Alvarez-Icaza, L.; Carvajal, J.

    2018-02-01

    In this paper, a reliable cellular automata model is presented that faithfully reproduces deceleration and acceleration according to realistic driver reactions when vehicles with different deceleration capabilities are considered. The model focuses on describing complex traffic phenomena by coding in its rules the basic mechanisms of driver behavior, vehicle capabilities and kinetics, while preserving simplicity. In particular, vehicle kinetics is based on uniformly accelerated motion, rather than on impulsive accelerated motion as in most existing CA models. Thus, the proposed model analytically calculates three safety-preserving distances to determine the best action a follower vehicle can take under a worst-case scenario. Moreover, the prediction analysis guarantees that, under the proper assumptions, collisions between vehicles cannot happen at any future time. Simulation results indicate that all interactions of heterogeneous vehicles (i.e., car-truck, truck-car, car-car and truck-truck) are properly reproduced by the model. In addition, the model overcomes one of the major limitations of CA models for traffic modeling: the inability to perform a smooth approach to slower or stopped vehicles. Moreover, the model is also capable of reproducing most empirical findings, including the backward speed of the downstream front of a traffic jam and the different congested traffic patterns induced by a system with open boundary conditions with an on-ramp. Like most CA models, integer values are used to make the model run faster, which makes the proposed model suitable for real-time traffic simulation of large networks.
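
    For orientation, the sketch below implements a deliberately simplified Nagel-Schreckenberg-style single-lane CA, the class of rule-based traffic models the paper refines. The analytic safety-preserving distances, smooth (non-impulsive) acceleration and heterogeneous braking capabilities of the proposed model are not reproduced; all parameters are assumed.

    ```python
    import random

    # Simplified Nagel-Schreckenberg-style single-lane traffic CA, shown only to
    # illustrate the class of rule-based models the paper refines.
    L, VMAX, P_SLOW, STEPS, N_CARS = 200, 5, 0.2, 100, 40
    road = [-1] * L                         # -1 = empty, otherwise vehicle speed
    for pos in random.sample(range(L), N_CARS):
        road[pos] = random.randint(0, VMAX)

    for _ in range(STEPS):
        new_road = [-1] * L
        for i, v in enumerate(road):
            if v < 0:
                continue
            gap = 1
            while road[(i + gap) % L] < 0 and gap <= VMAX:
                gap += 1
            v = min(v + 1, VMAX, gap - 1)            # accelerate, but keep a safe gap
            if v > 0 and random.random() < P_SLOW:   # random slowdown
                v -= 1
            new_road[(i + v) % L] = v                # move forward
        road = new_road

    print("mean speed:", sum(v for v in road if v >= 0) / N_CARS)
    ```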

  13. Consequences of Landscape Fragmentation on Lyme Disease Risk: A Cellular Automata Approach

    PubMed Central

    Li, Sen; Hartemink, Nienke; Speybroeck, Niko; Vanwambeke, Sophie O.

    2012-01-01

    The abundance of infected Ixodid ticks is an important component of human risk of Lyme disease, and various empirical studies have shown that this is associated, at least in part, to landscape fragmentation. In this study, we aimed at exploring how varying woodland fragmentation patterns affect the risk of Lyme disease, through infected tick abundance. A cellular automata model was developed, incorporating a heterogeneous landscape with three interactive components: an age-structured tick population, a classical disease transmission function, and hosts. A set of simplifying assumptions were adopted with respect to the study objective and field data limitations. In the model, the landscape influences both tick survival and host movement. The validation of the model was performed with an empirical study. Scenarios of various landscape configurations (focusing on woodland fragmentation) were simulated and compared. Lyme disease risk indices (density and infection prevalence of nymphs) differed considerably between scenarios: (i) the risk could be higher in highly fragmented woodlands, which is supported by a number of recently published empirical studies, and (ii) grassland could reduce the risk in adjacent woodland, which suggests landscape fragmentation studies of zoonotic diseases should not focus on the patch-level woodland patterns only, but also on landscape-level adjacent land cover patterns. Further analysis of the simulation results indicated strong correlations between Lyme disease risk indices and the density, shape and aggregation level of woodland patches. These findings highlight the strong effect of the spatial patterns of local host population and movement on the spatial dynamics of Lyme disease risks, which can be shaped by woodland fragmentation. In conclusion, using a cellular automata approach is beneficial for modelling complex zoonotic transmission systems as it can be combined with either real world landscapes for exploring direct spatial

  14. Early warning of illegal development for protected areas by integrating cellular automata with neural networks.

    PubMed

    Li, Xia; Lao, Chunhua; Liu, Yilun; Liu, Xiaoping; Chen, Yimin; Li, Shaoying; Ai, Bing; He, Zijian

    2013-11-30

    Ecological security has become a major issue under fast urbanization in China. As the first two cities in this country, Shenzhen and Dongguan issued the ordinance of Eco-designated Line of Control (ELC) to "wire" ecologically important areas for strict protection in 2005 and 2009 respectively. Early warning systems (EWS) are a useful tool for assisting the implementation of the ELC. In this study, a multi-model approach is proposed for the early warning of illegal development by integrating cellular automata (CA) and artificial neural networks (ANN). The objective is to prevent the ecological risks or catastrophe caused by such development at an early stage. The integrated model is calibrated by using the empirical information from both remote sensing and handheld GPS (global positioning systems). The MAR indicator, which is the ratio of missed alarms to all warnings, is proposed for better assessment of the model performance. It is found that the fast urban development has caused significant threats to natural-area protection in the study area. The integration of CA, ANN and GPS provides a powerful tool for describing and predicting illegal development, which occurs in highly non-linear and fragmented forms. The comparison shows that this multi-model approach has much better performance than the single-model approach for the early warning. Compared with the single models of CA and ANN, this integrated multi-model can improve the value of MAR by 65.48% and 5.17% respectively. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. The two populations’ cellular automata model with predation based on the Penna model

    NASA Astrophysics Data System (ADS)

    He, Mingfeng; Lin, Jing; Jiang, Heng; Liu, Xin

    2002-09-01

    In Penna's single-species asexual bit-string model of biological ageing, the Verhulst factor has too strong a restraining effect on the development of the population. Danuta Makowiec gave an improved model based on the lattice, where the restraining factor of the four neighbours take the place of the Verhulst factor. Here, we discuss the two populations’ Penna model with predation on the planar lattice of two dimensions. A cellular automata model containing movable wolves and sheep has been built. The results show that both the quantity of the wolves and the sheep fluctuate in accordance with the law that one quantity increases while the other one decreases.

  16. Reconstruction of DNA sequences using genetic algorithms and cellular automata: towards mutation prediction?

    PubMed

    Mizas, Ch; Sirakoulis, G Ch; Mardiris, V; Karafyllidis, I; Glykos, N; Sandaltzopoulos, R

    2008-04-01

    Change of DNA sequence that fuels evolution is, to a certain extent, a deterministic process because mutagenesis does not occur in an absolutely random manner. So far, it has not been possible to decipher the rules that govern DNA sequence evolution due to the extreme complexity of the entire process. In our attempt to approach this issue we focus solely on the mechanisms of mutagenesis and deliberately disregard the role of natural selection. Hence, in this analysis, evolution refers to the accumulation of genetic alterations that originate from mutations and are transmitted through generations without being subjected to natural selection. We have developed a software tool that allows modelling of a DNA sequence as a one-dimensional cellular automaton (CA) with four states per cell which correspond to the four DNA bases, i.e. A, C, T and G. The four states are represented by numbers of the quaternary number system. Moreover, we have developed genetic algorithms (GAs) in order to determine the rules of CA evolution that simulate the DNA evolution process. Linear evolution rules were considered and square matrices were used to represent them. If DNA sequences of different evolution steps are available, our approach allows the determination of the underlying evolution rule(s). Conversely, once the evolution rules are deciphered, our tool may reconstruct the DNA sequence in any previous evolution step for which the exact sequence information was unknown. The developed tool may be used to test various parameters that could influence evolution. We describe a paradigm relying on the assumption that mutagenesis is governed by a near-neighbour-dependent mechanism. Based on the satisfactory performance of our system in the deliberately simplified example, we propose that our approach could offer a starting point for future attempts to understand the mechanisms that govern evolution. The developed software is open-source and has a user-friendly graphical input interface.
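
    The representation idea can be sketched as follows: the DNA sequence becomes a quaternary 1-D CA and a linear near-neighbour rule acts on it modulo 4. The coefficients below are arbitrary; in the paper a genetic algorithm searches for the linear rule (matrix) that best explains sequences observed at different evolution steps.

    ```python
    import numpy as np

    # DNA sequence as a 1-D CA with 4 states (A=0, C=1, T=2, G=3) evolved by a
    # linear near-neighbour rule: next[i] = (a*s[i-1] + b*s[i] + c*s[i+1]) mod 4.
    # The coefficients are arbitrary illustrations, not fitted rules.
    BASES = "ACTG"
    to_num = {base: i for i, base in enumerate(BASES)}

    def evolve(seq, a, b, c, steps=1):
        s = np.array([to_num[x] for x in seq])
        for _ in range(steps):
            s = (a * np.roll(s, 1) + b * s + c * np.roll(s, -1)) % 4
        return "".join(BASES[i] for i in s)

    print(evolve("ACGTTGCA", a=1, b=2, c=3, steps=2))
    ```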

  17. A simple attitude control of quadrotor helicopter based on Ziegler-Nichols rules for tuning PD parameters.

    PubMed

    He, ZeFang; Zhao, Long

    2014-01-01

    An attitude control strategy based on Ziegler-Nichols rules for tuning PD (proportional-derivative) parameters of quadrotor helicopters is presented to solve the problem that the quadrotor tends to be unstable. This problem is caused by the narrow definition domain of attitude angles of quadrotor helicopters. The proposed controller is nonlinear and consists of a linear part and a nonlinear part. The linear part is a PD controller with PD parameters tuned by Ziegler-Nichols rules and acts on the quadrotor decoupled linear system after feedback linearization; the nonlinear part is a feedback linearization term which converts the nonlinear system into a linear system. It can be seen from the simulation results that the attitude controller proposed in this paper is highly robust, and its control effect is better than the other two nonlinear controllers. The nonlinear parts of the other two nonlinear controllers are the same as the attitude controller proposed in this paper. The linear part involves a PID (proportional-integral-derivative) controller with the PID controller parameters tuned by Ziegler-Nichols rules and a PD controller with the PD controller parameters tuned by GA (genetic algorithms). Moreover, this attitude controller is simple and easy to implement.
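
    A minimal sketch of the tuning step is shown below, using the classical closed-loop Ziegler-Nichols table for a PD controller (Kp = 0.8 Ku, Td = Tu/8) given the ultimate gain Ku and ultimate period Tu. The numerical values are illustrative, not quadrotor measurements, and the feedback-linearization part of the controller is not shown.

    ```python
    # Classical closed-loop Ziegler-Nichols tuning for a PD controller: from the
    # ultimate gain Ku (gain at which the loop oscillates steadily under pure
    # proportional control) and the ultimate period Tu, the standard table gives
    # Kp = 0.8*Ku and Td = Tu/8, hence Kd = Kp*Td. Values below are illustrative.
    def ziegler_nichols_pd(Ku, Tu):
        Kp = 0.8 * Ku
        Td = Tu / 8.0
        return Kp, Kp * Td            # (Kp, Kd)

    def pd_control(error, d_error, Kp, Kd):
        return Kp * error + Kd * d_error

    Kp, Kd = ziegler_nichols_pd(Ku=4.0, Tu=0.5)
    print("Kp =", Kp, "Kd =", Kd, "u =", pd_control(0.1, -0.02, Kp, Kd))
    ```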

  18. Modelling of Microstructure Changes in Hot Deformed Materials Using Cellular Automata

    NASA Astrophysics Data System (ADS)

    Kuc, Dariusz; Gawąd, Jerzy

    2011-01-01

    The paper is focused on the application of a multi-scale 2D method. The modelling approach consists of a Cellular Automata (CA) model of microstructure development and a finite element code to solve the thermo-mechanical problem. The dynamic recrystallization phenomenon is taken into account in the 2D CA model, which takes advantage of an explicit representation of the microstructure, including individual grains and grain boundaries. Flow stress is the main material parameter in the mechanical part of the FE model and is calculated on the basis of the average dislocation density obtained from the CA model. The results attained from the model were validated against experimental data. In the present study, austenitic steel X3CrNi18-10 was investigated. The initial and final microstructures were examined using light microscopy and transmission electron microscopy.

  19. A stochastic automata network for earthquake simulation and hazard estimation

    NASA Astrophysics Data System (ADS)

    Belubekian, Maya Ernest

    1998-11-01

    This research develops a model for simulation of earthquakes on seismic faults with available earthquake catalog data. The model allows estimation of the seismic hazard at a site of interest and assessment of the potential damage and loss in a region. There are two approaches for studying the earthquakes: mechanistic and stochastic. In the mechanistic approach, seismic processes, such as changes in stress or slip on faults, are studied in detail. In the stochastic approach, earthquake occurrences are simulated as realizations of a certain stochastic process. In this dissertation, a stochastic earthquake occurrence model is developed that uses the results from dislocation theory for the estimation of slip released in earthquakes. The slip accumulation and release laws and the event scheduling mechanism adopted in the model result in a memoryless Poisson process for the small and moderate events and in a time- and space-dependent process for large events. The minimum and maximum of the hazard are estimated by the model when the initial conditions along the faults correspond to a situation right after a largest event and after a long seismic gap, respectively. These estimates are compared with the ones obtained from a Poisson model. The Poisson model overestimates the hazard after the maximum event and underestimates it in the period of a long seismic quiescence. The earthquake occurrence model is formulated as a stochastic automata network. Each fault is divided into cells, or automata, that interact by means of information exchange. The model uses a statistical method called bootstrap for the evaluation of the confidence bounds on its results. The parameters of the model are adjusted to the target magnitude patterns obtained from the catalog. A case study is presented for the city of Palo Alto, where the hazard is controlled by the San Andreas, Hayward and Calaveras faults. The results of the model are used to evaluate the damage and loss distribution in Palo Alto

  20. Optimal Sequential Rules for Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…

  1. Developing Novel Reservoir Rule Curves Using Seasonal Inflow Projections

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-yi; Tung, Ching-pin

    2015-04-01

    Due to significant seasonal rainfall variations, reservoirs and their flexible operational rules are indispensable to Taiwan. Furthermore, with the intensifying impacts of climate change on extreme climate, the frequency of droughts in Taiwan has been increasing in recent years. Drought is a creeping phenomenon; its slow onset makes it difficult to detect at an early stage and causes delays in making the best water-allocation decisions. For these reasons, novel reservoir rule curves using projected seasonal streamflow are proposed in this study, which can potentially reduce the adverse effects of drought. This study is dedicated to establishing new rule curves which consider both currently available storage and anticipated monthly inflows with a lead time of two months to reduce the risk of water shortage. The monthly inflows are projected based on the seasonal climate forecasts from the Central Weather Bureau (CWB), in which a weather generation model is used to produce daily weather data for the hydrological component of the GWLF model. To incorporate future monthly inflow projections into rule curves, this study designs a decision flow index which is a linear combination of currently available storage and inflow projections with a lead time of 2 months. By optimizing the linear combination coefficients of the decision flow index, the shape of the rule curves and the percentage of water supply in each zone, the best rule curves to decrease water shortage risk and impacts can be developed. The Shimen Reservoir in northern Taiwan is used as a case study to demonstrate the proposed method. Existing rule curves (M5 curves) of Shimen Reservoir are compared with two cases of new rule curves, including hindcast simulations and historic seasonal forecasts. The results show the new rule curves can decrease the total water shortage ratio and, in addition, can allocate shortage amounts to preceding months to avoid extreme shortage events. Even though some uncertainties in

  2. Dynamic Simulation of 1D Cellular Automata in the Active aTAM.

    PubMed

    Jonoska, Nataša; Karpenko, Daria; Seki, Shinnosuke

    2015-07-01

    The Active aTAM is a tile based model for self-assembly where tiles are able to transfer signals and change identities according to the signals received. We extend Active aTAM to include deactivation signals and thereby allow detachment of tiles. We show that the model allows a dynamic simulation of cellular automata with assemblies that do not record the entire computational history but only the current updates of the states, and thus provide a way for (a) algorithmic dynamical structural changes in the assembly and (b) reusable space in self-assembly. The simulation is such that at a given location the sequence of tiles that attach and detach corresponds precisely to the sequence of states the synchronous cellular automaton generates at that location.

  3. Dynamic Simulation of 1D Cellular Automata in the Active aTAM

    PubMed Central

    Jonoska, Nataša; Karpenko, Daria; Seki, Shinnosuke

    2016-01-01

    The Active aTAM is a tile based model for self-assembly where tiles are able to transfer signals and change identities according to the signals received. We extend Active aTAM to include deactivation signals and thereby allow detachment of tiles. We show that the model allows a dynamic simulation of cellular automata with assemblies that do not record the entire computational history but only the current updates of the states, and thus provide a way for (a) algorithmic dynamical structural changes in the assembly and (b) reusable space in self-assembly. The simulation is such that at a given location the sequence of tiles that attach and detach corresponds precisely to the sequence of states the synchronous cellular automaton generates at that location. PMID:27789918

  4. Using the automata processor for fast pattern recognition in high energy physics experiments. A proof of concept

    DOE PAGES

    Michael H. L. S. Wang; Cancelo, Gustavo; Green, Christopher; ...

    2016-06-25

    Here, we explore the Micron Automata Processor (AP) as a suitable commodity technology that can address the growing computational needs of pattern recognition in High Energy Physics (HEP) experiments. A toy detector model is developed for which an electron track confirmation trigger based on the Micron AP serves as a test case. Although primarily meant for high speed text-based searches, we demonstrate a proof of concept for the use of the Micron AP in a HEP trigger application.

  5. Using the automata processor for fast pattern recognition in high energy physics experiments. A proof of concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael H. L. S. Wang; Cancelo, Gustavo; Green, Christopher

    Here, we explore the Micron Automata Processor (AP) as a suitable commodity technology that can address the growing computational needs of pattern recognition in High Energy Physics (HEP) experiments. A toy detector model is developed for which an electron track confirmation trigger based on the Micron AP serves as a test case. Although primarily meant for high speed text-based searches, we demonstrate a proof of concept for the use of the Micron AP in a HEP trigger application.

  6. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    PubMed Central

    Zhu, Hao; Sun, Yan; Rajagopal, Gunaretnam; Mondry, Adrian; Dhar, Pawan

    2004-01-01

    Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods described. PMID:15339335

  7. Cellular automata simulation of topological effects on the dynamics of feed-forward motifs

    PubMed Central

    Apte, Advait A; Cain, John W; Bonchev, Danail G; Fong, Stephen S

    2008-01-01

    Background Feed-forward motifs are important functional modules in biological and other complex networks. The functionality of feed-forward motifs and other network motifs is largely dictated by the connectivity of the individual network components. While studies on the dynamics of motifs and networks are usually devoted to the temporal or spatial description of processes, this study focuses on the relationship between the specific architecture and the overall rate of the processes of the feed-forward family of motifs, including double and triple feed-forward loops. The search for the most efficient network architecture could be of particular interest for regulatory or signaling pathways in biology, as well as in computational and communication systems. Results Feed-forward motif dynamics were studied using cellular automata and compared with differential equation modeling. The number of cellular automata iterations needed for a 100% conversion of a substrate into a target product was used as an inverse measure of the transformation rate. Several basic topological patterns were identified that order the specific feed-forward constructions according to the rate of dynamics they enable. At the same number of network nodes and constant other parameters, the bi-parallel and tri-parallel motifs provide higher network efficacy than single feed-forward motifs. Additionally, a topological property of isodynamicity was identified for feed-forward motifs where different network architectures resulted in the same overall rate of the target production. Conclusion It was shown for classes of structural motifs with feed-forward architecture that network topology affects the overall rate of a process in a quantitatively predictable manner. These fundamental results can be used as a basis for simulating larger networks as combinations of smaller network modules with implications on studying synthetic gene circuits, small regulatory systems, and eventually dynamic whole-cell models

  8. Origin of nonsaturating linear magnetoresistivity

    NASA Astrophysics Data System (ADS)

    Kisslinger, Ferdinand; Ott, Christian; Weber, Heiko B.

    2017-01-01

    The observation of nonsaturating classical linear magnetoresistivity has been an enigmatic phenomenon in solid-state physics. We present a study of a two-dimensional ohmic conductor, including the local Hall effect and a self-consistent consideration of the environment. An equivalent-circuit scheme delivers a simple and convincing argument for why the magnetoresistivity is linear in a strong magnetic field, provided that the current and the biasing electric field are misaligned by a nonlocal mechanism. A finite-element model of a two-dimensional conductor is suited to display the situations that create such deviating currents. Besides edge effects next to electrodes, charge carrier density fluctuations efficiently generate this effect. However, mobility fluctuations that have frequently been related to linear magnetoresistivity are barely relevant. Despite its rare observation, linear magnetoresistivity is the rule rather than the exception in a regime of low charge carrier densities, misaligned current pathways and strong magnetic fields.

  9. A Simple Attitude Control of Quadrotor Helicopter Based on Ziegler-Nichols Rules for Tuning PD Parameters

    PubMed Central

    He, ZeFang

    2014-01-01

    An attitude control strategy based on Ziegler-Nichols rules for tuning PD (proportional-derivative) parameters of quadrotor helicopters is presented to solve the problem that the quadrotor tends to be unstable. This problem is caused by the narrow definition domain of attitude angles of quadrotor helicopters. The proposed controller is nonlinear and consists of a linear part and a nonlinear part. The linear part is a PD controller with PD parameters tuned by Ziegler-Nichols rules and acts on the quadrotor decoupled linear system after feedback linearization; the nonlinear part is a feedback linearization term which converts the nonlinear system into a linear system. It can be seen from the simulation results that the attitude controller proposed in this paper is highly robust, and its control effect is better than the other two nonlinear controllers. The nonlinear parts of the other two nonlinear controllers are the same as the attitude controller proposed in this paper. The linear part involves a PID (proportional-integral-derivative) controller with the PID controller parameters tuned by Ziegler-Nichols rules and a PD controller with the PD controller parameters tuned by GA (genetic algorithms). Moreover, this attitude controller is simple and easy to implement. PMID:25614879

  10. Stochastic modeling for dynamics of HIV-1 infection using cellular automata: A review.

    PubMed

    Precharattana, Monamorn

    2016-02-01

    Recently, the description of the immune response by discrete models has come to play an important role in studying problems in the area of human immunodeficiency virus type 1 (HIV-1) infection, which leads to AIDS. As infection of target immune cells by HIV-1 mainly takes place in the lymphoid tissue, cellular automata (CA) models represent a significant step in understanding when the infected population is dispersed. Motivated by this, studies of the dynamics of HIV-1 infection using CA are reviewed to recognize how CA have been developed for HIV-1 dynamics, which issues have already been studied, and which issues remain objectives for future studies.

  11. Automata learning algorithms and processes for providing more complete systems requirements specification by scenario generation, CSP-based syntax-oriented model construction, and R2D2C system requirements transformation

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana (Inventor); Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Steffen, Bernard (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, automata learning algorithms and techniques are implemented to generate a more complete set of scenarios for requirements-based programming. More specifically, a CSP-based, syntax-oriented model construction, which requires the support of a theorem prover, is complemented by model extrapolation via automata learning. This may support the systematic completion of requirements that are partial in nature, providing focus on the most prominent scenarios. This may generalize requirement skeletons by extrapolation and may indicate, by way of automatically generated traces, where the requirement specification is too loose and additional information is required.

  12. Experimental Studies of Quasi-Adiabatic Quantum-dot Cellular Automata

    NASA Astrophysics Data System (ADS)

    Orlov, Alexei; Amlani, Islamshah; Kummamuru, Ravi; Toth, Geza; Bernstein, Gary; Lent, Craig; Snider, Gregory

    2000-03-01

    The computational approach known as Quantum-dot Cellular Automata (QCA) uses interacting quantum dots to encode and process binary information. The first realization of a functioning QCA cell has already been reported. Recently, quasi-adiabatic switching of QCA in a metal dot system near the instantaneous ground state was proposed [1]. The advantage of this approach is that it allows both logic and addressable memory to be implemented within the QCA framework. We report on the fabrication and measurement of such a device in the Al-AlOx tunnel junction system. This basic building block consists of three metal islands connected in series by tunnel junctions, where an electron can be moved between islands by means of electrostatic perturbation on either control electrodes or adjacent cells. The cell can have three operational modes, i.e. active, locked and null, which provide a solution for ground state computing that is not susceptible to metastable states. [1] G. Toth and C. S. Lent, J. Appl. Phys. 85 5, 2977-2984, 1999.

  13. Critical Behavior in Cellular Automata Animal Disease Transmission Model

    NASA Astrophysics Data System (ADS)

    Morley, P. D.; Chang, Julius

    Using a cellular automata model, we simulate the British Government Policy (BGP) in the 2001 foot and mouth epidemic in Great Britain. When clinical symptoms of the disease appear on a farm, there is mandatory slaughter (culling) of all livestock in the infected premise (IP). Farms in the neighborhood of an IP (contiguous premises, CP) are also culled, i.e., a nearest-neighbor interaction. Farms where the disease may be present due to animal, human, vehicle or airborne transmission (dangerous contacts, DC) are additionally culled, i.e., next-to-nearest-neighbor interactions and a lightning factor. The resulting mathematical model possesses a phase transition, whereupon if the physical disease transmission kernel exceeds a critical value, catastrophic loss of animals ensues. The nonlocal disease transport probability can be as low as 0.01% per day and the disease can still be in the high mortality phase. We show that the fundamental equation for sustainable disease transport is the criticality equation for a neutron fission cascade. Finally, we calculate that the percentage of culled animals that are actually healthy is ≈30%.

  14. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study.

    PubMed

    Tîrnăucă, Cristina; Montaña, José L; Ontañón, Santiago; González, Avelino J; Pardo, Luis M

    2016-06-24

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent's actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches.
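
    The recognition step can be sketched as follows: one PFA per strategy, and an observed action trace is attributed to the strategy whose PFA assigns it the highest likelihood. The two toy automata below are invented for illustration and are not learned from Roomba traces as in the paper.

    ```python
    import math

    # Behavioral Recognition with probabilistic finite automata: score an action
    # trace under each strategy's PFA and pick the most likely strategy.
    def trace_log_likelihood(pfa, trace):
        """pfa: dict state -> {action: [(next_state, prob), ...]}, start state 0."""
        state, logp = 0, 0.0
        for action in trace:
            options = pfa[state].get(action)
            if not options:
                return float("-inf")
            # deterministic-transition PFA for simplicity: one (next, prob) pair
            next_state, prob = options[0]
            logp += math.log(prob)
            state = next_state
        return logp

    # Two toy strategies over actions {L, R}; numbers are illustrative only.
    pfa_patrol = {0: {"L": [(1, 0.9)], "R": [(1, 0.1)]},
                  1: {"L": [(0, 0.2)], "R": [(0, 0.8)]}}
    pfa_random = {0: {"L": [(0, 0.5)], "R": [(0, 0.5)]}}

    trace = "LRLRLR"
    scores = {"patrol": trace_log_likelihood(pfa_patrol, trace),
              "random": trace_log_likelihood(pfa_random, trace)}
    print("recognized strategy:", max(scores, key=scores.get), scores)
    ```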

  15. Ca-Pri a Cellular Automata Phenomenological Research Investigation: Simulation Results

    NASA Astrophysics Data System (ADS)

    Iannone, G.; Troisi, A.

    2013-05-01

    Following the introduction of a phenomenological cellular automata (CA) model capable of reproducing city growth and urban sprawl, we develop a toy model simulation considering a realistic framework. The main characteristic of our approach is an evolution algorithm based on inhabitants' preferences. The control of grown cells is obtained by means of suitable functions which depend on the initial condition of the simulation. Newly born urban settlements are obtained by means of a logistic evolution of the urban pattern, while urban sprawl is controlled by means of the population evolution function. In order to compare model results with a realistic urban framework we have considered, as the area of study, the island of Capri (Italy) in the Mediterranean Sea. Two different phases of the urban evolution on the island have been taken into account: an initial growth induced by geographic suitability, and the urban spread after 1943 driven by the population evolution after that date.

  16. Accurate reliability analysis method for quantum-dot cellular automata circuits

    NASA Astrophysics Data System (ADS)

    Cui, Huanqing; Cai, Li; Wang, Sen; Liu, Xiaoqiang; Yang, Xiaokuo

    2015-10-01

    The probabilistic transfer matrix (PTM) is a widely used model in circuit reliability research. However, the PTM model cannot reflect the impact of input signals on reliability, so it does not completely conform to the mechanism of the novel field-coupled nanoelectronic device known as quantum-dot cellular automata (QCA). It is difficult to obtain accurate results when the PTM model is used to analyze the reliability of QCA circuits. To solve this problem, we present fault tree models of QCA fundamental devices according to different input signals. The binary decision diagram (BDD) is then used to quantitatively investigate the reliability of two QCA XOR gates based on the presented models. By employing the fault tree models, the impact of input signals on reliability can be identified clearly and the crucial components of a circuit can be located precisely based on the importance values (IVs) of the components. This method therefore contributes to the construction of reliable QCA circuits.

  17. The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, Curtis D.; Humphreys, William M.

    2003-01-01

    We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of the A Quantum Interconnected Network Array Simulator (AQUINAS) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.

  18. Robust linear discriminant analysis with distance based estimators

    NASA Astrophysics Data System (ADS)

    Lim, Yai-Fung; Yahaya, Sharipah Soaad Syed; Ali, Hazlina

    2017-11-01

    Linear discriminant analysis (LDA) is one of the supervised classification techniques concerning the relationship between a categorical variable and a set of continuous variables. The main objective of LDA is to create a function to distinguish between populations and to allocate future observations to previously defined populations. Under the assumptions of normality and homoscedasticity, LDA yields an optimal linear discriminant rule (LDR) between two or more groups. However, the optimality of LDA relies heavily on the sample mean and the pooled sample covariance matrix, which are known to be sensitive to outliers. To alleviate these problems, a new robust LDA using distance-based estimators known as the minimum variance vector (MVV) is proposed in this study. The MVV estimators are used in place of the classical sample mean and classical sample covariance to form a robust linear discriminant rule (RLDR). A simulation and a real data study were conducted to examine the performance of the proposed RLDR measured in terms of misclassification error rates. The computational results showed that the proposed RLDR performs better than the classical LDR and is comparable with the existing robust LDR.
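
    For reference, the sketch below builds the classical two-group linear discriminant rule from the sample means and pooled covariance on synthetic data; the paper's contribution is to replace these classical estimators with robust MVV estimators, which are not reproduced here.

    ```python
    import numpy as np

    # Classical two-group linear discriminant rule: w = S_pooled^{-1} (m1 - m0),
    # allocate x to group 1 if w . (x - (m0 + m1)/2) > 0. Synthetic data only.
    rng = np.random.default_rng(1)
    X0 = rng.normal(loc=[0, 0], scale=1.0, size=(100, 2))
    X1 = rng.normal(loc=[2, 1], scale=1.0, size=(100, 2))

    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    S = ((X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)) / (len(X0) + len(X1) - 2)
    w = np.linalg.solve(S, m1 - m0)
    midpoint = (m0 + m1) / 2

    def classify(x):
        return int(w @ (np.asarray(x) - midpoint) > 0)

    X = np.vstack([X0, X1])
    y = np.array([0] * len(X0) + [1] * len(X1))
    pred = np.array([classify(x) for x in X])
    print("misclassification rate:", float(np.mean(pred != y)))
    ```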

  19. Constructing Compact Takagi-Sugeno Rule Systems: Identification of Complex Interactions in Epidemiological Data

    PubMed Central

    Zhou, Shang-Ming; Lyons, Ronan A.; Brophy, Sinead; Gravenor, Mike B.

    2012-01-01

    The Takagi-Sugeno (TS) fuzzy rule system is a widely used data mining technique, and is of particular use in the identification of non-linear interactions between variables. However the number of rules increases dramatically when applied to high dimensional data sets (the curse of dimensionality). Few robust methods are available to identify important rules while removing redundant ones, and this results in limited applicability in fields such as epidemiology or bioinformatics where the interaction of many variables must be considered. Here, we develop a new parsimonious TS rule system. We propose three statistics: R, L, and ω-values, to rank the importance of each TS rule, and a forward selection procedure to construct a final model. We use our method to predict how key components of childhood deprivation combine to influence educational achievement outcome. We show that a parsimonious TS model can be constructed, based on a small subset of rules, that provides an accurate description of the relationship between deprivation indices and educational outcomes. The selected rules shed light on the synergistic relationships between the variables, and reveal that the effect of targeting specific domains of deprivation is crucially dependent on the state of the other domains. Policy decisions need to incorporate these interactions, and deprivation indices should not be considered in isolation. The TS rule system provides a basis for such decision making, and has wide applicability for the identification of non-linear interactions in complex biomedical data. PMID:23272108
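
    A minimal sketch of Takagi-Sugeno inference with two rules is given below: Gaussian memberships on the inputs, linear consequents, and a firing-strength-weighted average as the output. The rule parameters are invented; the R, L and ω statistics and the forward selection procedure proposed in the paper are not shown.

    ```python
    import numpy as np

    # First-order Takagi-Sugeno inference with two rules: Gaussian memberships,
    # linear consequents, firing-strength-weighted average output.
    def gauss(x, c, s):
        return np.exp(-0.5 * ((x - c) / s) ** 2)

    rules = [
        # (centres, widths, consequent coefficients [a1, a2, bias]) - illustrative
        (np.array([0.2, 0.3]), np.array([0.2, 0.2]), np.array([1.0, -0.5, 0.1])),
        (np.array([0.8, 0.7]), np.array([0.3, 0.3]), np.array([-0.4, 1.2, 0.0])),
    ]

    def ts_output(x):
        x = np.asarray(x, dtype=float)
        strengths = np.array([np.prod(gauss(x, c, s)) for c, s, _ in rules])
        consequents = np.array([coef[:-1] @ x + coef[-1] for _, _, coef in rules])
        return strengths @ consequents / strengths.sum()

    print("TS model output for x=(0.5, 0.6):", ts_output([0.5, 0.6]))
    ```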

  20. Constructing compact Takagi-Sugeno rule systems: identification of complex interactions in epidemiological data.

    PubMed

    Zhou, Shang-Ming; Lyons, Ronan A; Brophy, Sinead; Gravenor, Mike B

    2012-01-01

    The Takagi-Sugeno (TS) fuzzy rule system is a widely used data mining technique, and is of particular use in the identification of non-linear interactions between variables. However the number of rules increases dramatically when applied to high dimensional data sets (the curse of dimensionality). Few robust methods are available to identify important rules while removing redundant ones, and this results in limited applicability in fields such as epidemiology or bioinformatics where the interaction of many variables must be considered. Here, we develop a new parsimonious TS rule system. We propose three statistics: R, L, and ω-values, to rank the importance of each TS rule, and a forward selection procedure to construct a final model. We use our method to predict how key components of childhood deprivation combine to influence educational achievement outcome. We show that a parsimonious TS model can be constructed, based on a small subset of rules, that provides an accurate description of the relationship between deprivation indices and educational outcomes. The selected rules shed light on the synergistic relationships between the variables, and reveal that the effect of targeting specific domains of deprivation is crucially dependent on the state of the other domains. Policy decisions need to incorporate these interactions, and deprivation indices should not be considered in isolation. The TS rule system provides a basis for such decision making, and has wide applicability for the identification of non-linear interactions in complex biomedical data.

  1. On the spatial dynamics and oscillatory behavior of a predator-prey model based on cellular automata and local particle swarm optimization.

    PubMed

    Molina, Mario Martínez; Moreno-Armendáriz, Marco A; Carlos Seck Tuoh Mora, Juan

    2013-11-07

    A two-dimensional lattice model based on Cellular Automata theory and swarm intelligence is used to study the spatial and population dynamics of a theoretical ecosystem. It is found that the social interactions among predators provoke the formation of clusters, and that by increasing the mobility of predators the model enters into an oscillatory behavior. © 2013 Elsevier Ltd. All rights reserved.

  2. Modeling the Land Use/Cover Change in an Arid Region Oasis City Constrained by Water Resource and Environmental Policy Change using Cellular Automata Model

    NASA Astrophysics Data System (ADS)

    Hu, X.; Li, X.; Lu, L.

    2017-12-01

    Land use/cover change (LUCC) is an important subject in research on global environmental change and sustainable development, and the spatial simulation of land use/cover change is one of its key components, made difficult by the complexity of the system. The cellular automata (CA) model plays an irreplaceable role in simulating land use/cover change processes because of its powerful spatial computing capability. However, the majority of current CA land use/cover models are binary-state models that cannot provide more general information about the overall spatial pattern of land use/cover change. Here, a multi-state logistic-regression-based Markov cellular automata (MLRMCA) model and a multi-state artificial-neural-network-based Markov cellular automata (MANNMCA) model were developed and used to simulate the complex land use/cover evolution of an arid-region oasis city constrained by water resources and environmental policy change, Zhangye city, during the period 1990-2010. The results indicated that the MANNMCA model was superior to the MLRMCA model in simulation accuracy, showing that combining an artificial neural network with CA can more effectively capture the complex relationships between land use/cover change and a set of spatial variables. Although the MLRMCA model also has some advantages, the MANNMCA model was more appropriate for simulating complex land use/cover dynamics. The two proposed models were effective and reliable, and could reflect the spatial evolution of regional land use/cover changes. They also have potential implications for the impact assessment of water resources, ecological restoration, and sustainable urban development in arid areas.
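
    A hedged sketch of the logistic-regression building block of a logistic-regression-based Markov CA: a transition potential is obtained by combining a fitted suitability surface with a local neighbourhood density. The function names and the product combination rule are illustrative assumptions, not the MLRMCA implementation.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def logistic_suitability(drivers, beta):
            # drivers: (n_cells, n_vars) spatial variables; beta: intercept followed by coefficients
            z = beta[0] + drivers @ beta[1:]
            return 1.0 / (1.0 + np.exp(-z))

        def transition_potential(grid, drivers, beta, state, size=3):
            # Global suitability from the logistic regression, modulated by the local
            # density of the target land-use state in a (size x size) neighbourhood.
            suit = logistic_suitability(drivers, beta).reshape(grid.shape)
            density = uniform_filter((grid == state).astype(float), size=size)
            return suit * density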

  3. Coupling of Markov chains and cellular automata spatial models to predict land cover changes (case study: upper Ci Leungsi catchment area)

    NASA Astrophysics Data System (ADS)

    Marko, K.; Zulkarnain, F.; Kusratmoko, E.

    2016-11-01

    Land cover change, particularly in urban catchment areas, has been occurring rapidly. Land cover changes occur as a result of increasing demand for built-up area. Various environmental and hydrological problems, e.g. floods and urban heat islands, can arise if the changes are uncontrolled. This study aims to predict land cover changes using a coupling of Markov chains and cellular automata. One of the most rapid land cover changes occurs in the upper Ci Leungsi catchment area, located near Bekasi City and the Jakarta Metropolitan Area. Markov chains have a good ability to predict the probability of change statistically, while cellular automata are regarded as a powerful method for capturing the spatial patterns of change. Temporal land cover data were obtained from remote sensing satellite imagery. In addition, this study also used multi-criteria analysis to determine which driving factors, such as proximity, elevation, and slope, could stimulate the changes. Coupling these two methods gives a better prediction model than using either method separately. The prediction model was validated using existing 2015 land cover data and showed a satisfactory kappa coefficient. The most significant increase was in built-up area, from 24% to 53%.
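
    A minimal sketch of the Markov half of such a coupling, assuming two classified maps with integer class labels: a transition matrix is estimated from cross-tabulated class changes and then used to project class proportions, which the CA step would subsequently allocate in space. The toy maps and function names are illustrative.

        import numpy as np

        def transition_matrix(map_t0, map_t1, n_classes):
            # Row-normalised counts of class-to-class transitions between two land-cover maps
            counts = np.zeros((n_classes, n_classes))
            for a, b in zip(map_t0.ravel(), map_t1.ravel()):
                counts[a, b] += 1
            return counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)

        def project_areas(areas_t1, P, steps=1):
            # Markov projection of class proportions; a CA step would then allocate
            # the projected areas in space using suitability and neighbourhood rules.
            return areas_t1 @ np.linalg.matrix_power(P, steps)

        m0 = np.array([[0, 0, 1], [1, 1, 2], [2, 2, 2]])   # toy classified maps
        m1 = np.array([[0, 1, 1], [1, 1, 2], [2, 2, 2]])
        P = transition_matrix(m0, m1, 3)
        print(project_areas(np.bincount(m1.ravel(), minlength=3) / m1.size, P))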

  4. Using Cellular Automata for Parking Recommendations in Smart Environments

    PubMed Central

    Horng, Gwo-Jiun

    2014-01-01

    In this work, we propose an innovative adaptive recommendation mechanism for smart parking. The cognitive RF module will transmit the vehicle location information and the parking space requirements to the parking congestion computing center (PCCC) when the driver must find a parking space. Moreover, for the parking spaces, we use a cellular automata (CA) model that can adapt to both full and not-full parking lot situations. Here, the PCCC can compute the nearest parking lot, the parking lot status and the current or opposite driving direction with the vehicle location information. By considering the driving direction, we can determine when the vehicles must turn around and thus reduce road congestion and speed up finding a parking space. The recommendation will be sent to the drivers through a wireless communication cognitive radio (CR) model after the computation and analysis by the PCCC. The current study evaluates the performance of this approach by conducting computer simulations. The simulation results show the strengths of the proposed smart parking mechanism in terms of avoiding increased congestion and decreasing the time to find a parking space. PMID:25153671

  5. A scale-invariant cellular-automata model for distributed seismicity

    NASA Technical Reports Server (NTRS)

    Barriere, Benoit; Turcotte, Donald L.

    1991-01-01

    In the standard cellular-automata model for a fault, an element of stress is randomly added to a grid of boxes until a box has four elements; these are then redistributed to the adjacent boxes on the grid. The redistribution can result in one or more of these boxes having four or more elements, in which case further redistributions are required. On average, added elements are lost from the edges of the grid. The model is modified so that the boxes have a scale-invariant distribution of sizes. The objective is to model a scale-invariant distribution of fault sizes. When a redistribution from a box occurs it is equivalent to a characteristic earthquake on the fault. A redistribution from a small box (a foreshock) can trigger an instability in a large box (the main shock). A redistribution from a large box always triggers many instabilities in the smaller boxes (aftershocks). The frequency-size statistics for both main shocks and aftershocks satisfy the Gutenberg-Richter relation with b = 0.835 for main shocks and b = 0.635 for aftershocks. Model foreshocks occur 28 percent of the time.
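
    A minimal sketch of the uniform-box version of this stress-redistribution automaton (without the scale-invariant box sizes introduced in the paper), assuming open boundaries so that elements falling off the edge are lost; the returned event size plays the role of the earthquake magnitude.

        import numpy as np

        def add_and_relax(grid, rng):
            # Add one stress element at a random box, then topple any box holding >= 4
            # elements by passing one element to each of its four neighbours; elements
            # leaving the grid edge are lost. Returns the number of topplings (event size).
            i, j = rng.integers(grid.shape[0]), rng.integers(grid.shape[1])
            grid[i, j] += 1
            size = 0
            unstable = [(i, j)] if grid[i, j] >= 4 else []
            while unstable:
                i, j = unstable.pop()
                if grid[i, j] < 4:
                    continue
                grid[i, j] -= 4
                size += 1
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < grid.shape[0] and 0 <= nj < grid.shape[1]:
                        grid[ni, nj] += 1
                        if grid[ni, nj] >= 4:
                            unstable.append((ni, nj))
            return size

        rng = np.random.default_rng(0)
        grid = np.zeros((32, 32), dtype=int)
        sizes = [add_and_relax(grid, rng) for _ in range(20000)]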

  6. Integrating the ECG power-line interference removal methods with rule-based system.

    PubMed

    Kumaravel, N; Senthil, A; Sridhar, K S; Nithiyanandam, N

    1995-01-01

    The power-line frequency interference in electrocardiographic signals is eliminated to enhance the signal characteristics for diagnosis. The power-line frequency normally varies by +/- 1.5 Hz from its standard value of 50 Hz. In the present work, the performances of the linear FIR filter, the wave digital filter (WDF) and the adaptive filter are studied for power-line frequency variations from 48.5 to 51.5 Hz in steps of 0.5 Hz. The advantage of the LMS adaptive filter over other fixed-frequency filters in removing power-line interference, even when the interference frequency varies by +/- 1.5 Hz from its nominal value of 50 Hz, is clearly demonstrated. A novel method of integrating a rule-based system approach with the linear FIR filter, and also with the wave digital filter, is proposed. The performances of the rule-based FIR filter and the rule-based wave digital filter are compared with the LMS adaptive filter.
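
    One common arrangement of such an LMS adaptive canceller is sketched below, assuming a quadrature (sine/cosine) reference at the nominal 50 Hz; the step size mu and the toy ECG signal are illustrative, and this is not the authors' implementation.

        import numpy as np

        def lms_notch(ecg, fs, f0=50.0, mu=0.01):
            # Two-weight LMS canceller: a quadrature pair at the nominal power-line
            # frequency serves as the reference; the weights rotate slowly, so small
            # drifts of the interference frequency around f0 can still be tracked.
            n = np.arange(len(ecg))
            ref = np.column_stack((np.sin(2 * np.pi * f0 * n / fs),
                                   np.cos(2 * np.pi * f0 * n / fs)))
            w = np.zeros(2)
            clean = np.empty_like(ecg, dtype=float)
            for k in range(len(ecg)):
                y = ref[k] @ w              # current estimate of the interference
                e = ecg[k] - y              # error = cleaned ECG sample
                w += 2 * mu * e * ref[k]    # LMS weight update
                clean[k] = e
            return clean

        fs = 500.0
        t = np.arange(0, 2, 1 / fs)
        ecg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 50.7 * t)  # toy ECG plus drifted hum
        print(np.std(lms_notch(ecg, fs)[-250:]))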

  7. An Orthogonal Evolutionary Algorithm With Learning Automata for Multiobjective Optimization.

    PubMed

    Dai, Cai; Wang, Yuping; Ye, Miao; Xue, Xingsi; Liu, Hailin

    2016-12-01

    Research on multiobjective optimization problems has become one of the hottest topics in intelligent computation. In order to improve the search efficiency of an evolutionary algorithm and maintain the diversity of solutions, in this paper, learning automata (LA) are first used for quantization orthogonal crossover (QOX), and a new fitness function based on decomposition is proposed to achieve these two purposes. Based on these, an orthogonal evolutionary algorithm with LA for complex multiobjective optimization problems with continuous variables is proposed. The experimental results show that in continuous states, the proposed algorithm is able to achieve accurate Pareto-optimal sets and wide Pareto-optimal fronts efficiently. Moreover, a comparison with several existing well-known algorithms (nondominated sorting genetic algorithm II, decomposition-based multiobjective evolutionary algorithm, decomposition-based multiobjective evolutionary algorithm with an ensemble of neighborhood sizes, multiobjective optimization by LA, and multiobjective immune algorithm with nondominated neighbor-based selection) on 15 multiobjective benchmark problems shows that the proposed algorithm is able to find more accurate and evenly distributed Pareto-optimal fronts than the compared ones.

  8. In silico characterization of cell-cell interactions using a cellular automata model of cell culture.

    PubMed

    Kihara, Takanori; Kashitani, Kosuke; Miyake, Jun

    2017-07-14

    Cell proliferation is a key characteristic of eukaryotic cells. During cell proliferation, cells interact with each other. In this study, we developed a cellular automata model to estimate cell-cell interactions using experimentally obtained images of cultured cells. We used four types of cells: HeLa cells, human osteosarcoma (HOS) cells, rat mesenchymal stem cells (MSCs), and rat smooth muscle A7r5 cells. These cells were cultured and stained daily. The obtained cell images were binarized and clipped into squares containing about 10^4 cells. These cells showed characteristic cell proliferation patterns. The growth curves of these cells were generated from the cell proliferation images and we determined the doubling time of these cells from the growth curves. We developed a simple cellular automata system with an easily accessible graphical user interface. This system has five variable parameters, namely, initial cell number, doubling time, motility, cell-cell adhesion, and cell-cell contact inhibition (of proliferation). Of these parameters, we obtained initial cell numbers and doubling times experimentally. We set the motility at a constant value because the effect of this parameter on our simulation was limited. Therefore, we simulated cell proliferation behavior with cell-cell adhesion and cell-cell contact inhibition as variables. By comparing growth curves and proliferation cell images, we succeeded in determining the cell-cell interaction properties of each cell. Simulated HeLa and HOS cells exhibited low cell-cell adhesion and weak cell-cell contact inhibition. Simulated MSCs exhibited high cell-cell adhesion and positive cell-cell contact inhibition. Simulated A7r5 cells exhibited low cell-cell adhesion and strong cell-cell contact inhibition. These simulated results correlated with the experimental growth curves and proliferation images. Our simulation approach is an easy method for evaluating the cell-cell interaction properties of cells.
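
    A hedged toy sketch of a lattice automaton with the adhesion and contact-inhibition parameters discussed above (doubling time is folded into a per-step division probability, and motility is omitted); all parameter values and the update rule itself are illustrative assumptions, not the authors' system.

        import numpy as np

        def step(grid, p_div=0.3, adhesion=0.5, contact_inhib=0.7, rng=None):
            # One update of a toy cell-culture CA on a square lattice: a cell divides
            # into a random empty Moore neighbour with probability p_div, reduced by
            # contact inhibition as more neighbours are occupied; adhesion biases the
            # daughter cell towards empty sites that touch other cells.
            rng = rng or np.random.default_rng()
            new = grid.copy()
            n, m = grid.shape
            for i, j in zip(*np.nonzero(grid)):
                nbrs = [(i + di, j + dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
                        if (di, dj) != (0, 0) and 0 <= i + di < n and 0 <= j + dj < m]
                occupied = sum(grid[a, b] for a, b in nbrs) / len(nbrs)
                if rng.random() > p_div * (1 - contact_inhib * occupied):
                    continue
                empties = [(a, b) for a, b in nbrs if new[a, b] == 0]
                if not empties:
                    continue
                weights = np.array([1 + adhesion * grid[max(a - 1, 0):a + 2, max(b - 1, 0):b + 2].sum()
                                    for a, b in empties], float)
                a, b = empties[rng.choice(len(empties), p=weights / weights.sum())]
                new[a, b] = 1
            return new

        grid = np.zeros((50, 50), int)
        grid[25, 25] = 1
        for _ in range(30):
            grid = step(grid)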

  9. Hybrid automata models of cardiac ventricular electrophysiology for real-time computational applications.

    PubMed

    Andalam, Sidharta; Ramanna, Harshavardhan; Malik, Avinash; Roop, Parthasarathi; Patel, Nitish; Trew, Mark L

    2016-08-01

    Virtual heart models have been proposed for closed loop validation of safety-critical embedded medical devices, such as pacemakers. These models must react in real-time to off-the-shelf medical devices. Real-time performance can be obtained by implementing models in computer hardware, and methods of compiling classes of Hybrid Automata (HA) onto FPGA have been developed. Models of ventricular cardiac cell electrophysiology have been described using HA which capture the complex nonlinear behavior of biological systems. However, many models that have been used for closed-loop validation of pacemakers are highly abstract and do not capture important characteristics of the dynamic rate response. We developed a new HA model of cardiac cells which captures dynamic behavior and we implemented the model in hardware. This potentially enables modeling the heart with over 1 million dynamic cells, making the approach ideal for closed loop testing of medical devices.

  10. The Convallis Rule for Unsupervised Learning in Cortical Networks

    PubMed Central

    Yger, Pierre; Harris, Kenneth D.

    2013-01-01

    The phenomenology and cellular mechanisms of cortical synaptic plasticity are becoming known in increasing detail, but the computational principles by which cortical plasticity enables the development of sensory representations are unclear. Here we describe a framework for cortical synaptic plasticity termed the “Convallis rule”, mathematically derived from a principle of unsupervised learning via constrained optimization. Implementation of the rule caused a recurrent cortex-like network of simulated spiking neurons to develop rate representations of real-world speech stimuli, enabling classification by a downstream linear decoder. Applied to spike patterns used in in vitro plasticity experiments, the rule reproduced multiple results including and beyond STDP. However, STDP alone produced poorer learning performance. The mathematical form of the rule is consistent with a dual coincidence detector mechanism that has been suggested by experiments in several synaptic classes of juvenile neocortex. Based on this confluence of normative, phenomenological, and mechanistic evidence, we suggest that the rule may approximate a fundamental computational principle of the neocortex. PMID:24204224

  11. Stochastic cellular automata model of neurosphere growth: Roles of proliferative potential, contact inhibition, cell death, and phagocytosis.

    PubMed

    Sipahi, Rifat; Zupanc, Günther K H

    2018-05-14

    Neural stem and progenitor cells isolated from the central nervous system form, under specific culture conditions, clonal cell clusters known as neurospheres. The neurosphere assay has proven to be a powerful in vitro system to study the behavior of such cells and the development of their progeny. However, the theory of neurosphere growth has remained poorly understood. To overcome this limitation, we have, in the present paper, developed a cellular automata model, with which we examined the effects of proliferative potential, contact inhibition, cell death, and clearance of dead cells on growth rate, final size, and composition of neurospheres. Simulations based on this model indicated that the proliferative potential of the founder cell and its progenitors has a major influence on neurosphere size. On the other hand, contact inhibition of proliferation limits the final size, and reduces the growth rate, of neurospheres. The effect of this inhibition is particularly dramatic when a stem cell becomes encapsulated by differentiated or other non-proliferating cells, thereby suppressing any further mitotic division - despite the existing proliferative potential of the stem cell. Conversely, clearance of dead cells through phagocytosis is predicted to accelerate growth by reducing contact inhibition. A surprising prediction derived from our model is that cell death, while resulting in a decrease in growth rate and final size of neurospheres, increases the degree of differentiation of neurosphere cells. It is likely that the cellular automata model developed as part of the present investigation is applicable to the study of tissue growth in a wide range of systems. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Evaluating a Novel Cellular Automata-Based Distributed Power Management Approach for Mobile Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Adabi, Sepideh; Adabi, Sahar; Rezaee, Ali

    According to the traditional definition of Wireless Sensor Networks (WSNs), static sensors have limited the feasibility of WSNs for some kinds of applications, so mobility was introduced into WSNs. Mobile nodes in a WSN are equipped with batteries, and from the point of deployment this battery reserve becomes a valuable resource since it cannot be replenished. Hence, maximizing the network lifetime by minimizing energy consumption is an important challenge in mobile WSNs. Energy conservation can be accomplished by different approaches. In this paper, we present an energy conservation solution based on cellular automata. The main idea of this solution is to dynamically adjust the transmission range and switch between operational states of the sensor nodes.

  13. Connecting clinical and actuarial prediction with rule-based methods.

    PubMed

    Fokkema, Marjolein; Smits, Niels; Kelderman, Henk; Penninx, Brenda W J H

    2015-06-01

    Meta-analyses comparing the accuracy of clinical versus actuarial prediction have shown actuarial methods to outperform clinical methods, on average. However, actuarial methods are still not widely used in clinical practice, and there has been a call for the development of actuarial prediction methods for clinical practice. We argue that rule-based methods may be more useful than the linear main effect models usually employed in prediction studies, from a data and decision analytic as well as a practical perspective. In addition, decision rules derived with rule-based methods can be represented as fast and frugal trees, which, unlike main effects models, can be used in a sequential fashion, reducing the number of cues that have to be evaluated before making a prediction. We illustrate the usability of rule-based methods by applying RuleFit, an algorithm for deriving decision rules for classification and regression problems, to a dataset on prediction of the course of depressive and anxiety disorders from Penninx et al. (2011). The RuleFit algorithm provided a model consisting of 2 simple decision rules, requiring evaluation of only 2 to 4 cues. Predictive accuracy of the 2-rule model was very similar to that of a logistic regression model incorporating 20 predictor variables, originally applied to the dataset. In addition, the 2-rule model required, on average, evaluation of only 3 cues. Therefore, the RuleFit algorithm appears to be a promising method for creating decision tools that are less time consuming and easier to apply in psychological practice, and with accuracy comparable to traditional actuarial methods. (c) 2015 APA, all rights reserved.

  14. A SIMPLE CELLULAR AUTOMATON MODEL FOR HIGH-LEVEL VEGETATION DYNAMICS

    EPA Science Inventory

    We have produced a simple two-dimensional (ground-plan) cellular automata model of vegetation dynamics specifically to investigate high-level community processes. The model is probabilistic, with individual plant behavior determined by physiologically-based rules derived from a w...

  15. On complexity and homogeneity measures in predicting biological aggressiveness of prostate cancer; Implication of the cellular automata model of tumor growth.

    PubMed

    Tanase, Mihai; Waliszewski, Przemyslaw

    2015-12-01

    We propose a novel approach for the quantitative evaluation of aggressiveness in prostate carcinomas. The spatial distribution of cancer cell nuclei was characterized by the global spatial fractal dimensions D0, D1, and D2. Two hundred eighteen prostate carcinomas were stratified into the classes of equivalence using results of ROC analysis. A simulation of the cellular automata mix defined a theoretical frame for a specific geometric representation of the cell nuclei distribution called a local structure correlation diagram (LSCD). The LSCD and dispersion Hd were computed for each carcinoma. Data mining generated some quantitative criteria describing tumor aggressiveness. Alterations in tumor architecture along progression were associated with some changes in both shape and the quantitative characteristics of the LSCD consistent with those in the automata mix model. Low-grade prostate carcinomas with low complexity and very low biological aggressiveness are defined by the condition D0 < 1.545 and Hd < 38. High-grade carcinomas with high complexity and very high biological aggressiveness are defined by the condition D0 > 1.764 and Hd < 38. The novel homogeneity measure Hd identifies carcinomas with very low aggressiveness within the class of complexity C1 or carcinomas with very high aggressiveness in the class C7. © 2015 Wiley Periodicals, Inc.
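
    Since D0, D1, and D2 above are standard spatial fractal dimensions, a minimal box-counting sketch of the capacity dimension D0 for a binary mask of nucleus positions may help; the box sizes are illustrative, and the paper's own estimation procedure may differ.

        import numpy as np

        def box_counting_d0(mask, sizes=(2, 4, 8, 16, 32)):
            # mask: 2-D boolean array marking nucleus pixels; it should be larger
            # than the biggest box size used.
            counts = []
            for s in sizes:
                h = mask.shape[0] // s * s
                w = mask.shape[1] // s * s
                blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
            # D0 is minus the slope of log N(s) against log s
            slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
            return -slope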

  16. A novel FPGA-programmable switch matrix interconnection element in quantum-dot cellular automata

    NASA Astrophysics Data System (ADS)

    Hashemi, Sara; Rahimi Azghadi, Mostafa; Zakerolhosseini, Ali; Navi, Keivan

    2015-04-01

    The quantum-dot cellular automata (QCA) is a novel nanotechnology, promising extra-low-power, extremely dense and very high-speed structures for the construction of logical circuits at the nanoscale. In this paper, previous works on the routing elements of QCA-based FPGAs are first investigated, and then an efficient, symmetric and reliable QCA programmable switch matrix (PSM) interconnection element is introduced. This element has a simple structure and offers complete routing capability. It is implemented using a bottom-up design approach that starts from a dense and high-speed 2:1 multiplexer and utilises it to build the target PSM interconnection element. In this study, simulations of the proposed circuits are carried out using QCAdesigner, a layout and simulation tool for QCA circuits. The results demonstrate the high efficiency of the proposed designs in QCA-based FPGA routing.

  17. Optimal Hedging Rule for Reservoir Refill Operation

    NASA Astrophysics Data System (ADS)

    Wan, W.; Zhao, J.; Lund, J. R.; Zhao, T.; Lei, X.; Wang, H.

    2015-12-01

    This paper develops an optimal reservoir Refill Hedging Rule (RHR) for combined water supply and flood operation using mathematical analysis. A two-stage model is developed to formulate the trade-off between operations for conservation benefit and flood damage in the reservoir refill season. Based on the probability distribution of the maximum refill water availability at the end of the second stage, three zones are characterized according to the relationship among storage capacity, expected storage buffer (ESB), and maximum safety excess discharge (MSED). The Karush-Kuhn-Tucker conditions of the model show that the optimality of the refill operation involves making the expected marginal loss of conservation benefit from unfilling (i.e., ending storage of the refill period less than the storage capacity) as nearly equal to the expected marginal flood damage from levee overtopping downstream as possible while maintaining all constraints. This principle follows and combines the hedging rules for water supply and flood management. An RHR curve is drawn analogously to water supply hedging and flood hedging rules, showing the trade-off between the two objectives. The release decision has a linear relationship with the current water availability, implying the linearity of the RHR for a wide range of water conservation functions (linear, concave, or convex). A demonstration case shows the impacts of these factors. Larger downstream flood conveyance capacity and empty reservoir capacity allow a smaller current release, and more water can be conserved. Economic indicators of conservation benefit and flood damage compete with each other on release: the greater the economic importance of flood damage, the more water should be released in the current stage, and vice versa. Below a critical value, improving forecasts yields less water release, but an opposing effect occurs beyond this critical value. Finally, the Danjiangkou Reservoir case study shows that the RHR together with a rolling
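
    A hedged restatement of the interior optimality condition described above, in generic two-stage notation (the symbols W, R, K, B and D below are illustrative, not the authors' own):

        \max_{R}\;\; \mathbb{E}\big[B(S)\big] - \mathbb{E}\big[D(Q)\big],
        \qquad S = \min(W - R,\; K), \qquad Q = \max(W - R - K,\; 0),

        \text{interior optimum:}\qquad \mathbb{E}\big[B'(S)\big] \;=\; \mathbb{E}\big[D'(Q)\big],

    where W is the random maximum refill water availability, R the current release, K the storage capacity, B the conservation benefit, and D the downstream flood damage; the boundary cases of this condition correspond to the three zones characterized in the abstract.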

  18. Linear solvation energy relationships (LSER): 'rules of thumb' for Vi/100, π*, Βm, and αm estimation and use in aquatic toxicology

    USGS Publications Warehouse

    Hickey, James P.

    1996-01-01

    This chapter provides a listing of the increasing variety of organic moieties and heteroatom groups for which Linear Solvation Energy Relationship (LSER) values are available, and the LSER variable estimation rules. The listings include values for typical nitrogen-, sulfur- and phosphorus-containing moieties, and general organosilicon and organotin groups. The contributions by an ion pair situation to the LSER values are also offered in Table 1, allowing estimation of parameters for salts and zwitterions. The guidelines permit quick estimation of values for the four primary LSER variables Vi/100, π*, Βm, and αm by summing the contributions from a compound's component groups. The use of the guidelines and Table 1 significantly simplifies computation of values for the LSER variables for most organic compounds in the environment, including the larger compounds of environmental and biological interest.
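
    A toy illustration of the summation scheme described above; the fragment values below are placeholders and not the values tabulated in the chapter.

        # Placeholder fragment contributions (NOT the published Table 1 values):
        # each fragment maps to (Vi/100, pi*, beta_m, alpha_m) increments.
        FRAGMENTS = {
            "CH3": (0.10, -0.04, 0.00, 0.00),
            "CH2": (0.10,  0.00, 0.00, 0.00),
            "OH":  (0.05,  0.35, 0.45, 0.35),
        }

        def lser_estimate(fragment_counts):
            # Sum the contribution of every fragment, weighted by how often it occurs.
            totals = [0.0, 0.0, 0.0, 0.0]
            for frag, count in fragment_counts.items():
                for k, v in enumerate(FRAGMENTS[frag]):
                    totals[k] += count * v
            return dict(zip(("Vi/100", "pi*", "beta_m", "alpha_m"), totals))

        print(lser_estimate({"CH3": 1, "CH2": 1, "OH": 1}))   # crude sketch for an ethanol-like input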

  19. A comparison of Heuristic method and Llewellyn’s rules for identification of redundant constraints

    NASA Astrophysics Data System (ADS)

    Estiningsih, Y.; Farikhin; Tjahjana, R. H.

    2018-03-01

    An important technique in linear programming is the modelling and solving of practical optimization problems. Redundant constraints are considered for their effects on general linear programming problems. Identifying and removing redundant constraints avoids the unnecessary calculations associated with solving the corresponding linear programming problem. Many methods have been proposed for the identification of redundant constraints. This paper presents a comparison of the Heuristic method and Llewellyn’s rules for the identification of redundant constraints.
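
    As a baseline against which such identification rules are usually judged, the following sketch checks redundancy by brute force: constraint i of A x <= b is redundant if maximising its left-hand side subject to the remaining constraints still cannot exceed b_i. The example system is illustrative.

        import numpy as np
        from scipy.optimize import linprog

        def redundant_constraints(A, b):
            # Constraint i (A[i] @ x <= b[i]) is redundant if max A[i] @ x subject to
            # the other constraints is still <= b[i]; linprog minimises, so negate it.
            redundant = []
            m = len(b)
            for i in range(m):
                keep = [k for k in range(m) if k != i]
                res = linprog(c=-A[i], A_ub=A[keep], b_ub=b[keep],
                              bounds=[(None, None)] * A.shape[1], method="highs")
                if res.status == 0 and -res.fun <= b[i] + 1e-9:
                    redundant.append(i)
            return redundant

        A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
        b = np.array([1.0, 1.0, 3.0])           # x + y <= 3 is implied by x <= 1, y <= 1
        print(redundant_constraints(A, b))      # -> [2]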

  20. Highly scalable and robust rule learner: performance evaluation and comparison.

    PubMed

    Kurgan, Lukasz A; Cios, Krzysztof J; Dick, Scott

    2006-02-01

    Business intelligence and bioinformatics applications increasingly require the mining of datasets consisting of millions of data points, or crafting real-time enterprise-level decision support systems for large corporations and drug companies. In all cases, there needs to be an underlying data mining system, and this mining system must be highly scalable. To this end, we describe a new rule learner called DataSqueezer. The learner belongs to the family of inductive supervised rule extraction algorithms. DataSqueezer is a simple, greedy, rule builder that generates a set of production rules from labeled input data. In spite of its relative simplicity, DataSqueezer is a very effective learner. The rules generated by the algorithm are compact, comprehensible, and have accuracy comparable to rules generated by other state-of-the-art rule extraction algorithms. The main advantages of DataSqueezer are very high efficiency, and missing data resistance. DataSqueezer exhibits log-linear asymptotic complexity with the number of training examples, and it is faster than other state-of-the-art rule learners. The learner is also robust to large quantities of missing data, as verified by extensive experimental comparison with the other learners. DataSqueezer is thus well suited to modern data mining and business intelligence tasks, which commonly involve huge datasets with a large fraction of missing data.

  1. The role of the interaction network in the emergence of diversity of behavior

    PubMed Central

    Tabacof, Pedro; Von Zuben, Fernando J.

    2017-01-01

    How can systems in which individuals’ inner workings are very similar to each other, as neural networks or ant colonies, produce so many qualitatively different behaviors, giving rise to roles and specialization? In this work, we bring new perspectives to this question by focusing on the underlying network that defines how individuals in these systems interact. We applied a genetic algorithm to optimize rules and connections of cellular automata in order to solve the density classification task, a classical problem used to study emergent behaviors in decentralized computational systems. The networks used were all generated by the introduction of shortcuts in an originally regular topology, following the small-world model. Even though all cells follow the exact same rules, we observed the existence of different classes of cells’ behaviors in the best cellular automata found—most cells were responsible for memory and others for integration of information. Through the analysis of structural measures and patterns of connections (motifs) in successful cellular automata, we observed that the distribution of shortcuts between distant regions and the speed in which a cell can gather information from different parts of the system seem to be the main factors for the specialization we observed, demonstrating how heterogeneity in a network can create heterogeneity of behavior. PMID:28234962
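
    A minimal sketch of the density classification setup on a small-world interaction network; a plain majority-vote update is used purely to illustrate the task, whereas the rules in the study were evolved with a genetic algorithm.

        import numpy as np
        import networkx as nx

        def density_classify(states, g, steps=50):
            # Synchronous updates: each cell adopts the majority state of itself and
            # its neighbours in the small-world graph g. The task is solved if the
            # system converges to all-1 when the initial density of 1s exceeds 1/2,
            # and to all-0 otherwise.
            states = np.asarray(states).copy()
            for _ in range(steps):
                new = states.copy()
                for i in g.nodes:
                    nbrs = list(g.neighbors(i)) + [i]
                    new[i] = int(states[nbrs].sum() * 2 > len(nbrs))
                states = new
            return states

        rng = np.random.default_rng(1)
        g = nx.watts_strogatz_graph(n=149, k=6, p=0.2, seed=1)   # ring lattice with shortcuts
        init = (rng.random(149) < 0.55).astype(int)
        print(init.mean(), density_classify(init, g).mean())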

  2. Using learning automata to determine proper subset size in high-dimensional spaces

    NASA Astrophysics Data System (ADS)

    Seyyedi, Seyyed Hossein; Minaei-Bidgoli, Behrouz

    2017-03-01

    In this paper, we offer a new method called FSLA (Finding the best candidate Subset using Learning Automata), which combines the filter and wrapper approaches for feature selection in high-dimensional spaces. Considering the difficulties of dimension reduction in high-dimensional spaces, FSLA's multi-objective functionality is to determine, in an efficient manner, a feature subset that leads to an appropriate tradeoff between the learning algorithm's accuracy and efficiency. First, using an existing weighting function, the feature list is sorted and selected subsets of the list of different sizes are considered. Then, a learning automaton verifies the performance of each subset when it is used as the input space of the learning algorithm and estimates its fitness based on the algorithm's accuracy and the subset size, which determines the algorithm's efficiency. Finally, FSLA introduces the fittest subset as the best choice. We tested FSLA in the framework of text classification. The results confirm its promising performance in attaining the identified goal.

  3. Study on queueing behavior in pedestrian evacuation by extended cellular automata model

    NASA Astrophysics Data System (ADS)

    Hu, Jun; You, Lei; Zhang, Hong; Wei, Juan; Guo, Yangyong

    2018-01-01

    This paper proposes a pedestrian evacuation model for effective simulation of evacuation efficiency based on extended cellular automata. In the model, pedestrians' momentary transition probability to a target position is defined in terms of the floor field and queueing time, and the critical time is defined as the waiting time threshold in a queue. Queueing time and critical time are derived using Fractal Brownian Motion through analysis of pedestrian arrival characteristics. Simulations using the platform and actual evacuations were conducted to study the relationships among system evacuation time, average system velocity, pedestrian density, flow rate, and critical time. The results demonstrate that at low pedestrian density, evacuation efficiency can be improved through adoption of the shortest route strategy, and critical time has an inverse relationship with average system velocity. Conversely, at higher pedestrian densities, it is better to adopt the shortest queueing time strategy, and critical time is inversely related to flow rate.
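
    A hedged sketch of a floor-field-style transition probability extended with a queueing-time term, as the abstract describes; the exponential form and the weights beta and gamma are illustrative assumptions, not the authors' formulation.

        import numpy as np

        def transition_probs(floor_field, queue_time, cell, beta=2.0, gamma=0.5):
            # Probability of moving from `cell` to each von Neumann neighbour:
            # a lower static floor-field value (distance to exit) and a shorter
            # expected queueing time both make a target cell more attractive.
            i, j = cell
            targets, scores = [], []
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                a, b = i + di, j + dj
                if 0 <= a < floor_field.shape[0] and 0 <= b < floor_field.shape[1]:
                    targets.append((a, b))
                    scores.append(np.exp(-beta * floor_field[a, b] - gamma * queue_time[a, b]))
            scores = np.array(scores)
            return targets, scores / scores.sum()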

  4. Simple rules for a "simple" nervous system? Molecular and biomathematical approaches to enteric nervous system formation and malformation.

    PubMed

    Newgreen, Donald F; Dufour, Sylvie; Howard, Marthe J; Landman, Kerry A

    2013-10-01

    We review morphogenesis of the enteric nervous system from migratory neural crest cells, and defects of this process such as Hirschsprung disease, centering on cell motility and assembly, and cell adhesion and extracellular matrix molecules, along with cell proliferation and growth factors. We then review continuum and agent-based (cellular automata) models with rules of cell movement and logistical proliferation. Both movement and proliferation at the individual cell level are modeled with stochastic components from which stereotyped outcomes emerge at the population level. These models reproduced the wave-like colonization of the intestine by enteric neural crest cells, and several new properties emerged, such as colonization by frontal expansion, which were later confirmed biologically. These models predict a surprising level of clonal heterogeneity both in terms of number and distribution of daughter cells. Biologically, migrating cells form stable chains made up of unstable cells, but this is not seen in the initial model. We outline additional rules for cell differentiation into neurons, axon extension, cell-axon and cell-cell adhesions, chemotaxis and repulsion which can reproduce chain migration. After the migration stage, the cells re-arrange as a network of ganglia. Changes in cell adhesion molecules parallel this, and we describe additional rules based on Steinberg's Differential Adhesion Hypothesis, reflecting changing levels of adhesion in neural crest cells and neurons. This was able to reproduce enteric ganglionation in a model. Mouse mutants with disturbances of enteric nervous system morphogenesis are discussed, and these suggest future refinement of the models. The modeling suggests a relatively simple set of cell behavioral rules could account for complex patterns of morphogenesis. The model has allowed the proposal that Hirschsprung disease is mostly an enteric neural crest cell proliferation defect, not a defect of cell migration. In addition

  5. A solution to the biodiversity paradox by logical deterministic cellular automata.

    PubMed

    Kalmykov, Lev V; Kalmykov, Vyacheslav L

    2015-06-01

    The paradox of biological diversity is the key problem of theoretical ecology. The paradox consists in the contradiction between the competitive exclusion principle and the observed biodiversity. The principle is important as the basis for ecological theory. On a relatively simple model we show a mechanism of indefinite coexistence of complete competitors which violates the known formulations of the competitive exclusion principle. This mechanism is based on timely recovery of limiting resources and their spatio-temporal allocation between competitors. Because of the limitations of black-box modeling, it has been difficult to formulate the exclusion principle correctly. Our white-box multiscale model of two-species competition is based on logical deterministic individual-based cellular automata. This approach provides an automatic deductive inference on the basis of a system of axioms, and gives a direct insight into mechanisms of the studied system. It is one of the most promising methods of artificial intelligence. We reformulate and generalize the competitive exclusion principle and explain why this formulation provides a solution of the biodiversity paradox. In addition, we propose a principle of competitive coexistence.

  6. Statistical learning and the challenge of syntax: Beyond finite state automata

    NASA Astrophysics Data System (ADS)

    Elman, Jeff

    2003-10-01

    Over the past decade, it has been clear that even very young infants are sensitive to the statistical structure of language input presented to them, and use the distributional regularities to induce simple grammars. But can such statistically-driven learning also explain the acquisition of more complex grammar, particularly when the grammar includes recursion? Recent claims (e.g., Hauser, Chomsky, and Fitch, 2002) have suggested that the answer is no, and that at least recursion must be an innate capacity of the human language acquisition device. In this talk evidence will be presented that indicates that, in fact, statistically-driven learning (embodied in recurrent neural networks) can indeed enable the learning of complex grammatical patterns, including those that involve recursion. When the results are generalized to idealized machines, it is found that the networks are at least equivalent to Push Down Automata. Perhaps more interestingly, with limited and finite resources (such as are presumed to exist in the human brain) these systems demonstrate patterns of performance that resemble those in humans.

  7. Simultaneous Optimization of Decisions Using a Linear Utility Function.

    ERIC Educational Resources Information Center

    Vos, Hans J.

    1990-01-01

    An approach is presented to simultaneously optimize decision rules for combinations of elementary decisions through a framework derived from Bayesian decision theory. The developed linear utility model for selection-mastery decisions was applied to a sample of 43 first year medical students to illustrate the procedure. (SLD)

  8. A stochastic parameterization for deep convection using cellular automata

    NASA Astrophysics Data System (ADS)

    Bengtsson, L.; Steinheimer, M.; Bechtold, P.; Geleyn, J.

    2012-12-01

    Cumulus parameterizations used in most operational weather and climate models today are based on the mass-flux concept which took form in the early 1970s. In such schemes it is assumed that a unique relationship exists between the ensemble-average of the sub-grid convection, and the instantaneous state of the atmosphere in a vertical grid box column. However, such a relationship is unlikely to be described by a simple deterministic function (Palmer, 2011). Thus, because of the statistical nature of the parameterization challenge, it has been recognized by the community that it is important to introduce stochastic elements to the parameterizations (for instance: Plant and Craig, 2008, Khouider et al. 2010, Frenkel et al. 2011, Bentsson et al. 2011, but the list is far from exhaustive). There are undoubtedly many ways in which stochasticity can enter new developments. In this study we use a two-way interacting cellular automata (CA), as its intrinsic nature possesses many qualities interesting for deep convection parameterization. In the one-dimensional entraining plume approach, there is no parameterization of horizontal transport of heat, moisture or momentum due to cumulus convection. In reality, mass transport due to gravity waves that propagate in the horizontal can trigger new convection, important for the organization of deep convection (Huang, 1988). The self-organizational characteristics of the CA allow for lateral communication between adjacent NWP model grid-boxes, and temporal memory. Thus the CA scheme used in this study contains three interesting components for representation of cumulus convection, which are not present in the traditional one-dimensional bulk entraining plume method: horizontal communication, memory and stochasticity. The scheme is implemented in the high resolution regional NWP model ALARO, and simulations show enhanced organization of convective activity along squall-lines. Probabilistic evaluation demonstrates an enhanced spread in

  9. Simple and multiple linear regression: sample size considerations.

    PubMed

    Hanley, James A

    2016-11-01

    The suggested "two subjects per variable" (2SPV) rule of thumb in the Austin and Steyerberg article is a chance to bring out some long-established and quite intuitive sample size considerations for both simple and multiple linear regression. This article distinguishes two of the major uses of regression models that imply very different sample size considerations, neither served well by the 2SPV rule. The first is etiological research, which contrasts mean Y levels at differing "exposure" (X) values and thus tends to focus on a single regression coefficient, possibly adjusted for confounders. The second research genre guides clinical practice. It addresses Y levels for individuals with different covariate patterns or "profiles." It focuses on the profile-specific (mean) Y levels themselves, estimating them via linear compounds of regression coefficients and covariates. By drawing on long-established closed-form variance formulae that lie beneath the standard errors in multiple regression, and by rearranging them for heuristic purposes, one arrives at quite intuitive sample size considerations for both research genres. Copyright © 2016 Elsevier Inc. All rights reserved.
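
    One of the closed-form variance expressions alluded to above, written here in standard textbook notation rather than taken from the article, is

        \operatorname{Var}(\hat{\beta}_j) \;=\; \frac{\sigma^2}{n \,\operatorname{Var}(x_j)\,\bigl(1 - R_j^2\bigr)},

    where R_j^2 is the squared multiple correlation of x_j with the other covariates; a target standard error then implies a required sample size n by inverting this expression.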

  10. Identifying Flow Networks in a Karstified Aquifer by Application of the Cellular Automata-Based Deterministic Inversion Method (Lez Aquifer, France)

    NASA Astrophysics Data System (ADS)

    Fischer, P.; Jardani, A.; Wang, X.; Jourde, H.; Lecoq, N.

    2017-12-01

    The distributed modeling of flow paths within karstic and fractured fields remains a complex task because of the high dependence of the hydraulic responses on the relative locations between observational boreholes and interconnected fractures and karstic conduits that control the main flow of the hydrosystem. The inverse problem in a distributed model is one alternative approach to interpret the hydraulic test data by mapping the karstic networks and fractured areas. In this work, we developed a Bayesian inversion approach, the Cellular Automata-based Deterministic Inversion (CADI) algorithm, to infer the spatial distribution of hydraulic properties in a structurally constrained model. This method distributes hydraulic properties along linear structures (i.e., flow conduits) and iteratively modifies the structural geometry of this conduit network to progressively match the observed hydraulic data to the modeled ones. As a result, this method produces a conductivity model that is composed of a discrete conduit network embedded in the background matrix, capable of producing the same flow behavior as the investigated hydrologic system. The method is applied to invert a set of multiborehole hydraulic tests collected from a hydraulic tomography experiment conducted at the Terrieu field site in the Lez aquifer, Southern France. The emergent model shows high consistency with field observations of hydraulic connections between boreholes. Furthermore, it provides a geologically realistic pattern of flow conduits. This method is therefore of considerable value toward an enhanced distributed modeling of fractured and karstified aquifers.

  11. Design of a fault-tolerant reversible control unit in molecular quantum-dot cellular automata

    NASA Astrophysics Data System (ADS)

    Bahadori, Golnaz; Houshmand, Monireh; Zomorodi-Moghadam, Mariam

    Quantum-dot cellular automata (QCA) is a promising emerging nanotechnology that has been attracting considerable attention due to its small feature size, ultra-low power consumption, and high clock frequency. Therefore, there have been many efforts to design computational units based on this technology. Despite these advantages of QCA-based nanotechnologies, their implementation is susceptible to a high error rate. On the other hand, using reversible computing leads to zero bit erasures and no energy dissipation. As reversible computation does not lose information, fault detection happens with a high probability. In this paper, we first propose a fault-tolerant control unit using reversible gates which improves on the previous design. The proposed design is then synthesized to the QCA technology and simulated with the QCADesigner tool. Evaluation results demonstrate the performance of the proposed approach.

  12. Crash energy management on the base of Movable cellular automata method

    NASA Astrophysics Data System (ADS)

    Psakhie, Serguei; Dmitriev, Andrei; Shilko, Evgueni; Tatarintsev, Evgueni; Korostelev, Serguei

    2001-06-01

    One of the main problems of materials science is increasing the viability of structures under dynamic loading. In general, a solution is the management of the transformation of the loading energy into the energy of destroying the least important parts and details of the structure. It has to be noted that a similar problem also exists in materials science itself, since a majority of modern materials are heterogeneous and have a complex internal structure. To optimize this structure for working under dynamic loading it is necessary to take into account the redistribution of elastic energy, including phase transformation, generation and accumulation of micro-damage, etc. Since real experiments on destroying complex objects are expensive and obtaining detailed information is often associated with essential difficulties, computer modeling methods are used to solve such problems. As a rule, these are methods of continuum mechanics. Although essential achievements have been obtained on the basis of these methods, the continuum approach has several limitations, connected first of all with the description of damage generation, the formation and development of cracks, and mass mixing effects. These problems may be solved on the basis of the Movable Cellular Automata (MCA) method, which has been successfully used for modeling fracture of different materials and structures. In this paper, the behavior and the peculiarities of failure of complex structures and materials under dynamic loading are studied on the basis of computer modeling. The results show that sometimes even small changes of the internal structure lead to a significant increase in the viability of complex structures and materials. This is due to the change of elastic energy fluxes during dynamic loading. This effect may be explained by the fact that elastic energy fluxes define the current stress concentration. Namely, because the area of inclusions is subjected

  13. Missing-value estimation using linear and non-linear regression with Bayesian gene selection.

    PubMed

    Zhou, Xiaobo; Wang, Xiaodong; Dougherty, Edward R

    2003-11-22

    Data from microarray experiments are usually in the form of large matrices of expression levels of genes under different experimental conditions. Owing to various reasons, there are frequently missing values. Estimating these missing values is important because they affect downstream analysis, such as clustering, classification and network design. Several methods of missing-value estimation are in use. The problem has two parts: (1) selection of genes for estimation and (2) design of an estimation rule. We propose Bayesian variable selection to obtain genes to be used for estimation, and employ both linear and nonlinear regression for the estimation rule itself. Fast implementation issues for these methods are discussed, including the use of QR decomposition for parameter estimation. The proposed methods are tested on data sets arising from hereditary breast cancer and small round blue-cell tumors. The results compare very favorably with currently used methods based on the normalized root-mean-square error. The appendix is available from http://gspsnap.tamu.edu/gspweb/zxb/missing_zxb/ (user: gspweb; passwd: gsplab).
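
    A hedged sketch of the regression step only (the Bayesian gene selection is not shown): a target gene's missing entries are predicted from selected predictor genes with a QR-based least-squares fit, consistent with the abstract's mention of QR decomposition; the function and argument names are illustrative.

        import numpy as np

        def impute_linear(expr, target, predictors):
            # expr: genes x conditions matrix with np.nan for missing values.
            # Fit target ~ predictors on conditions where the target is observed,
            # using a QR-based least-squares solve, then fill the missing conditions.
            y = expr[target]
            X = expr[predictors].T                        # conditions x predictors
            obs = ~np.isnan(y) & ~np.isnan(X).any(axis=1)
            Q, R = np.linalg.qr(np.column_stack((np.ones(obs.sum()), X[obs])))
            beta = np.linalg.solve(R, Q.T @ y[obs])
            miss = np.isnan(y) & ~np.isnan(X).any(axis=1)
            filled = y.copy()
            filled[miss] = np.column_stack((np.ones(miss.sum()), X[miss])) @ beta
            return filled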

  14. Rule-Based Event Processing and Reaction Rules

    NASA Astrophysics Data System (ADS)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.

  15. 77 FR 52977 - Regulatory Capital Rules: Advanced Approaches Risk-Based Capital Rule; Market Risk Capital Rule

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ... Corporation 12 CFR Parts 324, 325 Regulatory Capital Rules: Advanced Approaches Risk-Based Capital Rule... 325 RIN 3064-AD97 Regulatory Capital Rules: Advanced Approaches Risk-Based Capital Rule; Market Risk... the agencies' current capital rules. In this NPR (Advanced Approaches and Market Risk NPR) the...

  16. A Hierarchy of Proof Rules for Checking Differential Invariance of Algebraic Sets

    DTIC Science & Technology

    2014-11-01

    Khalil Ghorbal, Andrew Sogokon, André Platzer. A Hierarchy of Proof Rules for Checking Differential Invariance of Algebraic Sets. CMU, November 2014. (Indexed snippet consists only of reference-list fragments, e.g. Tarski's decision method for elementary algebra and geometry, 1951.)

  17. Quasi-classical modeling of molecular quantum-dot cellular automata multidriver gates

    NASA Astrophysics Data System (ADS)

    Rahimi, Ehsan; Nejad, Shahram Mohammad

    2012-05-01

    Molecular quantum-dot cellular automata (mQCA) has received considerable attention in nanoscience. Unlike the current-based molecular switches, where the digital data is represented by the on/off states of the switches, in mQCA devices, binary information is encoded in charge configuration within molecular redox centers. The mQCA paradigm allows high device density and ultra-low power consumption. Digital mQCA gates are the building blocks of circuits in this paradigm. Design and analysis of these gates require quantum chemical calculations, which are demanding in computer time and memory. Therefore, developing simple models to probe mQCA gates is of paramount importance. We derive a semi-classical model to study the steady-state output polarization of mQCA multidriver gates, directly from the two-state approximation in electron transfer theory. The accuracy and validity of this model are analyzed using full quantum chemistry calculations. A complete set of logic gates, including inverters and minority voters, are implemented to provide an appropriate test bench in the two-dot mQCA regime. We also briefly discuss how the QCADesigner tool could find its application in simulation of mQCA devices.

  18. Simulating Flaring Events via an Intelligent Cellular Automata Mechanism

    NASA Astrophysics Data System (ADS)

    Dimitropoulou, M.; Vlahos, L.; Isliker, H.; Georgoulis, M.

    2010-07-01

    We simulate flaring events through a Cellular Automaton (CA) model, in which, for the first time, we use observed vector magnetograms as initial conditions. After non-linear force free extrapolation of the magnetic field from the vector magnetograms, we identify magnetic discontinuities, using two alternative criteria: (1) the average magnetic field gradient, or (2) the normalized magnetic field curl (i.e. the current). Magnetic discontinuities are identified at the grid-sites where the magnetic field gradient or curl exceeds a specified threshold. We then relax the magnetic discontinuities according to the rules of Lu and Hamilton (1991) or Lu et al. (1993), i.e. we redistribute the magnetic field locally so that the discontinuities disappear. In order to simulate the flaring events, we consider several alternative scenarios with regard to: (1) the threshold above which magnetic discontinuities are identified (applying low, high, and height-dependent threshold values); (2) the driving process that occasionally causes new discontinuities (at randomly chosen grid sites, magnetic field increments are added that are perpendicular, or possibly also parallel, to the existing magnetic field). We address the question whether the coronal active region magnetic fields can indeed be considered to be in the state of self-organized criticality (SOC).

  19. Jahn-Teller effect versus Hund's rule coupling in C60N-

    NASA Astrophysics Data System (ADS)

    Wehrli, S.; Sigrist, M.

    2007-09-01

    We propose variational states for the ground state and the low-energy collective rotator excitations in negatively charged C60N- ions (N=1,…,5) . The approach includes the linear electron-phonon coupling and the Coulomb interaction on the same level. The electron-phonon coupling is treated within the effective mode approximation which yields the linear t1u⊗Hg Jahn-Teller problem whereas the Coulomb interaction gives rise to Hund’s rule coupling for N=2,3,4 . The Hamiltonian has accidental SO(3) symmetry which allows an elegant formulation in terms of angular momenta. Trial states are constructed from coherent states and using projection operators onto angular momentum subspaces which results in good variational states for the complete parameter range. The evaluation of the corresponding energies is to a large extent analytical. We use the approach for a detailed analysis of the competition between Jahn-Teller effect and Hund’s rule coupling, which determines the spin state for N=2,3,4 . We calculate the low-spin-high-spin gap for N=2,3,4 as a function of the Hund’s rule coupling constant J . We find that the experimentally measured gaps suggest a coupling constant in the range J=60-80meV . Using a finite value for J , we recalculate the ground state energies of the C60N- ions and find that the Jahn-Teller energy gain is partly counterbalanced by the Hund’s rule coupling. In particular, the ground state energies for N=2,3,4 are almost equal.

  20. Pushing the rules: effects and aftereffects of deliberate rule violations.

    PubMed

    Wirth, Robert; Pfister, Roland; Foerster, Anna; Huestegge, Lynn; Kunde, Wilfried

    2016-09-01

    Most of our daily life is organized around rules and social norms. But what makes rules so special? And what if one were to break a rule intentionally? Can we simply free ourselves from the present set of rules or do we automatically adhere to them? How do rule violations influence subsequent behavior? To investigate the effects and aftereffects of violating a simple S-R rule, we conducted three experiments that investigated continuous finger-tracking responses on an iPad. Our experiments show that rule violations are distinct from rule-based actions in both response times and movement trajectories: they take longer to initiate and execute, and their movement trajectory is heavily contorted. The data not only show differences between the two types of response (rule-based vs. violation), but also yield a characteristic pattern of aftereffects in the case of rule violations: rule violations do not trigger adaptation effects that render further rule violations less difficult; instead, every rule violation poses repeated effort on the agent. The study represents a first step towards understanding the signature and underlying mechanisms of deliberate rule violations: they cannot be acted out by themselves, but require the activation of the original rule first. Consequently, they are best understood as reformulations of existing rules that are not accessible on their own, but need to be constantly derived from the original rule, with an add-on that might entail an active tendency to steer away from mental representations that reflect (socially) unwanted behavior.

  1. Mechanisms of rule acquisition and rule following in inductive reasoning.

    PubMed

    Crescentini, Cristiano; Seyed-Allaei, Shima; De Pisapia, Nicola; Jovicich, Jorge; Amati, Daniele; Shallice, Tim

    2011-05-25

    Despite the recent interest in the neuroanatomy of inductive reasoning processes, the regional specificity within prefrontal cortex (PFC) for the different mechanisms involved in induction tasks remains to be determined. In this study, we used fMRI to investigate the contribution of PFC regions to rule acquisition (rule search and rule discovery) and rule following. Twenty-six healthy young adult participants were presented with a series of images of cards, each consisting of a set of circles numbered in sequence with one colored blue. Participants had to predict the position of the blue circle on the next card. The rules that had to be acquired pertained to the relationship among succeeding stimuli. Responses given by subjects were categorized in a series of phases either tapping rule acquisition (responses given up to and including rule discovery) or rule following (correct responses after rule acquisition). Mid-dorsolateral PFC (mid-DLPFC) was active during rule search and remained active until successful rule acquisition. By contrast, rule following was associated with activation in temporal, motor, and medial/anterior prefrontal cortex. Moreover, frontopolar cortex (FPC) was active throughout the rule acquisition and rule following phases before a rule became familiar. We attributed activation in mid-DLPFC to hypothesis generation and in FPC to integration of multiple separate inferences. The present study provides evidence that brain activation during inductive reasoning involves a complex network of frontal processes and that different subregions respond during rule acquisition and rule following phases.

  2. 18 CFR 385.104 - Rule of construction (Rule 104).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Rule of construction (Rule 104). 385.104 Section 385.104 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... Definitions § 385.104 Rule of construction (Rule 104). To the extent that the text of a rule is inconsistent...

  3. 18 CFR 385.104 - Rule of construction (Rule 104).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Rule of construction (Rule 104). 385.104 Section 385.104 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... Definitions § 385.104 Rule of construction (Rule 104). To the extent that the text of a rule is inconsistent...

  4. Phonological reduplication in sign language: Rules rule

    PubMed Central

    Berent, Iris; Dupuis, Amanda; Brentari, Diane

    2014-01-01

    Productivity—the hallmark of linguistic competence—is typically attributed to algebraic rules that support broad generalizations. Past research on spoken language has documented such generalizations in both adults and infants. But whether algebraic rules form part of the linguistic competence of signers remains unknown. To address this question, here we gauge the generalization afforded by American Sign Language (ASL). As a case study, we examine reduplication (X→XX)—a rule that, inter alia, generates ASL nouns from verbs. If signers encode this rule, then they should freely extend it to novel syllables, including ones with features that are unattested in ASL. And since reduplicated disyllables are preferred in ASL, such a rule should favor novel reduplicated signs. Novel reduplicated signs should thus be preferred to nonreduplicative controls (in rating), and consequently, such stimuli should also be harder to classify as nonsigns (in the lexical decision task). The results of four experiments support this prediction. These findings suggest that the phonological knowledge of signers includes powerful algebraic rules. The convergence between these conclusions and previous evidence for phonological rules in spoken language suggests that the architecture of the phonological mind is partly amodal. PMID:24959158

  5. From Feynman rules to conserved quantum numbers, I

    NASA Astrophysics Data System (ADS)

    Nogueira, P.

    2017-05-01

    In the context of Quantum Field Theory (QFT) there is often the need to find sets of graph-like diagrams (the so-called Feynman diagrams) for a given physical model. If negative, the answer to the related problem 'Are there any diagrams with this set of external fields?' may settle certain physical questions at once. Here the latter problem is formulated in terms of a system of linear diophantine equations derived from the Lagrangian density, from which necessary conditions for the existence of the required diagrams may be obtained. Those conditions are equalities that look like either linear diophantine equations or linear modular (i.e. congruence) equations, and may be found by means of fairly simple algorithms that involve integer computations. The diophantine equations so obtained represent (particle) number conservation rules, and are related to the conserved (additive) quantum numbers that may be assigned to the fields of the model.

  6. Automatic Learning of Fine Operating Rules for Online Power System Security Control.

    PubMed

    Sun, Hongbin; Zhao, Feng; Wang, Hao; Wang, Kang; Jiang, Weiyong; Guo, Qinglai; Zhang, Boming; Wehenkel, Louis

    2016-08-01

Fine operating rules for security control and an automatic system for their online discovery were developed to adapt to the development of smart grids. The automatic system uses the real-time system state to determine critical flowgates, and then a continuation power flow-based security analysis is used to compute the initial transfer capability of critical flowgates. Next, the system applies Monte Carlo simulation of expected short-term operating-condition changes, feature selection, and a linear least-squares fit of the fine operating rules. The proposed system was validated both on an academic test system and on a provincial power system in China. The results indicate that the derived rules are accurate, well interpretable, and suitable for real-time power system security control. The use of high-performance computing systems enables these fine operating rules to be refreshed online every 15 min.
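    As a rough illustration of the final fitting step described above (not the authors' implementation), the sketch below approximates a flowgate's transfer capability as a linear function of a few operating features sampled by Monte Carlo; the feature names, the underlying relationship, and the noise level are all invented for the example.

      import numpy as np

      # Toy stand-in for the rule-fitting step: approximate a flowgate's transfer
      # capability as a linear function of a few selected operating features.
      # The "true" relationship and the feature names are invented for illustration.
      rng = np.random.default_rng(0)

      n_samples = 500                                    # Monte Carlo samples of operating conditions
      load_level = rng.uniform(0.6, 1.2, n_samples)      # hypothetical features
      wind_output = rng.uniform(0.0, 0.5, n_samples)
      tie_line_flow = rng.uniform(-0.3, 0.3, n_samples)

      # Pretend these capabilities came from a security analysis run per sample.
      capability = (1.8 - 0.9 * load_level + 0.4 * wind_output
                    - 0.6 * tie_line_flow + rng.normal(0, 0.02, n_samples))

      X = np.column_stack([np.ones(n_samples), load_level, wind_output, tie_line_flow])
      coef, *_ = np.linalg.lstsq(X, capability, rcond=None)

      print("fitted operating rule:")
      print(f"  capability ~ {coef[0]:.3f} {coef[1]:+.3f}*load "
            f"{coef[2]:+.3f}*wind {coef[3]:+.3f}*tie_flow")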

  7. Simulation of changes in heavy metal contamination in farmland soils of a typical manufacturing center through logistic-based cellular automata modeling.

    PubMed

    Qiu, Menglong; Wang, Qi; Li, Fangbai; Chen, Junjian; Yang, Guoyi; Liu, Liming

    2016-01-01

    A customized logistic-based cellular automata (CA) model was developed to simulate changes in heavy metal contamination (HMC) in farmland soils of Dongguan, a manufacturing center in Southern China, and to discover the relationship between HMC and related explanatory variables (continuous and categorical). The model was calibrated through the simulation and validation of HMC in 2012. Thereafter, the model was implemented for the scenario simulation of development alternatives for HMC in 2022. The HMC in 2002 and 2012 was determined through soil tests and cokriging. Continuous variables were divided into two groups by odds ratios. Positive variables (odds ratios >1) included the Nemerow synthetic pollution index in 2002, linear drainage density, distance from the city center, distance from the railway, slope, and secondary industrial output per unit of land. Negative variables (odds ratios <1) included elevation, distance from the road, distance from the key polluting enterprises, distance from the town center, soil pH, and distance from bodies of water. Categorical variables, including soil type, parent material type, organic content grade, and land use type, also significantly influenced HMC according to Wald statistics. The relative operating characteristic and kappa coefficients were 0.91 and 0.64, respectively, which proved the validity and accuracy of the model. The scenario simulation shows that the government should not only implement stricter environmental regulation but also strengthen the remediation of the current polluted area to effectively mitigate HMC.
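    The grouping of continuous drivers into positive and negative variables follows directly from the logistic coefficients, since the odds ratio of a predictor is exp(beta). The minimal sketch below illustrates this on synthetic data; the predictor names and the simulated relationship are assumptions, not the study's dataset.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Illustration of grouping continuous drivers by odds ratio:
      # exp(coefficient) > 1 marks a "positive" variable, < 1 a "negative" one.
      rng = np.random.default_rng(1)
      n = 2000
      pollution_2002 = rng.normal(0, 1, n)    # hypothetical standardized predictors
      dist_to_road = rng.normal(0, 1, n)
      logit = 1.2 * pollution_2002 - 0.8 * dist_to_road
      y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # contaminated or not

      X = np.column_stack([pollution_2002, dist_to_road])
      model = LogisticRegression().fit(X, y)

      for name, beta in zip(["pollution_2002", "dist_to_road"], model.coef_[0]):
          odds_ratio = np.exp(beta)
          group = "positive" if odds_ratio > 1 else "negative"
          print(f"{name}: OR = {odds_ratio:.2f} ({group} variable)")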

  8. Temperature Effects on Olive Fruit Fly Infestation in the FlySim Cellular Automata Model

    NASA Astrophysics Data System (ADS)

    Bruno, Vincenzo; Baldacchini, Valerio; di Gregorio, Salvatore

FlySim is a Cellular Automata model developed for simulating infestation of olive fruit flies (Bactrocera oleae) on olive (Olea europaea) groves. The flies move into the groves looking for mature olives where eggs are spawned. This serious agricultural problem is mainly tackled by using chemical agents at the first signs of infestation, but the market strongly demands organic production with few or no chemicals. Oil made from infested olives is poor in quality, nor are the olives suitable for sale in stores. The FlySim model simulates the diffusion of flies looking for mature olives and the growth of the fly population driven by atmospheric conditions. Foreseeing an infestation is the best way to prevent it and to reduce the need for chemicals in agriculture. In this work we investigated the effects of temperature on olive fruit flies and the resulting infestation during late spring and summer.

  9. Predicting non-linear dynamics by stable local learning in a recurrent spiking neural network.

    PubMed

    Gilra, Aditya; Gerstner, Wulfram

    2017-11-27

    The brain needs to predict how the body reacts to motor commands, but how a network of spiking neurons can learn non-linear body dynamics using local, online and stable learning rules is unclear. Here, we present a supervised learning scheme for the feedforward and recurrent connections in a network of heterogeneous spiking neurons. The error in the output is fed back through fixed random connections with a negative gain, causing the network to follow the desired dynamics. The rule for Feedback-based Online Local Learning Of Weights (FOLLOW) is local in the sense that weight changes depend on the presynaptic activity and the error signal projected onto the postsynaptic neuron. We provide examples of learning linear, non-linear and chaotic dynamics, as well as the dynamics of a two-link arm. Under reasonable approximations, we show, using the Lyapunov method, that FOLLOW learning is uniformly stable, with the error going to zero asymptotically.
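    A heavily simplified, rate-based caricature of the local error-feedback idea (weight change proportional to presynaptic activity times the error projected onto the postsynaptic unit) is sketched below; it is not the spiking FOLLOW implementation, and the network sizes and learning rate are arbitrary.

      import numpy as np

      # Rate-based caricature of a local error-feedback weight update:
      # delta_w_ij is proportional to presynaptic activity r_i times the error
      # projected onto unit j.  A simplified sketch, not the spiking FOLLOW scheme.
      rng = np.random.default_rng(2)

      n_pre, n_out = 50, 2
      w = np.zeros((n_out, n_pre))            # learned readout weights
      w_target = rng.normal(0, 1 / np.sqrt(n_pre), (n_out, n_pre))  # unknown mapping
      eta = 0.05                              # learning rate

      for step in range(5000):
          r = rng.normal(0, 1, n_pre)         # presynaptic activity (stand-in for filtered spikes)
          target = w_target @ r               # desired output
          output = w @ r
          error = target - output             # fed back with negative gain in the full model
          w += eta * np.outer(error, r)       # local update: pre-activity x projected error

      print("remaining weight error:", np.linalg.norm(w - w_target))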

  10. Predicting non-linear dynamics by stable local learning in a recurrent spiking neural network

    PubMed Central

    Gerstner, Wulfram

    2017-01-01

    The brain needs to predict how the body reacts to motor commands, but how a network of spiking neurons can learn non-linear body dynamics using local, online and stable learning rules is unclear. Here, we present a supervised learning scheme for the feedforward and recurrent connections in a network of heterogeneous spiking neurons. The error in the output is fed back through fixed random connections with a negative gain, causing the network to follow the desired dynamics. The rule for Feedback-based Online Local Learning Of Weights (FOLLOW) is local in the sense that weight changes depend on the presynaptic activity and the error signal projected onto the postsynaptic neuron. We provide examples of learning linear, non-linear and chaotic dynamics, as well as the dynamics of a two-link arm. Under reasonable approximations, we show, using the Lyapunov method, that FOLLOW learning is uniformly stable, with the error going to zero asymptotically. PMID:29173280

  11. Combining cellular automata and Lattice Boltzmann method to model multiscale avascular tumor growth coupled with nutrient diffusion and immune competition.

    PubMed

    Alemani, Davide; Pappalardo, Francesco; Pennisi, Marzio; Motta, Santo; Brusic, Vladimir

    2012-02-28

In recent decades the Lattice Boltzmann method (LB) has been successfully used to simulate a variety of processes. The LB model describes the microscopic processes occurring at the cellular level and the macroscopic processes occurring at the continuum level with a single function, the probability distribution function. Recently, attempts have been made to couple deterministic approaches with probabilistic cellular automata (probabilistic CA) methods, yielding hybrid methodologies that model the temporal evolution of tumor growth and its three-dimensional spatial evolution. Despite the good results attained by CA-PDE methods, one important issue has not been completely solved: the intrinsic stochastic nature of the interactions at the interface between the cellular (microscopic) and continuum (macroscopic) levels. CA methods can cope with stochastic phenomena because of their probabilistic nature, while PDE methods are fully deterministic. Even if the coupling is mathematically correct, important statistical effects could be missed by the PDE approach. For this reason, to develop and manage a model that takes into account all three levels of complexity (cellular, molecular and continuum), we believe that the PDE should be replaced with a statistical, stochastic model based on the numerical discretization of the Boltzmann equation: the Lattice Boltzmann (LB) method. In this work we introduce a new hybrid method to simulate tumor growth and the immune system by applying a Cellular Automata Lattice Boltzmann (CA-LB) approach. Copyright © 2011 Elsevier B.V. All rights reserved.
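    A minimal sketch of the proposed coupling, assuming a D2Q9 lattice-Boltzmann relaxation step for a nutrient field and a probabilistic CA layer in which tumor cells divide with a nutrient-dependent probability; every rule and parameter below is illustrative rather than taken from the paper.

      import numpy as np

      # Illustrative CA-LB coupling: a D2Q9 lattice-Boltzmann step diffuses a
      # nutrient field, while a probabilistic CA lets tumor cells divide with a
      # nutrient-dependent probability.  Parameters and rules are invented.
      rng = np.random.default_rng(3)
      L, tau, steps = 64, 1.0, 200

      # D2Q9 velocity set and weights
      c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
      w = np.array([4/9] + [1/9]*4 + [1/36]*4)

      rho = np.ones((L, L))                    # nutrient concentration
      f = w[:, None, None] * rho               # distribution functions, shape (9, L, L)
      tumor = np.zeros((L, L), dtype=bool)
      tumor[L//2, L//2] = True                 # seed cell

      for t in range(steps):
          # LB step: BGK collision toward local equilibrium, then streaming
          rho = f.sum(axis=0)
          feq = w[:, None, None] * rho
          f += (feq - f) / tau
          for i, (cx, cy) in enumerate(c):
              f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)

          # consumption by tumor cells and slow replenishment (toy source/sink terms)
          rho = f.sum(axis=0)
          rho = np.clip(rho - 0.02 * tumor, 0.0, None) + 0.002
          f = w[:, None, None] * rho           # re-initialise distributions at new density

          # probabilistic CA step: divide into a neighbouring empty site
          has_tumor_neighbour = np.zeros((L, L), dtype=bool)
          for dx, dy in [(1,0),(-1,0),(0,1),(0,-1)]:
              has_tumor_neighbour |= np.roll(np.roll(tumor, dx, axis=0), dy, axis=1)
          p_divide = 0.1 * np.clip(rho, 0, 1)  # more nutrient -> more likely to grow
          tumor |= (~tumor) & has_tumor_neighbour & (rng.random((L, L)) < p_divide)

      print("tumor cells:", tumor.sum(), " mean nutrient:", rho.mean().round(3))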

  12. RuleMonkey: software for stochastic simulation of rule-based models

    PubMed Central

    2010-01-01

    Background The system-level dynamics of many molecular interactions, particularly protein-protein interactions, can be conveniently represented using reaction rules, which can be specified using model-specification languages, such as the BioNetGen language (BNGL). A set of rules implicitly defines a (bio)chemical reaction network. The reaction network implied by a set of rules is often very large, and as a result, generation of the network implied by rules tends to be computationally expensive. Moreover, the cost of many commonly used methods for simulating network dynamics is a function of network size. Together these factors have limited application of the rule-based modeling approach. Recently, several methods for simulating rule-based models have been developed that avoid the expensive step of network generation. The cost of these "network-free" simulation methods is independent of the number of reactions implied by rules. Software implementing such methods is now needed for the simulation and analysis of rule-based models of biochemical systems. Results Here, we present a software tool called RuleMonkey, which implements a network-free method for simulation of rule-based models that is similar to Gillespie's method. The method is suitable for rule-based models that can be encoded in BNGL, including models with rules that have global application conditions, such as rules for intramolecular association reactions. In addition, the method is rejection free, unlike other network-free methods that introduce null events, i.e., steps in the simulation procedure that do not change the state of the reaction system being simulated. We verify that RuleMonkey produces correct simulation results, and we compare its performance against DYNSTOC, another BNGL-compliant tool for network-free simulation of rule-based models. We also compare RuleMonkey against problem-specific codes implementing network-free simulation methods. Conclusions RuleMonkey enables the simulation of

  13. 4 CFR 22.1 - Applicability of Rules [Rule 1].

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false Applicability of Rules [Rule 1]. 22.1 Section 22.1... ACCOUNTABILITY OFFICE CONTRACT APPEALS BOARD § 22.1 Applicability of Rules [Rule 1]. The Government... all appeals filed with the Board on or after October 1, 2007. ...

  14. On the Universality and Non-Universality of Spiking Neural P Systems With Rules on Synapses.

    PubMed

    Song, Tao; Xu, Jinbang; Pan, Linqiang

    2015-12-01

Spiking neural P systems with rules on synapses are a new variant of spiking neural P systems. In these systems, the neurons contain only spikes, while the spiking/forgetting rules are moved onto the synapses. It was previously shown that such a system with 30 neurons (using extended spiking rules) or with 39 neurons (using standard spiking rules) is Turing universal. In this work, this number is improved to 6. Specifically, we construct a Turing universal spiking neural P system with rules on synapses having 6 neurons, which can generate any set of Turing computable natural numbers. It is also shown that spiking neural P systems with rules on synapses having at most two neurons are not Turing universal: i) such systems having one neuron can characterize the family of finite sets of natural numbers; ii) the family of sets of numbers generated by systems having two neurons is included in the family of semi-linear sets of natural numbers.

  15. Universal Computation and Construction in GoL Cellular Automata

    NASA Astrophysics Data System (ADS)

    Goucher, Adam P.

    This chapter is concerned with the developments of universal computation and construction within Conway's Game of Life (GoL). I will begin by describing the history of the concepts and mechanisms for universal computation and construction in GoL, before explaining how a Universal Computer-Constructor (UCC) would operate in this automaton. Moreover, I shall present the design of a working UCC in the rule. It is both capable of computing any calculation (i.e. it is Turing-complete) and constructing most, if not all, of the constructible configurations within the rule. It cannot construct patterns which have no predecessor; neither can any machine in the rule (for obvious reasons). As such, it is more accurately a general constructor, rather than a universal constructor.
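    For reference, the rule in which these constructions live is B3/S23; a minimal synchronous update (with periodic boundaries, a simplification) can be written as follows.

      import numpy as np

      # One synchronous update of Conway's Game of Life (rule B3/S23), the cellular
      # automaton in which the universal computer-constructor is built.  Toroidal
      # boundary conditions are used here for simplicity.
      def life_step(grid: np.ndarray) -> np.ndarray:
          neighbours = sum(
              np.roll(np.roll(grid, dx, axis=0), dy, axis=1)
              for dx in (-1, 0, 1) for dy in (-1, 0, 1)
              if (dx, dy) != (0, 0)
          )
          born = (grid == 0) & (neighbours == 3)
          survives = (grid == 1) & ((neighbours == 2) | (neighbours == 3))
          return (born | survives).astype(np.uint8)

      # A glider, the simplest spaceship used to carry signals between components.
      grid = np.zeros((16, 16), dtype=np.uint8)
      grid[1, 2] = grid[2, 3] = grid[3, 1] = grid[3, 2] = grid[3, 3] = 1
      for _ in range(4):            # after 4 steps the glider has moved one cell diagonally
          grid = life_step(grid)
      print(grid.sum(), "live cells")   # still 5: the glider persists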

  16. Modelling and analyzing the watershed dynamics using Cellular Automata (CA)-Markov model - A geo-information based approach

    NASA Astrophysics Data System (ADS)

    Behera, Mukunda D.; Borate, Santosh N.; Panda, Sudhindra N.; Behera, Priti R.; Roy, Partha S.

    2012-08-01

Improper practices of land use and land cover (LULC), including deforestation, expansion of agriculture, and infrastructure development, are deteriorating watershed conditions. Here, we have utilized remote sensing and GIS tools to study LULC dynamics using a Cellular Automata (CA)-Markov model and predicted the future LULC scenario, in terms of magnitude and direction, based on the past trend in a hydrological unit, the Choudwar watershed, India. By analyzing the LULC pattern during 1972, 1990, 1999 and 2005 using satellite-derived maps, we observed that biophysical and socio-economic drivers, including residential/industrial development and road-rail and settlement proximity, have influenced the spatial pattern of the watershed LULC, leading to an accretive linear growth of agricultural and settlement areas. The annual rates of increase from 1972 to 2004 in agricultural land and settlement were observed to be 181.96 and 9.89 ha/year, respectively, while the decreases in forest, wetland and marshy land were 91.22, 27.56 and 39.52 ha/year, respectively. The transition probability and transition area matrices were derived using inputs of (i) residential/industrial development and (ii) proximity to the transportation network as the major causes. The predicted LULC scenario for the year 2014, obtained with reasonably good accuracy, would provide useful inputs to LULC planners for effective management of the watershed. The study is a maiden attempt that reveals agricultural expansion as the main driving force for the loss of forest, wetland and marshy land in the Choudwar watershed, with the potential to continue in the future. The forest on lower slopes has been converted to agricultural land, and the same pressure may soon take a toll on forests occurring on higher slopes. Our study utilizes changes over three time periods to better account for the trend in the modelling exercise, and thereby advocates better agricultural practices with additional energy subsidy to arrest further forest loss and LULC alterations.
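    The Markov-chain component of such an analysis amounts to cross-tabulating two classified maps into a transition-probability matrix; the sketch below shows this step on randomly generated maps with made-up class codes.

      import numpy as np

      # Estimate a land-use transition-probability matrix by cross-tabulating two
      # classified maps.  Class codes and the random maps are purely illustrative.
      CLASSES = {0: "forest", 1: "agriculture", 2: "settlement", 3: "wetland"}
      rng = np.random.default_rng(4)

      lulc_t1 = rng.integers(0, 4, size=(100, 100))            # earlier map
      lulc_t2 = lulc_t1.copy()
      change = rng.random(lulc_t1.shape) < 0.15                 # 15% of cells change
      lulc_t2[change] = rng.integers(0, 4, size=change.sum())   # to a random class

      k = len(CLASSES)
      counts = np.zeros((k, k))
      np.add.at(counts, (lulc_t1.ravel(), lulc_t2.ravel()), 1)  # cross-tabulation
      transition_prob = counts / counts.sum(axis=1, keepdims=True)

      np.set_printoptions(precision=2, suppress=True)
      print("row = class at t1, column = class at t2")
      print(transition_prob)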

  17. 18 CFR 385.103 - References to rules (Rule 103).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false References to rules (Rule 103). 385.103 Section 385.103 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... Definitions § 385.103 References to rules (Rule 103). This part cross-references its sections according to...

  18. 18 CFR 385.103 - References to rules (Rule 103).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false References to rules (Rule 103). 385.103 Section 385.103 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... Definitions § 385.103 References to rules (Rule 103). This part cross-references its sections according to...

  19. Multiple-rule bias in the comparison of classification rules

    PubMed Central

    Yousefi, Mohammadmahdi R.; Hua, Jianping; Dougherty, Edward R.

    2011-01-01

    Motivation: There is growing discussion in the bioinformatics community concerning overoptimism of reported results. Two approaches contributing to overoptimism in classification are (i) the reporting of results on datasets for which a proposed classification rule performs well and (ii) the comparison of multiple classification rules on a single dataset that purports to show the advantage of a certain rule. Results: This article provides a careful probabilistic analysis of the second issue and the ‘multiple-rule bias’, resulting from choosing a classification rule having minimum estimated error on the dataset. It quantifies this bias corresponding to estimating the expected true error of the classification rule possessing minimum estimated error and it characterizes the bias from estimating the true comparative advantage of the chosen classification rule relative to the others by the estimated comparative advantage on the dataset. The analysis is applied to both synthetic and real data using a number of classification rules and error estimators. Availability: We have implemented in C code the synthetic data distribution model, classification rules, feature selection routines and error estimation methods. The code for multiple-rule analysis is implemented in MATLAB. The source code is available at http://gsp.tamu.edu/Publications/supplementary/yousefi11a/. Supplementary simulation results are also included. Contact: edward@ece.tamu.edu Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:21546390
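    The selection effect itself is easy to reproduce in a toy Monte Carlo experiment: if several rules share the same true error but their estimated errors are noisy, reporting the minimum estimated error is optimistically biased. The numbers below are arbitrary.

      import numpy as np

      # Toy illustration of multiple-rule bias: several classification rules have
      # the same true error, but their estimated errors are noisy; always reporting
      # the rule with the minimum estimated error is optimistically biased.
      rng = np.random.default_rng(5)

      true_error = 0.25          # identical true error for every rule (by construction)
      n_rules = 10               # number of competing rules compared on one dataset
      n_test = 100               # size of the holdout used to estimate each error
      n_repeats = 20000

      estimates = rng.binomial(n_test, true_error, size=(n_repeats, n_rules)) / n_test
      reported = estimates.min(axis=1)        # error of the "winning" rule, as reported

      print(f"true error of every rule:        {true_error:.3f}")
      print(f"mean estimated error (one rule): {estimates[:, 0].mean():.3f}")
      print(f"mean reported minimum error:     {reported.mean():.3f}  <- optimistic bias")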

  20. Computational Nonlinear Morphology with Emphasis on Semitic Languages. Studies in Natural Language Processing.

    ERIC Educational Resources Information Center

    Kiraz, George Anton

    This book presents a tractable computational model that can cope with complex morphological operations, especially in Semitic languages, and less complex morphological systems present in Western languages. It outlines a new generalized regular rewrite rule system that uses multiple finite-state automata to cater to root-and-pattern morphology,…

  1. An Optimized Three-Level Design of Decoder Based on Nanoscale Quantum-Dot Cellular Automata

    NASA Astrophysics Data System (ADS)

    Seyedi, Saeid; Navimipour, Nima Jafari

    2018-03-01

Quantum-dot Cellular Automata (QCA) has been considered a potential successor to Complementary Metal-Oxide-Semiconductor (CMOS) technology because of its inherent advantages. Many QCA-based logic circuits with smaller feature size, improved operating frequency, and lower power consumption than CMOS have been proposed. This technology operates on the basis of electron interactions within quantum dots. Given the importance of an optimized decoder in any digital circuit, in this paper we design, implement and simulate a new 2-to-4 decoder based on QCA with low delay, area, and complexity. The logic functionality of the 2-to-4 decoder is verified using the QCADesigner tool. The results show that the proposed QCA-based decoder performs well in terms of cell count, covered area, and time delay. Due to its lower clock-pulse frequency, the proposed 2-to-4 decoder is helpful for building high-performance QCA-based sequential digital circuits.

  2. Run charts revisited: a simulation study of run chart rules for detection of non-random variation in health care processes.

    PubMed

    Anhøj, Jacob; Olesen, Anne Vingaard

    2014-01-01

    A run chart is a line graph of a measure plotted over time with the median as a horizontal line. The main purpose of the run chart is to identify process improvement or degradation, which may be detected by statistical tests for non-random patterns in the data sequence. We studied the sensitivity to shifts and linear drifts in simulated processes using the shift, crossings and trend rules for detecting non-random variation in run charts. The shift and crossings rules are effective in detecting shifts and drifts in process centre over time while keeping the false signal rate constant around 5% and independent of the number of data points in the chart. The trend rule is virtually useless for detection of linear drift over time, the purpose it was intended for.
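    A sketch of the shift and crossings tests is given below. The signal limits used (a longest-run limit of about round(log2(n)) + 3 and a binomial-style lower bound on crossings) follow commonly cited approximations and should be treated as assumptions rather than the exact limits of the paper.

      import math
      import numpy as np

      # Shift and crossings tests for a run chart; the limits below are common
      # approximations, not necessarily those used in the study.
      def run_chart_signals(values):
          values = np.asarray(values, dtype=float)
          median = np.median(values)
          useful = values[values != median]          # points on the median are ignored
          above = useful > median
          n = len(useful)

          # longest run of consecutive points on the same side of the median
          longest_run, current = 1, 1
          for prev, cur in zip(above[:-1], above[1:]):
              current = current + 1 if cur == prev else 1
              longest_run = max(longest_run, current)

          crossings = int(np.sum(above[:-1] != above[1:]))

          run_limit = round(math.log2(n)) + 3                            # assumed upper limit
          cross_limit = math.floor((n - 1) / 2 - 0.8 * math.sqrt(n - 1)) # rough lower bound

          return {
              "longest_run": longest_run, "shift_signal": longest_run >= run_limit,
              "crossings": crossings, "crossings_signal": crossings < cross_limit,
          }

      # A process with a sustained upward shift in its second half should signal.
      rng = np.random.default_rng(6)
      data = np.concatenate([rng.normal(10, 1, 12), rng.normal(12, 1, 12)])
      print(run_chart_signals(data))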

  3. Structural Dynamic Analyses And Test Predictions For Spacecraft Structures With Non-Linearities

    NASA Astrophysics Data System (ADS)

    Vergniaud, Jean-Baptiste; Soula, Laurent; Newerla, Alfred

    2012-07-01

The overall objective of the mechanical development and verification process is to ensure that the spacecraft structure is able to sustain the mechanical environments encountered during launch. In general, spacecraft structures are a priori assumed to behave linearly, i.e. the responses to a static load or dynamic excitation will increase or decrease proportionally to the amplitude of the load or excitation induced. However, past experience has shown that various non-linearities might exist in spacecraft structures, and the consequences of their dynamic effects can significantly affect the development and verification process. Current processes are mainly adapted to linear spacecraft structural behaviour. No clear rules exist for dealing with major structural non-linearities; they are handled outside the process by individual analysis and margin policy, and by post-test analyses to justify the CLA coverage. Non-linearities primarily affect the current spacecraft development and verification process in two respects. Prediction of flight loads by launcher/satellite coupled loads analyses (CLA): only linear satellite models are delivered for performing CLA, and no well-established rules exist on how to properly linearize a model when non-linearities are present. The potential impact of the linearization on the results of the CLA has not yet been properly analyzed, so it is difficult to assess whether CLA results will cover actual flight levels. Management of satellite verification tests: the CLA results generated with a linear satellite FEM are assumed to be flight representative. If internal non-linearities are present in the tested satellite, then it may be difficult to determine which input level must be passed to cover satellite internal loads. The non-linear behaviour can also disturb the shaker control, putting the satellite at risk by potentially imposing excessively high levels. This paper presents the results of a test campaign performed in

  4. A cellular automata based FPGA realization of a new metaheuristic bat-inspired algorithm

    NASA Astrophysics Data System (ADS)

    Progias, Pavlos; Amanatiadis, Angelos A.; Spataro, William; Trunfio, Giuseppe A.; Sirakoulis, Georgios Ch.

    2016-10-01

Optimization algorithms are often inspired by processes occurring in nature, such as animal behavioral patterns. The main concern with implementing such algorithms in software is the large amount of processing power they require. In contrast to software, which can only perform calculations serially, a hardware implementation that exploits the inherent parallelism of single-purpose processors can be much more efficient in both speed and energy consumption. Furthermore, the use of Cellular Automata (CA) in such an implementation is efficient both as a model for natural processes and as a computational paradigm that maps well onto hardware. In this paper, we propose a VHDL implementation of a metaheuristic algorithm inspired by the echolocation behavior of bats. More specifically, the CA model is inspired by a metaheuristic algorithm proposed earlier in the literature, which can be considered at least as efficient as other existing optimization algorithms. The operation of the FPGA implementation of our algorithm is explained in full detail, and results of our simulations are also presented.

  5. Molecular quantum cellular automata cell design trade-offs: latching vs. power dissipation.

    PubMed

    Rahimi, Ehsan; Reimers, Jeffrey R

    2018-06-20

    The use of molecules to enact quantum cellular automata (QCA) cells has been proposed as a new way for performing electronic logic operations at sub-nm dimensions. A key question that arises concerns whether chemical or physical processes are to be exploited. The use of chemical reactions allows the state of a switch element to be latched in molecular form, making the output of a cell independent of its inputs, but costs energy to do the reaction. Alternatively, if purely electronic polarization is manipulated then no internal latching occurs, but no power is dissipated provided the fields from the inputs change slowly compared to the molecular response times. How these scenarios pan out is discussed by considering calculated properties of the 1,4-diallylbutane cation, a species often used as a paradigm for molecular electronic switching. Utilized are results from different calculation approaches that depict the ion either as a charge-localized mixed-valence compound functioning as a bistable switch, or else as an extremely polarizable molecule with a delocalized electronic structure. Practical schemes for using molecular cells in QCA and other devices emerge.

  6. 76 FR 24376 - Commission's Ex Parte Rules and Other Procedural Rules

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-02

    ...'s Ex Parte Rules and Other Procedural Rules AGENCY: Federal Communications Commission. ACTION: Final rule. SUMMARY: In this document the Commission revises certain ex parte and organizational rules. This document amends and reforms the Commission's rules on ex parte presentations made in the course of...

  7. An attempt of modelling debris flows characterised by strong inertial effects through Cellular Automata

    NASA Astrophysics Data System (ADS)

    Iovine, G.; D'Ambrosio, D.

    2003-04-01

Cellular Automata models represent a valid method for the simulation of complex phenomena when the latter can be described in "a-centric" terms - i.e. through local interactions within a discrete time-space. In particular, flow-type landslides (such as debris flows) can be viewed as a-centric dynamical systems. SCIDDICA S4b, the latest release of a family of two-dimensional hexagonal Cellular Automata models, has recently been developed for simulating debris flows characterised by strong inertial effects. It has been derived by progressively enriching an initial simplified CA model, originally developed for simulating very simple cases of slow-moving flow-type landslides. In S4b, by applying an empirical strategy, the inertial character of the flowing mass has been translated into CA terms. In the transition function of the model, the distribution of landslide debris among the cells is computed by considering the momentum of the debris moving among the cells of the neighbourhood, privileging the flow direction. By properly setting the value of one of the global parameters of the model (the "inertial factor"), the mechanism of distribution of the landslide debris among the cells can be tuned to emphasise the inertial effects, according to the energy of the flowing mass. Moreover, the high complexity of both the model and the phenomena to be simulated (e.g. debris flows characterised by severe erosion along their path and by strong inertial effects) suggested employing an automated evaluation technique for determining the best set of global parameters. Accordingly, the calibration of the model has been performed through Genetic Algorithms, by considering several real cases of study selected among the population of landslides triggered in Campania (Southern Italy) in May 1998 and December 1999. The results obtained are satisfying: errors computed by comparing the simulations with the map of the real

  8. Exploration of SWRL Rule Bases through Visualization, Paraphrasing, and Categorization of Rules

    NASA Astrophysics Data System (ADS)

    Hassanpour, Saeed; O'Connor, Martin J.; Das, Amar K.

    Rule bases are increasingly being used as repositories of knowledge content on the Semantic Web. As the size and complexity of these rule bases increases, developers and end users need methods of rule abstraction to facilitate rule management. In this paper, we describe a rule abstraction method for Semantic Web Rule Language (SWRL) rules that is based on lexical analysis and a set of heuristics. Our method results in a tree data structure that we exploit in creating techniques to visualize, paraphrase, and categorize SWRL rules. We evaluate our approach by applying it to several biomedical ontologies that contain SWRL rules, and show how the results reveal rule patterns within the rule base. We have implemented our method as a plug-in tool for Protégé-OWL, the most widely used ontology modeling software for the Semantic Web. Our tool can allow users to rapidly explore content and patterns in SWRL rule bases, enabling their acquisition and management.

  9. [Woolly hair nevus associated with an ipsilateral linear epidermal nevus].

    PubMed

    Martín-González, T; del Boz-González, J; Vera-Casaño, A

    2007-04-01

    We report a 4-year-old boy with two areas of woolly hair in the right parietotemporal region and a linear epidermal nevus in the areas of woolly hair as well as in the ipsilateral hemiface and chin. Evaluation by scanning electron microscopy showed woolly hair with oval transverse section and longitudinal groove. A complete examination ruled out associated anomalies.

  10. Australian road rules

    DOT National Transportation Integrated Search

    2009-02-01

    *These are national-level rules. Australian Road Rules - 2009 Version, Part 18, Division 1, Rule 300 "Use of Mobile Phones" describes restrictions of mobile phone use while driving. The rule basically states that drivers cannot make or receive calls ...

  11. Rules and Self-Rules: Effects of Variation upon Behavioral Sensitivity to Change

    ERIC Educational Resources Information Center

    Baumann, Ana A.; Abreu-Rodrigues, Josele; da Silva Souza, Alessandra

    2009-01-01

    Four experiments compared the effects of self-rules and rules, and varied and specific schedules of reinforcement. Participants were first exposed to either several schedules (varied groups) or to one schedule (specific groups) and either were asked to generate rules (self-rule groups), were provided rules (rule groups), or were not asked nor…

  12. Visualizing the Chain Rule (for Functions over R and C) and More

    ERIC Educational Resources Information Center

    Kreminski, Rick

    2009-01-01

    A visual approach to understanding the chain rule and related derivative formulae, for functions from R to R and from C to C, is presented. This apparently novel approach has been successfully used with several audiences: students first studying calculus, students with some background in linear algebra, students beginning study of functions of a…

  13. The Meyer-Neldel rule and the statistical shift of the Fermi level in amorphous semiconductors

    NASA Astrophysics Data System (ADS)

    Kikuchi, Minoru

    1988-11-01

The statistical model is used to study the origin of the Meyer-Neldel (MN) rule [σ0∝exp(AEσ)] in a tetrahedral amorphous system. It is shown that a deep minimum in the gap density-of-states spectrum can lead to a linear relation between the Fermi energy F(T) and its derivative dF/d(kT), as required by the rule. An expression is derived which relates the constant A in the rule to the gap density-of-states spectrum. The dispersion ranges of σ0 and Eσ are found to be related to the constant A. Model calculations yield a magnitude of A and a wide dispersion of σ0 and Eσ in fair agreement with the experimental observations. A discussion is given of the extent to which the MN rule depends on the gap density-of-states spectrum.
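    For reference, the rule in its standard form (not specific to this paper's model) can be written as:

      % Standard statement of the Meyer-Neldel (compensation) rule for the
      % conductivity prefactor; E_MN = 1/A is often called the Meyer-Neldel energy.
      \begin{align}
        \sigma(T) &= \sigma_0 \exp\!\left(-\frac{E_\sigma}{k T}\right), &
        \sigma_0 &= \sigma_{00} \exp\!\left(A E_\sigma\right)
                  = \sigma_{00} \exp\!\left(\frac{E_\sigma}{E_{\mathrm{MN}}}\right),
      \end{align}
      % so that ln(sigma_0) is linear in the activation energy E_sigma.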

  14. Prediction of linear B-cell epitopes of hepatitis C virus for vaccine development

    PubMed Central

    2015-01-01

Background High genetic heterogeneity in the hepatitis C virus (HCV) is the major challenge for the development of an effective vaccine. Existing studies on developing HCV vaccines have mainly focused on the T-cell immune response. However, identification of linear B-cell epitopes that can stimulate the B-cell response is one of the major tasks of peptide-based vaccine development. Owing to the variability in B-cell epitope length, the prediction of B-cell epitopes is much more complex than that of T-cell epitopes. Furthermore, the motifs of linear B-cell epitopes in different pathogens are quite different (e.g. HCV and hepatitis B virus). To cope with this challenge, this work proposes an HCV-customized sequence-based prediction method to identify B-cell epitopes of HCV. Results This work establishes an experimentally verified HCV B-cell response dataset consisting of 774 linear B-cell epitopes and 774 non-B-cell epitopes from the Immune Epitope Database. An interpretable rule mining system of B-cell epitopes (IRMS-BE) is proposed to select informative physicochemical properties (PCPs) and then extract several if-then rule-based pieces of knowledge for identifying B-cell epitopes. A web server, Bcell-HCV, was implemented using an SVM with the 34 informative PCPs; it achieved a training accuracy of 79.7% and a test accuracy of 70.7%, better than SVM-based methods for identifying B-cell epitopes of HCV and than two general-purpose methods. This work performs an advanced analysis of the 34 informative properties, and the results indicate that the most effective property is the alpha-helix structure of epitopes, which influences the connection between host cells and the E2 proteins of HCV. Furthermore, 12 interpretable rules are acquired from the top-five PCPs and achieve a sensitivity of 75.6% and a specificity of 71.3%. Finally, a conserved promising vaccine candidate, PDREMVLYQE, is identified for inclusion in a vaccine against HCV. Conclusions This work

  15. Robustness of a cellular automata model for the HIV infection

    NASA Astrophysics Data System (ADS)

    Figueirêdo, P. H.; Coutinho, S.; Zorzenon dos Santos, R. M.

    2008-11-01

An investigation was conducted to study the robustness of the results obtained from the cellular automata model which describes the spread of HIV infection within lymphoid tissues [R.M. Zorzenon dos Santos, S. Coutinho, Phys. Rev. Lett. 87 (2001) 168102]. The analysis focused on the dynamic behavior of the model when defined on lattices with different symmetries and dimensionalities. The results illustrated that the three-phase dynamics of the planar models suffered minor changes with respect to lattice symmetry variations and, while differences were observed regarding dimensionality changes, the qualitative behavior was preserved. A further investigation was conducted into the primary infection and the sensitivity of the latency period to variations of the model's stochastic parameters over a wide range of values. The variables characterizing the primary infection and the latency period exhibited power-law behavior when the stochastic parameters varied over a few orders of magnitude. The power-law exponents were approximately the same when the lattice symmetry varied, but there was a significant variation when the dimensionality changed from two to three. The dynamics of the three-dimensional model was also shown to be insensitive to variations of the deterministic parameters related to cell resistance to the infection and to the time lag necessary to mount the specific immune response to HIV variants. The robustness of the model demonstrated in this work reinforces the view that its basic hypotheses are consistent with the three-stage dynamics of HIV infection observed in patients.

  16. Modeling of the competition life cycle using the software complex of cellular automata PyCAlab

    NASA Astrophysics Data System (ADS)

    Berg, D. B.; Beklemishev, K. A.; Medvedev, A. N.; Medvedeva, M. A.

    2015-11-01

The aim of this work is to develop a numerical model of the competition life cycle on the basis of the cellular automata software package PyCAlab. The model is based on the general patterns of growth of various systems in resource-limited settings. Examples show that the transition from unlimited growth of market agents to the stage of competitive growth takes quite a long time and may be characterized as monotonic. During this period two main strategies of competitive selection coexist: 1) capture of maximum market space at any reasonable cost; 2) saving by reducing costs. The results lead to the conclusion that the competitive strategies of companies must combine the two types of behavior mentioned above, and this issue needs to be given adequate attention in the academic literature on management. The numerical model created here may be used for market research when developing strategies for the promotion of new goods and services.

  17. [A Cellular Automata Model for a Community Comprising Two Plant Species of Different Growth Forms].

    PubMed

    Frolov, P V; Zubkova, E V; Komarov, A S

    2015-01-01

    A cellular automata computer model for the interactions between two plant species of different growth forms--the lime hairgrass Deschampsia caespitosa (L.) P. Beauv., a sod cereal, and the moneywort Lysimachia nummularia L., a ground creeping perennial herb--is considered. Computer experiments on the self-maintenance of the populations of each species against the background of a gradual increase in the share of randomly eliminated individuals, coexistence of the populations of two species, and the effect of the phytogenous field have been conducted. As has been shown, all the studied factors determine the number of individuals and self-sustainability of the simulated populations by the degree of their impact. The limits of action have been determined for individual factors; within these limits, the specific features in plant reproduction and dispersal provide sustainable coexistence of the simulated populations. It has been demonstrated that the constructed model allows for studying the long-term developmental dynamics of the plants belonging to the selected growth forms.

  18. Synthesis of a Neutral Mixed-Valence Diferrocenyl Carborane for Molecular Quantum-Dot Cellular Automata Applications.

    PubMed

    Christie, John A; Forrest, Ryan P; Corcelli, Steven A; Wasio, Natalie A; Quardokus, Rebecca C; Brown, Ryan; Kandel, S Alex; Lu, Yuhui; Lent, Craig S; Henderson, Kenneth W

    2015-12-14

The preparation of 7-Fc(+)-8-Fc-7,8-nido-[C2B9H10](-) (Fc(+)FcC2B9(-)) demonstrates the successful incorporation of a carborane cage as an internal counteranion bridging between ferrocene and ferrocenium units. This neutral mixed-valence Fe(II)/Fe(III) complex overcomes the proximal electronic bias imposed by external counterions, a practical limitation in the use of molecular switches. A combination of UV/Vis-NIR spectroscopic and TD-DFT computational studies indicates that electron transfer within Fc(+)FcC2B9(-) is achieved through a bridge-mediated mechanism. This electronic framework therefore provides the possibility of an all-neutral null state, a key requirement for the implementation of quantum-dot cellular automata (QCA) molecular computing. The adhesion, ordering, and characterization of Fc(+)FcC2B9(-) on Au(111) have been observed by scanning tunneling microscopy. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Computer simulation of a cellular automata model for the immune response in a retrovirus system

    NASA Astrophysics Data System (ADS)

    Pandey, R. B.

    1989-02-01

Immune response in a retrovirus system is modeled by a network of three binary cell elements that takes into account some of the main functional features of T4 cells, T8 cells, and viruses. Two different intercell interactions are introduced, one of which leads to three fixed points while the other yields bistable fixed points oscillating between a healthy state and a sick state in a mean-field treatment. The evolution of these cells is studied for quenched and annealed random interactions on a simple cubic lattice with nearest-neighbor interactions using inhomogeneous cellular automata. The populations of T4 cells and viral cells oscillate together with damping (with constant amplitude) for the annealed (quenched) interaction as the mixing probability B is increased from zero to a characteristic value B_ca (B_cq). For higher B, the average number of T4 cells increases while that of the virally infected cells decreases monotonically with increasing B, suggesting a phase transition at B_ca (B_cq).

  20. Thumb rule of visual angle: a new confirmation.

    PubMed

    Groot, C; Ortega, F; Beltran, F S

    1994-02-01

    The classical thumb rule of visual angle was reexamined. Hence, the visual angle was measured as a function of a thumb's width and the distance between eye and thumb. The measurement of a thumb's width when held at arm's length was taken on 67 second-year students of psychology. The visual angle was about 2 degrees as R. P. O'Shea confirmed in 1991. Also, we confirmed a linear relationship between the size of a thumb's width at arm's length and the visual angle.
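    The underlying geometry is the small-angle relation between object width and viewing distance; with a thumb width of about 2 cm held at roughly 57 cm (illustrative values, not the study's measurements) the angle comes out near 2 degrees:

      % Visual angle subtended by a thumb of width w held at distance d from the eye.
      % With w ~ 2 cm and d ~ 57 cm (illustrative values), theta ~ 2 degrees.
      \begin{equation}
        \theta \;=\; 2\arctan\!\left(\frac{w}{2d}\right)
               \;\approx\; \frac{w}{d}\ \text{rad}
               \;=\; \frac{2\ \text{cm}}{57\ \text{cm}} \cdot \frac{180^\circ}{\pi}
               \;\approx\; 2^\circ .
      \end{equation}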

  1. Multiple origins of linear dunes on Earth and Titan

    USGS Publications Warehouse

    Rubin, David M.; Hesp, Patrick A.

    2009-01-01

    Dunes with relatively long and parallel crests are classified as linear dunes. On Earth, they form in at least two environmental settings: where winds of bimodal direction blow across loose sand, and also where single-direction winds blow over sediment that is locally stabilized, be it through vegetation, sediment cohesion or topographic shelter from the winds. Linear dunes have also been identified on Titan, where they are thought to form in loose sand. Here we present evidence that in the Qaidam Basin, China, linear dunes are found downwind of transverse dunes owing to higher cohesiveness in the downwind sediments, which contain larger amounts of salt and mud. We also present a compilation of other settings where sediment stabilization has been reported to produce linear dunes. We suggest that in this dune-forming process, loose sediment accumulates on the dunes and is stabilized; the stable dune then functions as a topographic shelter, which induces the deposition of sediments downwind. We conclude that a model in which Titan's dunes formed similarly in cohesive sediments cannot be ruled out by the existing data.

  2. Modified Kramers-Kronig relations and sum rules for meromorphic total refractive index

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peiponen, Kai-Erik; Saarinen, Jarkko J.; Vartiainen, Erik M.

    2003-08-01

    Modified Kramers-Kronig relations and corresponding sum rules are shown to hold for the total refractive index that can be presented as a sum of complex linear and nonlinear refractive indices, respectively. It is suggested that a self-action process, involving the degenerate third-order nonlinear susceptibility, can yield a negative total refractive index at some spectral range.

  3. High Dimensional Classification Using Features Annealed Independence Rules.

    PubMed

    Fan, Jianqing; Fan, Yingying

    2008-01-01

Classification using high-dimensional features arises frequently in many contemporary statistical studies such as tumor classification using microarray or other high-throughput data. The impact of dimensionality on classification remains poorly understood. In a seminal paper, Bickel and Levina (2004) show that the Fisher discriminant performs poorly due to diverging spectra, and they propose to use the independence rule to overcome the problem. We first demonstrate that even for the independence classification rule, classification using all the features can be as bad as random guessing due to noise accumulation in estimating population centroids in high-dimensional feature space. In fact, we demonstrate further that almost all linear discriminants can perform as badly as random guessing. Thus, it is of paramount importance to select a subset of important features for high-dimensional classification, resulting in Features Annealed Independence Rules (FAIR). The conditions under which all the important features can be selected by the two-sample t-statistic are established. The choice of the optimal number of features, or equivalently the threshold value of the test statistic, is proposed based on an upper bound on the classification error. Simulation studies and real data analysis support our theoretical results and demonstrate convincingly the advantage of our new classification procedure.
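    A minimal sketch of the FAIR idea, assuming a fixed number of retained features rather than the error-bound-based choice described above: rank features by two-sample t-statistics, keep the strongest, and classify with a simple independence-style rule.

      import numpy as np

      # Feature annealing sketch: select features by two-sample t-statistics and
      # classify with a simple independence-style rule (equal priors, unit scale).
      # The data and the fixed number of retained features are illustrative.
      rng = np.random.default_rng(7)
      p, n_per_class, n_informative = 1000, 40, 10

      shift = np.zeros(p)
      shift[:n_informative] = 1.0                      # only a few features differ
      X0 = rng.normal(0, 1, (n_per_class, p))          # class 0 training data
      X1 = rng.normal(shift, 1, (n_per_class, p))      # class 1 training data

      # two-sample t-statistics per feature
      t = (X1.mean(0) - X0.mean(0)) / np.sqrt(X1.var(0, ddof=1) / n_per_class +
                                              X0.var(0, ddof=1) / n_per_class)
      keep = np.argsort(-np.abs(t))[:20]               # retain the 20 strongest features

      def classify(x):
          """Nearest-centroid decision on the selected features only."""
          d0 = np.sum((x[keep] - X0.mean(0)[keep]) ** 2)
          d1 = np.sum((x[keep] - X1.mean(0)[keep]) ** 2)
          return int(d1 < d0)

      # evaluate on fresh test data
      X0_test = rng.normal(0, 1, (500, p))
      X1_test = rng.normal(shift, 1, (500, p))
      errors = np.mean([classify(x) != 0 for x in X0_test] +
                       [classify(x) != 1 for x in X1_test])
      print(f"test error with feature annealing: {errors:.3f}")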

  4. Integration of logistic regression, Markov chain and cellular automata models to simulate urban expansion

    NASA Astrophysics Data System (ADS)

    Jokar Arsanjani, Jamal; Helbich, Marco; Kainz, Wolfgang; Darvishi Boloorani, Ali

    2013-04-01

This research analyses suburban expansion in the metropolitan area of Tehran, Iran. A hybrid model consisting of a logistic regression model, a Markov chain (MC), and cellular automata (CA) was designed to improve the performance of the standard logistic regression model. Environmental and socio-economic variables dealing with urban sprawl were operationalised to create a probability surface of spatiotemporal states of built-up land use for the years 2006, 2016, and 2026. For validation, the model was evaluated by means of relative operating characteristic values for different sets of variables. The approach was calibrated for 2006 by cross-comparing actual and simulated land use maps. The achieved outcomes represent a match of 89% between the simulated and actual maps of 2006, which was sufficient to accept the calibration. Thereafter, the calibrated hybrid approach was implemented for the forthcoming years. Finally, future land use maps for 2016 and 2026 were predicted by means of this hybrid approach. The simulated maps illustrate a new wave of suburban development in the vicinity of Tehran at the western border of the metropolis during the next decades.
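    A hedged sketch of the hybrid logic: a logistic model supplies a suitability surface, a Markov-style demand fixes how many cells convert per period, and a CA neighbourhood term favours growth next to existing built-up cells. Coefficients, drivers and the demand figure are invented.

      import numpy as np

      # Hybrid logistic / Markov-demand / CA-neighbourhood sketch for urban growth.
      # All coefficients, drivers and the per-period demand are invented.
      rng = np.random.default_rng(8)
      L = 80

      dist_to_city = rng.random((L, L))          # hypothetical normalized drivers
      slope = rng.random((L, L))
      built = rng.random((L, L)) < 0.05          # initial built-up cells

      beta0, beta_dist, beta_slope, beta_neigh = -2.0, -3.0, -1.5, 2.5   # assumed fit

      def neighbour_fraction(mask):
          total = sum(np.roll(np.roll(mask, dx, 0), dy, 1)
                      for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                      if (dx, dy) != (0, 0))
          return total / 8.0

      demand_per_step = 150          # cells converted per period (Markov-chain output)
      for step in range(3):
          logit = (beta0 + beta_dist * dist_to_city + beta_slope * slope
                   + beta_neigh * neighbour_fraction(built))
          prob = 1.0 / (1.0 + np.exp(-logit))
          prob[built] = 0.0                           # already built-up
          # convert the 'demand_per_step' most probable cells this period
          idx = np.argpartition(prob.ravel(), -demand_per_step)[-demand_per_step:]
          built.ravel()[idx] = True
          print(f"period {step + 1}: built-up cells = {built.sum()}")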

  5. Impact of time delay on the dynamics of SEIR epidemic model using cellular automata

    NASA Astrophysics Data System (ADS)

    Sharma, Natasha; Gupta, Arvind Kumar

    2017-04-01

The delay of an infectious disease is significant when aiming to predict its strength and spreading patterns. In this paper the SEIR (susceptible-exposed-infected-recovered) epidemic spread with time delay is analyzed through a two-dimensional cellular automata model. The time delay, corresponding to the infectious span, predominantly accounts for death during the latency period in the course of infection. The advancement of the whole system is described by an SEIR transition function complemented with crucial factors such as inhomogeneous population distribution, births, and disease-independent mortality. Moreover, to reflect more realistic population dynamics, some stochastic parameters such as population movement and local-level connections are also considered. The existence and stability of the disease-free equilibrium are investigated. Two prime behavioral patterns of the disease dynamics are found, depending on the delay. The critical value of the delay, beyond which there are notable variations in the spread patterns, is computed. The influence of important parameters affecting the disease dynamics on the basic reproduction number is also examined. The results show that delay plays an affirmative role in controlling disease progression in an infected host.
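    A generic two-dimensional SEIR cellular automaton with an explicit latency delay might look like the sketch below; the states, probabilities and delay value are illustrative and not the parameters of the paper.

      import numpy as np

      # Generic 2D SEIR cellular automaton with an explicit latency (delay) counter:
      # exposed cells stay in E for 'latency' steps before becoming infectious.
      S, E, I, R = 0, 1, 2, 3
      rng = np.random.default_rng(9)
      L, steps, latency = 100, 60, 5
      p_infect, p_recover = 0.25, 0.1

      state = np.full((L, L), S, dtype=np.int8)
      timer = np.zeros((L, L), dtype=np.int16)        # time spent in the E state
      state[L//2, L//2] = I                           # index case

      for t in range(steps):
          infectious_nbrs = sum(np.roll(np.roll(state == I, dx, 0), dy, 1)
                                for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                                if (dx, dy) != (0, 0))
          # S -> E with probability depending on infectious neighbours
          p_exposure = 1 - (1 - p_infect) ** infectious_nbrs
          new_exposed = (state == S) & (rng.random((L, L)) < p_exposure)
          # E -> I only after the latency delay has elapsed
          timer[state == E] += 1
          new_infectious = (state == E) & (timer >= latency)
          # I -> R
          new_recovered = (state == I) & (rng.random((L, L)) < p_recover)

          state[new_exposed], timer[new_exposed] = E, 0
          state[new_infectious] = I
          state[new_recovered] = R

      print("final counts S/E/I/R:", [int(np.sum(state == s)) for s in (S, E, I, R)])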

  6. An extended cost potential field cellular automata model considering behavior variation of pedestrian flow

    NASA Astrophysics Data System (ADS)

    Guo, Fang; Li, Xingli; Kuang, Hua; Bai, Yang; Zhou, Huaguo

    2016-11-01

The original cost potential field cellular automata model describing normal pedestrian evacuation is extended to study more general evacuation scenarios. Based on the cost potential field function, and considering the psychological characteristics of a crowd under emergency conditions, a quantitative formula for behavior variation is introduced to reflect behavioral changes caused by psychological tension. Numerical simulations are performed to investigate the effects of the magnitude of behavior variation, the proportions of pedestrians with different behavior variation, and other factors on the evacuation efficiency and process in a room. The spatiotemporal dynamic characteristics of the evacuation process are also discussed. The results show that, compared with normal evacuation, behavior variation under an emergency does not necessarily lead to a decrease in evacuation efficiency. At low density, an increase in behavior variation can improve the evacuation efficiency, while at high density the evacuation efficiency drops significantly with increasing amplitude of the behavior variation.

  7. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    NASA Astrophysics Data System (ADS)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

    Interoperability is a prerequisite for partners involved in performing collaboration. As a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach that allows specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on application of the model checker UPPAAL in order to verify interoperability requirements for the given collaborative process model. At first, this step entails translating the collaborative process model from BPMN into a UPPAAL modelling language called 'Network of Timed Automata'. Second, it becomes necessary to formalise interoperability requirements into properties with the dedicated UPPAAL language, i.e. the temporal logic TCTL.

  8. Dynamics of the HIV infection under antiretroviral therapy: A cellular automata approach

    NASA Astrophysics Data System (ADS)

    González, Ramón E. R.; Coutinho, Sérgio; Zorzenon dos Santos, Rita Maria; de Figueirêdo, Pedro Hugo

    2013-10-01

    The dynamics of human immunodeficiency virus infection under antiretroviral therapy is investigated using a cellular automata model where the effectiveness of each drug is self-adjusted by the concentration of CD4+ T infected cells present at each time step. The effectiveness of the drugs and the infected cell concentration at the beginning of treatment are the control parameters of the cell population’s dynamics during therapy. The model allows describing processes of mono and combined therapies. The dynamics that emerges from this model when considering combined antiretroviral therapies reproduces with fair qualitative agreement the phases and different time scales of the process. As observed in clinical data, the results reproduce the significant decrease in the population of infected cells and a concomitant increase of the population of healthy cells in a short timescale (weeks) after the initiation of treatment. Over long time scales, early treatment with potent drugs may lead to undetectable levels of infection. For late treatment or treatments starting with a low density of CD4+ T healthy cells it was observed that the treatment may lead to a steady state in which the T cell counts are above the threshold associated with the onset of AIDS. The results obtained are validated through comparison to available clinical trial data.

  9. Structural classification of proteins using texture descriptors extracted from the cellular automata image.

    PubMed

    Kavianpour, Hamidreza; Vasighi, Mahdi

    2017-02-01

Nowadays, knowledge about the cellular attributes of proteins plays an important role in pharmacy, medical science and molecular biology. These attributes are closely correlated with the function and three-dimensional structure of proteins. Knowledge of protein structural class is used by various methods for better understanding protein functionality and folding patterns. Computational methods and intelligent systems can play an important role in performing structural classification of proteins. Most protein sequences are stored in databanks as character strings, and a numerical representation is essential for applying machine learning methods. In this work, a binary representation of protein sequences is introduced based on reduced amino acid alphabets according to the surrounding hydrophobicity index. Many important features hidden in these long binary sequences can be clearly displayed through their cellular automata images. The features extracted from these images are used to build a classification model with a support vector machine. Compared with previous studies on several benchmark datasets, the promising classification rates obtained by tenfold cross-validation imply that the current approach can help reveal some inherent features deeply hidden in protein sequences and improve the quality of predicting protein structural class.
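    One way to picture the pipeline is sketched below: binary-encode a sequence with a two-letter hydrophobicity alphabet, evolve an elementary cellular automaton to obtain a space-time "image", and compute simple texture statistics. The hydrophobic residue set and the choice of rule 110 are assumptions for illustration, not necessarily those used in the paper.

      import numpy as np

      # Binary-encode a protein sequence, evolve an elementary CA to get a
      # space-time "image", and extract simple texture statistics.  The residue
      # split and rule 110 are illustrative assumptions.
      HYDROPHOBIC = set("AVLIMFWC")           # assumed two-class reduced alphabet

      def encode(sequence: str) -> np.ndarray:
          return np.array([1 if aa in HYDROPHOBIC else 0 for aa in sequence], np.uint8)

      def eca_image(row: np.ndarray, rule: int, steps: int) -> np.ndarray:
          """Space-time diagram of an elementary CA with periodic boundaries."""
          table = [(rule >> i) & 1 for i in range(8)]      # rule number -> lookup table
          rows = [row]
          for _ in range(steps):
              left, right = np.roll(rows[-1], 1), np.roll(rows[-1], -1)
              idx = 4 * left + 2 * rows[-1] + right
              rows.append(np.array([table[i] for i in idx], dtype=np.uint8))
          return np.array(rows)

      seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"    # arbitrary example sequence
      image = eca_image(encode(seq), rule=110, steps=64)

      features = {
          "density": image.mean(),                                   # fraction of 1s
          "h_transitions": np.mean(image[:, 1:] != image[:, :-1]),   # horizontal texture
          "v_transitions": np.mean(image[1:, :] != image[:-1, :]),   # vertical texture
      }
      print(features)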

  10. Application of stochastic automata networks for creation of continuous time Markov chain models of voltage gating of gap junction channels.

    PubMed

    Snipas, Mindaugas; Pranevicius, Henrikas; Pranevicius, Mindaugas; Pranevicius, Osvaldas; Paulauskas, Nerijus; Bukauskas, Feliksas F

    2015-01-01

    The primary goal of this work was to study the advantages of numerical methods used for the creation of continuous time Markov chain (CTMC) models of voltage gating of gap junction (GJ) channels composed of connexin protein. This task was accomplished by describing the gating of GJs using the formalism of stochastic automata networks (SANs), which allowed for very efficient building and storing of the infinitesimal generator of the CTMC and produced model matrices with a distinct block structure. This, in turn, allowed us to develop efficient numerical methods for the steady-state solution of CTMC models and to reduce the CPU time needed to solve them by a factor of ~20.

  11. Application of Stochastic Automata Networks for Creation of Continuous Time Markov Chain Models of Voltage Gating of Gap Junction Channels

    PubMed Central

    Pranevicius, Henrikas; Pranevicius, Mindaugas; Pranevicius, Osvaldas; Bukauskas, Feliksas F.

    2015-01-01

    The primary goal of this work was to study the advantages of numerical methods used for the creation of continuous time Markov chain (CTMC) models of voltage gating of gap junction (GJ) channels composed of connexin protein. This task was accomplished by describing the gating of GJs using the formalism of stochastic automata networks (SANs), which allowed for very efficient building and storing of the infinitesimal generator of the CTMC and produced model matrices with a distinct block structure. This, in turn, allowed us to develop efficient numerical methods for the steady-state solution of CTMC models and to reduce the CPU time needed to solve them by a factor of ∼20. PMID:25705700
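
    As a generic illustration of the steady-state computation mentioned in these two records (not the authors' SAN-based implementation), the stationary distribution π of a CTMC with infinitesimal generator Q satisfies πQ = 0 together with Σπ = 1; a minimal dense-matrix sketch with arbitrary rates:

```python
import numpy as np

# Infinitesimal generator of a toy 3-state CTMC (rows sum to zero).
# The rates are arbitrary illustrative values, not gap-junction parameters.
Q = np.array([[-0.5,  0.3,  0.2],
              [ 0.4, -0.9,  0.5],
              [ 0.1,  0.6, -0.7]])

# Solve pi @ Q = 0 subject to sum(pi) = 1 by appending the
# normalisation constraint to the transposed balance equations.
A = np.vstack([Q.T, np.ones(Q.shape[0])])
b = np.zeros(Q.shape[0] + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady-state distribution:", pi)
```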

  12. Large-Nc masses of light mesons from QCD sum rules for nonlinear radial Regge trajectories

    NASA Astrophysics Data System (ADS)

    Afonin, S. S.; Solomko, T. D.

    2018-04-01

    The large-Nc masses of light vector, axial, scalar and pseudoscalar mesons are calculated from QCD spectral sum rules for a particular ansatz interpolating the radial Regge trajectories. The ansatz includes a linear part plus exponentially decreasing corrections to the meson masses and residues. This form of the corrections was proposed some time ago for consistency with the analytical structure of the Operator Product Expansion of the two-point correlation functions. We revisit the original analysis and find a second solution of the proposed sum rules. This solution better describes the spectrum of vector and axial mesons.

  13. Cellular Automata for Spatiotemporal Pattern Formation from Reaction-Diffusion Partial Differential Equations

    NASA Astrophysics Data System (ADS)

    Ohmori, Shousuke; Yamazaki, Yoshihiro

    2016-01-01

    Ultradiscrete equations are derived from a set of reaction-diffusion partial differential equations, and cellular automaton rules are obtained on the basis of the ultradiscrete equations. Some rules reproduce the dynamical properties of the original reaction-diffusion equations, namely, bistability and pulse annihilation. Furthermore, other rules bring about soliton-like preservation and periodic pulse generation with a pacemaker, which are not obtained from the original reaction-diffusion equations.
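
    For context, the ultradiscrete limit underlying derivations of this kind replaces sums of exponentials by a max operation,

$$
\lim_{\varepsilon \to +0} \varepsilon \log\left(e^{A/\varepsilon} + e^{B/\varepsilon}\right) = \max(A,B),
\qquad
\varepsilon \log\left(e^{A/\varepsilon}\, e^{B/\varepsilon}\right) = A + B,
$$

    so that multiplication maps to addition and addition maps to max, turning a discretized equation into a piecewise-linear rule that can then be restricted to a finite set of states, i.e. a cellular automaton.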

  14. Comparison of proposed countermeasures for dilemma zone at signalized intersections based on cellular automata simulations.

    PubMed

    Wu, Yina; Abdel-Aty, Mohamed; Ding, Yaoxian; Jia, Bin; Shi, Qi; Yan, Xuedong

    2018-07-01

    The Type II dilemma zone describes the road segment approaching a signalized intersection where drivers have difficulty deciding whether to stop or go at the onset of the yellow signal. This phenomenon can result in an increased crash risk at signalized intersections. Different types of warning systems have been proposed to help drivers make decisions. Although warning systems help to improve drivers' behavior, they also have several disadvantages, such as increasing rear-end crashes or red-light running (RLR) violations. In this study, a new warning system called pavement marking with auxiliary countermeasure (PMAIC) is proposed to reduce the dilemma zone and enhance traffic safety at signalized intersections. The proposed warning system integrates pavement marking with a flashing yellow system, which can provide drivers with better guidance on stop/go decisions based on their arrival time and speed. In order to evaluate the performance of the proposed warning system, this paper presents a cellular automata (CA) simulation study. The CA simulations are conducted for four scenarios in total: a typical intersection without a warning system, an intersection with a flashing-green countermeasure, an intersection with pavement marking, and an intersection with the PMAIC warning system. Before the CA simulation analysis, a logistic regression model is calibrated on field video data to predict drivers' general stop/go decisions, and rules for vehicle movements in the CA models under the influence of the different warning systems are proposed. Proxy indicators of rear-end crashes and potential RLR violations were estimated and used to evaluate safety levels for the different scenarios. The simulation results showed that the PMAIC countermeasure consistently offered the best performance in reducing rear-end crashes and RLR violations. Meanwhile, the results indicate that the flashing-green countermeasure could not effectively reduce either rear
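
    The calibrated stop/go model mentioned above has the generic logistic form sketched here; the predictors and coefficients are illustrative placeholders, not the values fitted from the field video data.

```python
import math

def stop_probability(speed_m_s: float, distance_m: float,
                     b0: float = -1.5, b_speed: float = -0.08,
                     b_dist: float = 0.05) -> float:
    """Generic logistic stop/go model: P(stop) = 1 / (1 + exp(-(b0 + b1*v + b2*d))).

    Coefficients are illustrative placeholders, not calibrated values.
    """
    z = b0 + b_speed * speed_m_s + b_dist * distance_m
    return 1.0 / (1.0 + math.exp(-z))

# Example: a vehicle travelling at 15 m/s, 60 m from the stop line at yellow onset.
print(f"P(stop) = {stop_probability(15.0, 60.0):.2f}")
```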

  15. A novel time series link prediction method: Learning automata approach

    NASA Astrophysics Data System (ADS)

    Moradabadi, Behnaz; Meybodi, Mohammad Reza

    2017-09-01

    Link prediction is a key challenge in social network analysis that uses the network structure to predict future links. Common link prediction approaches use a static graph representation, in which a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a traditional approach that calculates a similarity metric for each non-connected pair, sorts the pairs by their similarity scores, and labels those with the highest scores as future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the network changes over time, using deterministic graphs for modeling and analysis of a social network may not be appropriate. In the time-series link prediction problem, the time series of link occurrences is used to predict future links. In this paper, we propose a new time-series link prediction method based on learning automata. In the proposed algorithm, for each link that must be predicted there is one learning automaton, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks provided satisfactory results when time series of link occurrences are considered.
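
    A minimal sketch of a two-action learning automaton of the kind described above, using the standard linear reward-inaction (L_R-I) update; the reward signal and learning rate are illustrative, not the paper's exact scheme.

```python
import random

class TwoActionLearningAutomaton:
    """Actions: 1 = 'link will exist', 0 = 'link will not exist'."""

    def __init__(self, learning_rate: float = 0.1):
        self.p = [0.5, 0.5]   # action probabilities
        self.a = learning_rate

    def choose(self) -> int:
        return 1 if random.random() < self.p[1] else 0

    def reward(self, action: int) -> None:
        # Linear reward-inaction: reinforce only the rewarded action.
        self.p[action] += self.a * (1.0 - self.p[action])
        self.p[1 - action] = 1.0 - self.p[action]

# Toy usage: reward the automaton whenever its prediction matches the
# observed link occurrence in the historical time series.
history = [1, 1, 0, 1, 1, 1, 0, 1]   # illustrative link occurrences
la = TwoActionLearningAutomaton()
for observed in history:
    if la.choose() == observed:
        la.reward(observed)
print("P(link exists) =", round(la.p[1], 3))
```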

  16. Statistical Origin of the Meyer-Neldel Rule in Amorphous Semiconductor Thin Film Transistors

    NASA Astrophysics Data System (ADS)

    Kikuchi, Minoru

    1990-09-01

    The origin of the Meyer-Neldel (MN) rule [G_0 ∝ exp(A E_σ)] in the dc conductance of amorphous semiconductor thin-film transistors (TFT) is investigated based on a statistical model. We analyzed the temperature derivative of the band-bending energy eV_s(T) at the semiconductor interface as a function of V_s. It is shown that the condition for the validity of the rule, i.e., the linearity of the derivative d(eV_s)/d(kT) with respect to V_s, holds as a natural consequence of the interplay between the steep tail states and the low gap density-of-states spectrum. An expression is derived which relates the parameter A in the rule to the gap-states spectrum. Model calculations give a magnitude of A in fair agreement with the experimental observations. The effects of the Fermi level position and the magnitude of the midgap density of states are also discussed.

  17. Parental Rule Socialization for Preventive Health and Adolescent Rule Compliance

    ERIC Educational Resources Information Center

    Bylund, Carma L.; Baxter, Leslie A.; Imes, Rebecca S.; Wolf, Bianca

    2010-01-01

    This study examined family rules about nutrition, exercise, and sun protection in 164 parent-young adult children dyads. Both parents and their young adult children independently reported on health rules that they perceived throughout their child's adolescent years and the extent to which the rules were articulated, violations sanctioned, and…

  18. Cell Phones: Rule-Setting, Rule-Breaking, and Relationships in Classrooms

    ERIC Educational Resources Information Center

    Charles, Anita S.

    2012-01-01

    Based on a small qualitative study, this article focuses on understanding the rules for cell phones and other social networking media in schools, an aspect of broader research that led to important understandings of teacher-student negotiations. It considers the rules that schools and teachers make, the rampant breaking of these rules, the…

  19. Using Rule-Based Computer Programming to Unify Communication Rules Research.

    ERIC Educational Resources Information Center

    Sanford, David L.; Roach, J. W.

    This paper proposes the use of a rule-based computer programming language as a standard for the expression of rules, arguing that the adoption of a standard would enable researchers to communicate about rules in a consistent and significant way. Focusing on the formal equivalence of artificial intelligence (AI) programming to different types of…

  20. A Rule-Based Policy-Level Model of Nonsuperpower Behavior in Strategic Conflicts.

    DTIC Science & Technology

    1982-12-01

    a mechanism. The human mind tends to work linearly and to focus implicitly on a few variables. Experience results in subconscious models with far...which is slower. Alternatives to the current ROSIE implementation include reprogramming Scenario Agent in the C language (the language used for the Red...perception, opportunity perception, opportunity response, and assertiveness. As rules are refined, maintenance and reprogramming of the model will be required

  1. A Coral Reef Algorithm Based on Learning Automata for the Coverage Control Problem of Heterogeneous Directional Sensor Networks

    PubMed Central

    Li, Ming; Miao, Chunyan; Leung, Cyril

    2015-01-01

    Coverage control is one of the most fundamental issues in directional sensor networks. In this paper, the coverage optimization problem in a directional sensor network is formulated as a multi-objective optimization problem. It takes into account the coverage rate of the network, the number of working sensor nodes and the connectivity of the network. The coverage problem considered in this paper is characterized by the geographical irregularity of the sensed events and heterogeneity of the sensor nodes in terms of sensing radius, field of angle and communication radius. To solve this multi-objective problem, we introduce a learning automata-based coral reef algorithm for adaptive parameter selection and use a novel Tchebycheff decomposition method to decompose the multi-objective problem into a single-objective problem. Simulation results show the consistent superiority of the proposed algorithm over alternative approaches. PMID:26690162
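
    For reference, the standard Tchebycheff decomposition converts the multi-objective problem into single-objective subproblems of the form

$$
\min_{x}\; g^{\mathrm{te}}(x \mid \lambda, z^{*}) \;=\; \max_{1 \le i \le m} \lambda_i \left| f_i(x) - z_i^{*} \right|,
$$

    where λ is a weight vector and z* the ideal reference point; the novel decomposition used in the paper may differ in detail.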

  2. A Coral Reef Algorithm Based on Learning Automata for the Coverage Control Problem of Heterogeneous Directional Sensor Networks.

    PubMed

    Li, Ming; Miao, Chunyan; Leung, Cyril

    2015-12-04

    Coverage control is one of the most fundamental issues in directional sensor networks. In this paper, the coverage optimization problem in a directional sensor network is formulated as a multi-objective optimization problem. It takes into account the coverage rate of the network, the number of working sensor nodes and the connectivity of the network. The coverage problem considered in this paper is characterized by the geographical irregularity of the sensed events and heterogeneity of the sensor nodes in terms of sensing radius, field of angle and communication radius. To solve this multi-objective problem, we introduce a learning automata-based coral reef algorithm for adaptive parameter selection and use a novel Tchebycheff decomposition method to decompose the multi-objective problem into a single-objective problem. Simulation results show the consistent superiority of the proposed algorithm over alternative approaches.

  3. Construction of phase diagrams for nanoscaled Ising thin films on the honeycomb lattice using cellular automata simulation approach

    NASA Astrophysics Data System (ADS)

    Ghaemi, Mehrdad; Javadi, Nabi

    2017-11-01

    The phase diagrams of the three-layer Ising model on the honeycomb lattice with a diluted surface have been constructed using probabilistic cellular automata based on the Glauber algorithm. The effects of the exchange interactions on the phase diagrams have been investigated. A general mathematical expression for the critical temperature is obtained in terms of the relative couplings r = J1/J and Δs = (Js/J) - 1, where J and Js represent the nearest-neighbor coupling within the inner and surface layers, respectively, and each magnetic site in the surface layer is coupled with the nearest-neighbor site in the inner layer via the exchange coupling J1. In the case of antiferromagnetic coupling between the surface layer and the inner layer, the system reveals many interesting phenomena, such as the possible existence of a compensation line below the critical temperature.
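
    For context, probabilistic cellular automata based on the Glauber algorithm typically flip spin s_i with the heat-bath probability (written here in generic Ising form; the paper's three-layer honeycomb implementation uses layer- and surface-dependent couplings):

$$
p(s_i \to -s_i) \;=\; \frac{1}{1 + \exp(\Delta E_i / k_B T)},
\qquad
\Delta E_i \;=\; 2 s_i \sum_{j \in \mathcal{N}(i)} J_{ij}\, s_j .
$$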

  4. Modifying Intramural Rules.

    ERIC Educational Resources Information Center

    Rokosz, Francis M.

    1981-01-01

    Standard sports rules can be altered to improve the game for intramural participants. These changes may improve players' attitudes, simplify rules for officials, and add safety features to a game. Specific rule modifications are given for volleyball, football, softball, floor hockey, basketball, and soccer. (JN)

  5. Tuning of quantum entanglement in molecular quantum cellular automata based on mixed-valence tetrameric units.

    PubMed

    Palii, Andrew; Tsukerblat, Boris

    2016-10-25

    In this article we consider two coupled tetrameric mixed-valence (MV) units accommodating electron pairs, which play the role of cells in molecular quantum cellular automata. It is supposed that the Coulombic interaction between instantly localized electrons within the cell markedly inhibits the transfer processes between the redox centers. Under this condition, as well as due to the vibronic localization of the electron pair, the cell can encode binary information, which is controlled by neighboring cells. We show that under certain conditions the two low-lying vibronic spin levels of the cell (ground and first excited states) can be regarded as originating from an effective spin-spin interaction. This is shown to depend on the internal parameters of the cell as well as on the induced polarization. Within this simplified two-level picture we evaluate the quantum entanglement in the system represented by the two electrons in the cell and show how the entanglement within the cell and concurrence can be controlled via polarization of the neighboring cells and temperature.

  6. Computer simulation of a cellular automata model for the immune response in a retrovirus system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandey, R.B.

    1989-02-01

    Immune response in a retrovirus system is modeled by a network of three binary cell elements to take into account some of the main functional features of T4 cells, T8 cells, and viruses. Two different intercell interactions are introduced, one of which leads to three fixed points while the other yields bistable fixed points oscillating between a healthy state and a sick state in a mean-field treatment. Evolution of these cells is studied for quenched and annealed random interactions on a simple cubic lattice with a nearest-neighbor interaction using inhomogeneous cellular automata. Populations of T4 cells and viral cells oscillate together with damping (with constant amplitude) for annealed (quenched) interactions on increasing the value of the mixing probability B from zero to a characteristic value B_ca (B_cq). For higher B, the average number of T4 cells increases while that of the virally infected cells decreases monotonically on increasing B, suggesting a phase transition at B_ca (B_cq).

  7. Estimating enthalpy of vaporization from vapor pressure using Trouton's rule.

    PubMed

    MacLeod, Matthew; Scheringer, Martin; Hungerbühler, Konrad

    2007-04-15

    The enthalpy of vaporization of liquids and subcooled liquids at 298 K (ΔH_VAP) is an important parameter in environmental fate assessments that consider spatial and temporal variability in environmental conditions. It has been shown that ΔH_VAP for non-hydrogen-bonding substances can be estimated from the vapor pressure at 298 K (P_L) using an empirically derived linear relationship. Here, we demonstrate that the relationship between ΔH_VAP and P_L is consistent with Trouton's rule and the Clausius-Clapeyron equation under the assumption that ΔH_VAP is linearly dependent on temperature between 298 K and the boiling-point temperature. Our interpretation based on Trouton's rule substantiates the empirical relationship between ΔH_VAP and P_L for non-hydrogen-bonding chemicals with subcooled liquid vapor pressures ranging over 15 orders of magnitude. We apply the relationship between ΔH_VAP and P_L to evaluate data reported in literature reviews for several important classes of semivolatile environmental contaminants, including polycyclic aromatic hydrocarbons, chlorobenzenes, polychlorinated biphenyls and polychlorinated dibenzo-dioxins and -furans, and illustrate the temperature dependence of results from a multimedia model presented as a partitioning map. The uncertainty associated with estimating ΔH_VAP from P_L using this relationship is acceptable for most environmental fate modeling of non-hydrogen-bonding semivolatile organic chemicals.
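
    A minimal numerical illustration of Trouton's rule itself (not the paper's full temperature-dependent derivation): for many non-hydrogen-bonding liquids the entropy of vaporization at the normal boiling point is roughly 10.5R ≈ 87 J mol⁻¹ K⁻¹, so ΔH_VAP(T_b) ≈ 87 · T_b.

```python
R = 8.314                    # gas constant, J/(mol K)
TROUTON_ENTROPY = 10.5 * R   # ~87 J/(mol K), empirical Trouton constant

def enthalpy_of_vaporization_at_tb(boiling_point_K: float) -> float:
    """Estimate Delta H_vap at the normal boiling point via Trouton's rule (J/mol)."""
    return TROUTON_ENTROPY * boiling_point_K

# Example: benzene, T_b ~ 353 K -> ~30.8 kJ/mol, close to the tabulated ~30.7 kJ/mol.
print(enthalpy_of_vaporization_at_tb(353.2) / 1000, "kJ/mol")
```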

  8. A probabilistic cellular automata model for the dynamics of a population driven by logistic growth and weak Allee effect

    NASA Astrophysics Data System (ADS)

    Mendonça, J. R. G.

    2018-04-01

    We propose and investigate a one-parameter probabilistic mixture of one-dimensional elementary cellular automata under the guise of a model for the dynamics of a single-species unstructured population with nonoverlapping generations in which individuals have smaller probability of reproducing and surviving in a crowded neighbourhood but also suffer from isolation and dispersal. Remarkably, the first-order mean field approximation to the dynamics of the model yields a cubic map containing terms representing both logistic and weak Allee effects. The model has a single absorbing state devoid of individuals, but depending on the reproduction and survival probabilities can achieve a stable population. We determine the critical probability separating these two phases and find that the phase transition between them is in the directed percolation universality class of critical behaviour.

  9. Sum rules for the uniform-background model of an atomic-sharp metal corner

    NASA Astrophysics Data System (ADS)

    Streitenberger, P.

    1994-04-01

    Analytical results are derived for the electrostatic potential of an atomically sharp 90° metal corner in the uniform-background model. The electrostatic potential at a free jellium edge and at the jellium corner, respectively, is determined exactly in terms of the energy per electron of the uniform electron gas integrated over the background density. The surface energy, the edge formation energy and the derivative of the corner formation energy with respect to the background density are given as integrals over the electrostatic potential. This constitutes a novel approach to such sum rules, including the Budd-Vannimenus sum rules for a free jellium surface, based on general properties of linear response functions.

  10. Resolving task rule incongruence during task switching by competitor rule suppression.

    PubMed

    Meiran, Nachshon; Hsieh, Shulan; Dimov, Eduard

    2010-07-01

    Task switching requires maintaining readiness to execute any task of a given set of tasks. However, when tasks switch, the readiness to execute the now-irrelevant task generates interference, as seen in the task rule incongruence effect. Overcoming such interference requires fine-tuned inhibition that impairs task readiness only minimally. In an experiment involving 2 object classification tasks and 2 location classification tasks, the authors show that irrelevant task rules that generate response conflicts are inhibited. This competitor rule suppression (CRS) is seen in response slowing in subsequent trials, when the competing rules become relevant. CRS is shown to operate on specific rules without affecting similar rules. CRS and backward inhibition, which is another inhibitory phenomenon, produced additive effects on reaction time, suggesting their mutual independence. Implications for current formal theories of task switching as well as for conflict monitoring theories are discussed. (c) 2010 APA, all rights reserved

  11. Collective Behaviors in Spatially Extended Systems with Local Interactions and Synchronous Updating

    NASA Astrophysics Data System (ADS)

    ChatÉ, H.; Manneville, P.

    1992-01-01

    Assessing the extent to which dynamical systems with many degrees of freedom can be described within a thermodynamic formalism is a problem that currently attracts much attention. In this context, synchronously updated regular lattices of identical, chaotic elements with local interactions are promising models for which statistical mechanics may be hoped to provide some insights. This article presents a large class of cellular automata rules and coupled map lattices of the above type in space dimensions d = 2 to 6. Such simple models can be approached by a mean-field approximation which usually reduces the dynamics to that of a map governing the evolution of some extensive density. While this approximation is exact in the d = ∞ limit, where macroscopic variables must display the time-dependent behavior of the mean-field map, basic intuition from equilibrium statistical mechanics rules out any such behavior in low-dimensional systems, since it would involve the collective motion of locally disordered elements. The models studied are chosen to be as close as possible to mean-field conditions, i.e., rather high space dimension, large connectivity, and equal-weight coupling between sites. While the mean-field evolution is never observed, a new type of non-trivial collective behavior is found, at odds with the predictions of equilibrium statistical mechanics. Both in the cellular automata models and in the coupled map lattices, macroscopic variables frequently display a non-transient, time-dependent, low-dimensional dynamics emerging out of local disorder. Striking examples are period-3 cycles in two-state cellular automata and a Hopf bifurcation for a d = 5 lattice of coupled logistic maps. An extensive account of the phenomenology is given, including a catalog of behaviors, classification tables for the cellular automata rules, and bifurcation diagrams for the coupled map lattices. The observed underlying dynamics is accompanied by an intrinsic quasi-Gaussian noise
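
    To make the class of models concrete, here is a minimal one-dimensional diffusively coupled logistic map lattice with synchronous updating (the paper studies higher-dimensional lattices with equal-weight coupling; the parameters here are illustrative):

```python
import numpy as np

def cml_step(x: np.ndarray, eps: float = 0.3, a: float = 4.0) -> np.ndarray:
    """One synchronous update of a 1-D coupled logistic map lattice:
    x'_i = (1 - eps) f(x_i) + (eps / 2) (f(x_{i-1}) + f(x_{i+1})),
    with f(x) = a x (1 - x) and periodic boundaries."""
    fx = a * x * (1.0 - x)
    return (1.0 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))

rng = np.random.default_rng(0)
x = rng.random(1000)
for _ in range(500):
    x = cml_step(x)
# Macroscopic observable: the spatial average, whose time evolution is the
# quantity whose non-trivial collective behavior is discussed above.
print("mean activity:", x.mean())
```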

  12. How second-grade students internalize rules during teacher-student transactions: a case study.

    PubMed

    Méard, Jacques; Bertone, Stefano; Flavier, Eric

    2008-09-01

    Vygotsky's theory of the internalization of signs provided the basis for this study, which analyses the processes by which second-grade students internalize school rules. Ethnographic data were collected on 102 lessons in a second-grade class (6-8 years) during one year. The study focused on three lessons (ethnographic data supplemented by video-recordings, post-lesson interviews with the teacher, and re-transcriptions of the verbal interactions of the lessons and interviews). The longitudinal observation data were broken down into discrete transactions, crossed with the recorded data, and analysed in a four-step procedure. The results showed that the students' self-regulated actions (voluntary performance of prescribed actions) corresponded to the teacher's presentation of the rules, which was varied and personalized. She used explanation/justification, negotiation, persuasion, or imposition depending on the situation and the students concerned. The results revealed: (a) multiple actions of explanation/justification of the rules, negotiation and persuasion addressed to the entire class; (b) personalized actions of persuasion and rule imposition in instances of heteronomous actions by students; (c) actions adjusted to the dynamics of the transactions. This study demonstrates how closely the actions of teacher and students are linked. Rather than a linear process of rule internalization, education looks like a co-construction of rules between teacher and students. These results can serve as a basis for tools for teacher education.

  13. Designing Nanoscale Counter Using Reversible Gate Based on Quantum-Dot Cellular Automata

    NASA Astrophysics Data System (ADS)

    Moharrami, Elham; Navimipour, Nima Jafari

    2018-04-01

    New technologies such as Quantum-dot Cellular Automata (QCA) have been suggested to overcome the physical limits of Complementary Metal-Oxide Semiconductor (CMOS) technology. QCA, as one of the novel technologies at the nanoscale, has potential applications in future computers. This technology offers advantages such as minimal size, high speed, low latency, and low power consumption, and as a result it is used for creating all varieties of memory. Counter circuits, important building blocks of digital systems, are composed of latches connected in series that count the input pulses in a circuit. On the other hand, reversible computation is very important because of its ability to reduce energy dissipation in nanometer circuits. Improving energy efficiency, increasing the speed of nanometer circuits, increasing system portability, making circuit components smaller, and reducing power consumption are among the benefits of reversible logic. Therefore, this paper aims to design a two-bit reversible counter, optimized on the basis of QCA, using an improved reversible gate. The proposed reversible 2-bit counter structure can be extended to 3-bit, 4-bit and larger counters. The advantages of the proposed design are shown using QCADesigner in terms of delay in comparison with previous circuits.

  14. A cellular automata model for avascular solid tumor growth under the effect of therapy

    NASA Astrophysics Data System (ADS)

    Reis, E. A.; Santos, L. B. L.; Pinho, S. T. R.

    2009-04-01

    Tumor growth has long been a target of investigation within the context of mathematical and computer modeling. The objective of this study is to propose and analyze a two-dimensional stochastic cellular automata model to describe avascular solid tumor growth, taking into account both the competition between cancer cells and normal cells for nutrients and/or space and a time-dependent proliferation of cancer cells. Gompertzian growth, characteristic of some tumors, is described and some of the features of the time-spatial pattern of solid tumors, such as compact morphology with irregular borders, are captured. The parameter space is studied in order to analyze the occurrence of necrosis and the response to therapy. Our findings suggest that transitions exist between necrotic and non-necrotic phases (no-therapy cases), and between the states of cure and non-cure (therapy cases). To analyze cure, the control and order parameters are, respectively, the highest probability of cancer cell proliferation and the probability of the therapeutic effect on cancer cells. With respect to patterns, it is possible to observe the inner necrotic core and the effect of the therapy destroying the tumor from its outer borders inwards.
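
    For reference, the Gompertzian growth referred to above is commonly written as

$$
\frac{dV}{dt} = a\,V \ln\frac{K}{V}
\quad\Longrightarrow\quad
V(t) = K \exp\!\left(\ln\frac{V_0}{K}\; e^{-a t}\right),
$$

    where V is the tumor cell population (or volume), K the carrying capacity and a the growth-rate parameter.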

  15. Safety impacts of red light cameras at signalized intersections based on cellular automata models.

    PubMed

    Chai, C; Wong, Y D; Lum, K M

    2015-01-01

    This study applies a simulation technique to evaluate the hypothesis that red light cameras (RLCs) exert important effects on accident risks. Conflict occurrences are generated by simulation and compared at intersections with and without RLCs to assess the impact of RLCs on several conflict types under various traffic conditions. Conflict occurrences are generated through simulating vehicular interactions based on an improved cellular automata (CA) model. The CA model is calibrated and validated against field observations at approaches with and without RLCs. Simulation experiments are conducted for RLC and non-RLC intersections with different geometric layouts and traffic demands to generate conflict occurrences that are analyzed to evaluate the hypothesis that RLCs exert important effects on road safety. The comparison of simulated conflict occurrences show favorable safety impacts of RLCs on crossing conflicts and unfavorable impacts for rear-end conflicts during red/amber phases. Corroborative results are found from broad analysis of accident occurrence. RLCs are found to have a mixed effect on accident risk at signalized intersections: crossing collisions are reduced, whereas rear-end collisions may increase. The specially developed CA model is found to be a feasible safety assessment tool.

  16. Estimation of the spatiotemporal dynamics of snow covered area by using cellular automata models

    NASA Astrophysics Data System (ADS)

    Pardo-Igúzquiza, Eulogio; Collados-Lara, Antonio-Juan; Pulido-Velazquez, David

    2017-07-01

    Given the need to consider the cryosphere in water resources management for mountainous regions, the purpose of this paper is to model the daily spatially distributed dynamics of snow covered area (SCA) by using calibrated cellular automata models. For the operational use of the calibrated model, the only data requirements are the altitude of each cell of the spatial discretization of the area of interest and precipitation and temperature indexes for the area of interest. For the calibration step, experimental snow covered area data are needed. Potential uses of the model are to estimate the snow covered area when satellite data are absent, or when they provide a temporal resolution different from the operational resolution, or when the satellite images are useless because they are covered by clouds or because there has been a sensor failure. Another interesting application is the simulation of SCA dynamics for the snow covered area under future climatic scenarios. The model is applied to the Sierra Nevada mountain range, in southern Spain, which is home to significant biodiversity, contains important water resources in its snowpack, and contains the most meridional ski resort in Europe.

  17. Spin structure of the neutron (^3He) and the Bjoerken sum rule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meziani, Z.E.

    1994-12-01

    A first measurement of the longitudinal asymmetry of deep-inelastic scattering of polarized electrons from a polarized ^3He target at energies ranging from 19 to 26 GeV has been performed at the Stanford Linear Accelerator Center (SLAC). The spin-structure function of the neutron g_1^n has been extracted from the measured asymmetries. The Quark Parton Model (QPM) interpretation of the nucleon spin-structure function is examined in light of the new results. A test of the Ellis-Jaffe sum rule (E-J) on the neutron is performed at high momentum transfer and found to be satisfied. Furthermore, combining the proton results of the European Muon Collaboration (EMC) and the neutron results of E-142, the Bjoerken sum rule test is carried out at high Q^2, where higher-order Perturbative Quantum Chromodynamics (PQCD) corrections and higher-twist corrections are smaller. The sum rule is saturated to within one standard deviation.

  18. Land use change modeling through scenario-based cellular automata Markov: improving spatial forecasting.

    PubMed

    Jahanishakib, Fatemeh; Mirkarimi, Seyed Hamed; Salmanmahiny, Abdolrassoul; Poodat, Fatemeh

    2018-05-08

    Efficient land use management requires awareness of past changes, present actions, and plans for future developments. Part of these requirements is achieved using scenarios that describe a future situation and the course of changes. This research aims to link scenario results with spatially explicit and quantitative forecasting of land use development. To develop land use scenarios, SMIC PROB-EXPERT and MORPHOL methods were used. It revealed eight scenarios as the most probable. To apply the scenarios, we considered population growth rate and used a cellular automata-Markov chain (CA-MC) model to implement the quantified changes described by each scenario. For each scenario, a set of landscape metrics was used to assess the ecological integrity of land use classes in terms of fragmentation and structural connectivity. The approach enabled us to develop spatial scenarios of land use change and detect their differences for choosing the most integrated landscape pattern in terms of landscape metrics. Finally, the comparison between paired forecasted scenarios based on landscape metrics indicates that scenarios 1-1, 2-2, 3-2, and 4-1 have a more suitable integrity. The proposed methodology for developing spatial scenarios helps executive managers to create scenarios with many repetitions and customize spatial patterns in real world applications and policies.

  19. Cellular automata-based modelling and simulation of biofilm structure on multi-core computers.

    PubMed

    Skoneczny, Szymon

    2015-01-01

    The article presents a mathematical model of biofilm growth for aerobic biodegradation of a toxic carbonaceous substrate. Modelling of biofilm growth has fundamental significance in numerous processes of biotechnology and in the mathematical modelling of bioreactors. A process following double-substrate kinetics with substrate inhibition and proceeding in a biofilm has not previously been modelled by means of cellular automata. Each process in the proposed model, i.e. diffusion of substrates, uptake of substrates, growth and decay of microorganisms and biofilm detachment, is simulated in a discrete manner. It was shown that, for a flat biofilm of constant thickness, the results of the presented model agree with those of a continuous model. The primary outcome of the study was to propose a mathematical model of biofilm growth; however, considerable focus was also placed on the development of efficient algorithms for its solution. Two parallel algorithms were created, differing in the way computations are distributed. Computer programs were created using the OpenMP Application Programming Interface for the C++ programming language. Simulations of biofilm growth were performed on three high-performance computers, and the speed-up coefficients of the computer programs were compared. Both algorithms enabled a significant reduction of computation time, which is important, inter alia, in the modelling and simulation of bioreactor dynamics.

  20. A method of extracting impervious surface based on rule algorithm

    NASA Astrophysics Data System (ADS)

    Peng, Shuangyun; Hong, Liang; Xu, Quanli

    2018-02-01

    The impervious surface has become an important index for evaluating urban environmental quality and measuring the level of urbanization. At present, remote sensing technology has become the main way to extract impervious surfaces. In this paper, a method to extract the impervious surface based on a rule algorithm is proposed. The main idea of the method is to use a rule-based algorithm to extract the impervious surface based on the differences between the impervious surface and the other three types of land cover (water, soil and vegetation) in the seven original bands, NDWI and NDVI. The procedure consists of three steps: 1) First, vegetation is extracted according to the principle that vegetation is higher in the near-infrared band than in the other bands; 2) Then, water is extracted according to the characteristic that water has the highest NDWI and the lowest NDVI; 3) Finally, the impervious surface is extracted based on the fact that it has a higher NDWI value than soil and the lowest NDVI value. In order to test the accuracy of the rule algorithm, this paper applies the linear spectral mixture decomposition algorithm, the CART algorithm and the NDII index algorithm to extract the impervious surface from six remote sensing images of the Dianchi Lake Basin from 1999 to 2014. The accuracy of these three methods is then compared with the accuracy of the rule algorithm using the overall classification accuracy. It is found that the extraction method based on the rule algorithm achieves a markedly higher accuracy than the above three methods.
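
    A schematic version of the rule-based decision sequence described above; the thresholds are hypothetical placeholders, since the paper derives its rules from the band characteristics rather than from fixed values.

```python
def classify_pixel(nir: float, red: float, green: float,
                   ndvi_veg: float = 0.3, ndwi_water: float = 0.2) -> str:
    """Toy rule-based classifier for a single pixel; thresholds are illustrative."""
    ndvi = (nir - red) / (nir + red + 1e-9)
    ndwi = (green - nir) / (green + nir + 1e-9)
    if ndvi > ndvi_veg:                  # rule 1: vegetation (high NIR reflectance)
        return "vegetation"
    if ndwi > ndwi_water and ndvi < 0:   # rule 2: water (highest NDWI, lowest NDVI)
        return "water"
    if ndwi > 0 and ndvi < 0.1:          # rule 3: impervious (NDWI above soil, low NDVI)
        return "impervious"
    return "soil"

print(classify_pixel(nir=0.18, red=0.20, green=0.22))
```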

  1. Choosing the Rules: Distinct and Overlapping Frontoparietal Representations of Task Rules for Perceptual Decisions

    PubMed Central

    Kriegeskorte, Nikolaus; Carlin, Johan D.; Rowe, James B.

    2013-01-01

    Behavior is governed by rules that associate stimuli with responses and outcomes. Human and monkey studies have shown that rule-specific information is widely represented in the frontoparietal cortex. However, it is not known how establishing a rule under different contexts affects its neural representation. Here, we use event-related functional MRI (fMRI) and multivoxel pattern classification methods to investigate the human brain's mechanisms of establishing and maintaining rules for multiple perceptual decision tasks. Rules were either chosen by participants or specifically instructed to them, and the fMRI activation patterns representing rule-specific information were compared between these contexts. We show that frontoparietal regions differ in the properties of their rule representations during active maintenance before execution. First, rule-specific information maintained in the dorsolateral and medial frontal cortex depends on the context in which it was established (chosen vs specified). Second, rule representations maintained in the ventrolateral frontal and parietal cortex are independent of the context in which they were established. Furthermore, we found that the rule-specific coding maintained in anticipation of stimuli may change with execution of the rule: representations in context-independent regions remain invariant from maintenance to execution stages, whereas rule representations in context-dependent regions do not generalize to execution stage. The identification of distinct frontoparietal systems with context-independent and context-dependent task rule representations, and the distinction between anticipatory and executive rule representations, provide new insights into the functional architecture of goal-directed behavior. PMID:23864675

  2. Smooth criminal: convicted rule-breakers show reduced cognitive conflict during deliberate rule violations.

    PubMed

    Jusyte, Aiste; Pfister, Roland; Mayer, Sarah V; Schwarz, Katharina A; Wirth, Robert; Kunde, Wilfried; Schönenberg, Michael

    2017-09-01

    Classic findings on conformity and obedience document a strong and automatic drive of human agents to follow any type of rule or social norm. At the same time, most individuals tend to violate rules on occasion, and such deliberate rule violations have recently been shown to yield cognitive conflict for the rule-breaker. These findings indicate persistent difficulty to suppress the rule representation, even though rule violations were studied in a controlled experimental setting with neither gains nor possible sanctions for violators. In the current study, we validate these findings by showing that convicted criminals, i.e., individuals with a history of habitual and severe forms of rule violations, can free themselves from such cognitive conflict in a similarly controlled laboratory task. These findings support an emerging view that aims at understanding rule violations from the perspective of the violating agent rather than from the perspective of outside observer.

  3. Presenting Germany's drug pricing rule as a cost-per-QALY rule.

    PubMed

    Gandjour, Afschin

    2012-06-01

    In Germany, the Institute for Quality and Efficiency in Health Care (IQWiG) makes recommendations for ceiling prices of drugs based on an evaluation of the relationship between costs and effectiveness. To set ceiling prices, IQWiG uses the following decision rule: the incremental cost-effectiveness ratio of a new drug compared with the next effective intervention should not be higher than that of the next effective intervention compared to its comparator. The purpose of this paper is to show that IQWiG's decision rule can be presented as a cost-per-QALY rule by using equity-weighted QALYs. This transformation shows where both rules share commonalities. Furthermore, it makes the underlying ethical implications of IQWiG's decision rule transparent and open to debate.

  4. Analysis of traffic congestion induced by the work zone

    NASA Astrophysics Data System (ADS)

    Fei, L.; Zhu, H. B.; Han, X. L.

    2016-05-01

    Based on the cellular automata approach, a detailed two-lane cellular automata model is proposed in which differences in driving behavior and the difference between vehicles' accelerations in the moving state and the starting state are taken into account. Furthermore, the vehicles' motion is refined by using small cells one meter long. A traffic management measure is then proposed, and a two-lane highway traffic model containing a work zone is presented, in which the road is divided into a normal area, a merging area and the work zone. The vehicles in different areas move forward according to different lane-changing rules and position-updating rules. The simulations show that, when the density is small, the cluster length in front of the work zone increases as the merging probability decreases. A suitable merging length and an appropriate speed limit value are then recommended. The simulation results, in the form of the speed-flow diagram, are in good agreement with empirical data, indicating that the presented model is efficient and can partially reflect real traffic. The results may be meaningful for traffic optimization and road construction management.
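
    For readers unfamiliar with this class of models, a single-lane Nagel-Schreckenberg-style update (acceleration, braking, random slowdown, movement) illustrates the kind of position-updating rules that the two-lane work-zone model above refines; this is a generic sketch, not the paper's model.

```python
import random

def nasch_step(pos, vel, road_len=200, v_max=5, p_slow=0.3):
    """One parallel update of the Nagel-Schreckenberg cellular automaton on a ring."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])
    new_pos, new_vel = pos[:], vel[:]
    for idx, i in enumerate(order):
        ahead = order[(idx + 1) % n]
        gap = (pos[ahead] - pos[i] - 1) % road_len
        v = min(vel[i] + 1, v_max)            # 1. accelerate
        v = min(v, gap)                       # 2. brake to keep a safe gap
        if v > 0 and random.random() < p_slow:
            v -= 1                            # 3. random slowdown
        new_vel[i] = v
        new_pos[i] = (pos[i] + v) % road_len  # 4. move
    return new_pos, new_vel

# Toy usage: 30 vehicles on a 200-cell ring road.
positions = sorted(random.sample(range(200), 30))
velocities = [0] * 30
for _ in range(100):
    positions, velocities = nasch_step(positions, velocities)
print("mean speed:", sum(velocities) / len(velocities))
```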

  5. Gorilla and Orangutan Brains Conform to the Primate Cellular Scaling Rules: Implications for Human Evolution

    PubMed Central

    Herculano-Houzel, Suzana; Kaas, Jon H.

    2011-01-01

    Gorillas and orangutans are primates at least as large as humans, but their brains amount to about one third of the size of the human brain. This discrepancy has been used as evidence that the human brain is about 3 times larger than it should be for a primate species of its body size. In contrast to the view that the human brain is special in its size, we have suggested that it is the great apes that might have evolved bodies that are unusually large, on the basis of our recent finding that the cellular composition of the human brain matches that expected for a primate brain of its size, making the human brain a linearly scaled-up primate brain in its number of cells. To investigate whether the brain of great apes also conforms to the primate cellular scaling rules identified previously, we determine the numbers of neuronal and other cells that compose the orangutan and gorilla cerebella, use these numbers to calculate the size of the brain and of the cerebral cortex expected for these species, and show that these match the sizes described in the literature. Our results suggest that the brains of great apes also scale linearly in their numbers of neurons like other primate brains, including humans. The conformity of great apes and humans to the linear cellular scaling rules that apply to other primates that diverged earlier in primate evolution indicates that prehistoric Homo species as well as other hominins must have had brains that conformed to the same scaling rules, irrespective of their body size. We then used those scaling rules and published estimated brain volumes for various hominin species to predict the numbers of neurons that composed their brains. We predict that Homo heidelbergensis and Homo neanderthalensis had brains with approximately 80 billion neurons, within the range of variation found in modern Homo sapiens. We propose that while the cellular scaling rules that apply to the primate brain have remained stable in hominin evolution (since they

  6. Gorilla and orangutan brains conform to the primate cellular scaling rules: implications for human evolution.

    PubMed

    Herculano-Houzel, Suzana; Kaas, Jon H

    2011-01-01

    Gorillas and orangutans are primates at least as large as humans, but their brains amount to about one third of the size of the human brain. This discrepancy has been used as evidence that the human brain is about 3 times larger than it should be for a primate species of its body size. In contrast to the view that the human brain is special in its size, we have suggested that it is the great apes that might have evolved bodies that are unusually large, on the basis of our recent finding that the cellular composition of the human brain matches that expected for a primate brain of its size, making the human brain a linearly scaled-up primate brain in its number of cells. To investigate whether the brain of great apes also conforms to the primate cellular scaling rules identified previously, we determine the numbers of neuronal and other cells that compose the orangutan and gorilla cerebella, use these numbers to calculate the size of the brain and of the cerebral cortex expected for these species, and show that these match the sizes described in the literature. Our results suggest that the brains of great apes also scale linearly in their numbers of neurons like other primate brains, including humans. The conformity of great apes and humans to the linear cellular scaling rules that apply to other primates that diverged earlier in primate evolution indicates that prehistoric Homo species as well as other hominins must have had brains that conformed to the same scaling rules, irrespective of their body size. We then used those scaling rules and published estimated brain volumes for various hominin species to predict the numbers of neurons that composed their brains. We predict that Homo heidelbergensis and Homo neanderthalensis had brains with approximately 80 billion neurons, within the range of variation found in modern Homo sapiens. We propose that while the cellular scaling rules that apply to the primate brain have remained stable in hominin evolution (since they

  7. Applying genetic algorithms for calibrating a hexagonal cellular automata model for the simulation of debris flows characterised by strong inertial effects

    NASA Astrophysics Data System (ADS)

    Iovine, G.; D'Ambrosio, D.; Di Gregorio, S.

    2005-03-01

    In modelling complex a-centric phenomena which evolve through local interactions within a discrete time-space, cellular automata (CA) represent a valid alternative to standard solution methods based on differential equations. Flow-type phenomena (such as lava flows, pyroclastic flows, earth flows, and debris flows) can be viewed as a-centric dynamical systems, and they can therefore be properly investigated in CA terms. SCIDDICA S 4a is the last release of a two-dimensional hexagonal CA model for simulating debris flows characterised by strong inertial effects. S 4a has been obtained by progressively enriching an initial simplified model, originally derived for simulating very simple cases of slow-moving flow-type landslides. Using an empirical strategy, in S 4a, the inertial character of the flowing mass is translated into CA terms by means of local rules. In particular, in the transition function of the model, the distribution of landslide debris among the cells is obtained through a double cycle of computation. In the first phase, the inertial character of the landslide debris is taken into account by considering indicators of momentum. In the second phase, any remaining debris in the central cell is distributed among the adjacent cells, according to the principle of maximum possible equilibrium. The complexities of the model and of the phenomena to be simulated suggested the need for an automated technique of evaluation for the determination of the best set of global parameters. Accordingly, the model is calibrated using a genetic algorithm and by considering the May 1998 Curti-Sarno (Southern Italy) debris flow. The boundaries of the area affected by the debris flow are simulated well with the model. Errors computed by comparing the simulations with the mapped areal extent of the actual landslide are smaller than those previously obtained without genetic algorithms. As the experiments have been realised in a sequential computing environment, they could be

  8. Common-Sense Rule Inference

    NASA Astrophysics Data System (ADS)

    Lombardi, Ilaria; Console, Luca

    In the paper we show how rule-based inference can be made more flexible by exploiting semantic information associated with the concepts involved in the rules. We introduce flexible forms of common sense reasoning in which whenever no rule applies to a given situation, the inference engine can fire rules that apply to more general or to similar situations. This can be obtained by defining new forms of match between rules and the facts in the working memory and new forms of conflict resolution. We claim that in this way we can overcome some of the brittleness problems that are common in rule-based systems.
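
    A minimal sketch of the fallback idea described above: when no rule matches the specific situation, the engine retries with more general concepts from a taxonomy. The taxonomy, rules and names here are invented purely for illustration.

```python
from typing import Optional

# Hypothetical concept taxonomy: child concept -> more general parent concept.
PARENT = {"espresso": "coffee", "coffee": "hot_drink", "tea": "hot_drink"}

# Rules map a situation concept to an action.
RULES = {"hot_drink": "serve in a heat-resistant cup"}

def infer(concept: Optional[str]) -> Optional[str]:
    """Fire the most specific applicable rule, generalizing when none matches."""
    while concept is not None:
        if concept in RULES:
            return RULES[concept]
        concept = PARENT.get(concept)   # fall back to a more general concept
    return None

print(infer("espresso"))  # no 'espresso' rule -> generalizes to 'hot_drink'
```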

  9. Complex-energy approach to sum rules within nuclear density functional theory

    DOE PAGES

    Hinohara, Nobuo; Kortelainen, Markus; Nazarewicz, Witold; ...

    2015-04-27

    The linear response of the nucleus to an external field contains unique information about the effective interaction, correlations governing the behavior of the many-body system, and properties of its excited states. To characterize the response, it is useful to use its energy-weighted moments, or sum rules. By comparing computed sum rules with experimental values, the information content of the response can be utilized in the optimization process of the nuclear Hamiltonian or nuclear energy density functional (EDF). But the additional information comes at a price: compared to the ground state, computation of excited states is more demanding. To establish an efficient framework to compute energy-weighted sum rules of the response that is adaptable to the optimization of the nuclear EDF and large-scale surveys of collective strength, we have developed a new technique within the complex-energy finite-amplitude method (FAM) based on the quasiparticle random-phase approximation. The proposed sum-rule technique based on the complex-energy FAM is a tool of choice when optimizing effective interactions or energy functionals. The method is very efficient and well-adaptable to parallel computing. As a result, the FAM formulation is especially useful when standard theorems based on commutation relations involving the nuclear Hamiltonian and external field cannot be used.
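
    For reference, the energy-weighted moments (sum rules) of the response to an external field F̂ are defined as

$$
m_k \;=\; \sum_{i>0} \left(E_i - E_0\right)^{k} \left| \langle i \,|\, \hat{F} \,|\, 0 \rangle \right|^{2},
$$

    so that m_1 is the classic energy-weighted sum rule; the complex-energy FAM technique of the paper evaluates such moments from the response computed at complex energies rather than by constructing the excited states explicitly.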

  10. A generalized linear integrate-and-fire neural model produces diverse spiking behaviors.

    PubMed

    Mihalaş, Stefan; Niebur, Ernst

    2009-03-01

    For simulations of neural networks, there is a trade-off between the size of the network that can be simulated and the complexity of the model used for individual neurons. In this study, we describe a generalization of the leaky integrate-and-fire model that produces a wide variety of spiking behaviors while still being analytically solvable between firings. For different parameter values, the model produces spiking or bursting, tonic, phasic or adapting responses, depolarizing or hyperpolarizing after potentials and so forth. The model consists of a diagonalizable set of linear differential equations describing the time evolution of membrane potential, a variable threshold, and an arbitrary number of firing-induced currents. Each of these variables is modified by an update rule when the potential reaches threshold. The variables used are intuitive and have biological significance. The model's rich behavior does not come from the differential equations, which are linear, but rather from complex update rules. This single-neuron model can be implemented using algorithms similar to the standard integrate-and-fire model. It is a natural match with event-driven algorithms for which the firing times are obtained as a solution of a polynomial equation.
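
    A simplified, Euler-integrated sketch in the spirit of the model described above, with a single firing-induced current and an adaptive threshold; the parameter values and reset rules are illustrative stand-ins rather than the published ones.

```python
import numpy as np

def glif_sim(I_ext, dt=1e-4, C=1e-9, G=5e-8, E_L=-0.07,
             a=5e-3, b=10.0, theta_inf=-0.05,
             k1=200.0, R1=0.0, A1=5e-10, V_reset=-0.07):
    """Generalized leaky integrate-and-fire neuron with an adaptive threshold
    and one spike-induced current (illustrative parameters, SI units)."""
    V, theta, I1 = E_L, theta_inf, 0.0
    spike_times = []
    for step, I_e in enumerate(I_ext):
        # Linear subthreshold dynamics (explicit Euler step).
        I1 += dt * (-k1 * I1)
        V += dt * (I_e + I1 - G * (V - E_L)) / C
        theta += dt * (a * (V - E_L) - b * (theta - theta_inf))
        if V >= theta:                        # threshold crossing: apply update rules
            spike_times.append(step * dt)
            I1 = R1 * I1 + A1                 # reset/increment the induced current
            V = V_reset                       # reset the membrane potential
            theta = max(theta, theta_inf)     # threshold never drops below its baseline
    return spike_times

# Tonic spiking under a constant 1.5 nA input for 200 ms.
print(len(glif_sim(np.full(2000, 1.5e-9))), "spikes")
```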

  11. A Generalized Linear Integrate-and-Fire Neural Model Produces Diverse Spiking Behaviors

    PubMed Central

    Mihalaş, Ştefan; Niebur, Ernst

    2010-01-01

    For simulations of neural networks, there is a trade-off between the size of the network that can be simulated and the complexity of the model used for individual neurons. In this study, we describe a generalization of the leaky integrate-and-fire model that produces a wide variety of spiking behaviors while still being analytically solvable between firings. For different parameter values, the model produces spiking or bursting, tonic, phasic or adapting responses, depolarizing or hyperpolarizing after potentials and so forth. The model consists of a diagonalizable set of linear differential equations describing the time evolution of membrane potential, a variable threshold, and an arbitrary number of firing-induced currents. Each of these variables is modified by an update rule when the potential reaches threshold. The variables used are intuitive and have biological significance. The model’s rich behavior does not come from the differential equations, which are linear, but rather from complex update rules. This single-neuron model can be implemented using algorithms similar to the standard integrate-and-fire model. It is a natural match with event-driven algorithms for which the firing times are obtained as a solution of a polynomial equation. PMID:18928368

  12. A hybrid gene selection approach for microarray data classification using cellular learning automata and ant colony optimization.

    PubMed

    Vafaee Sharbaf, Fatemeh; Mosafer, Sara; Moattar, Mohammad Hossein

    2016-06-01

    This paper proposes an approach for gene selection in microarray data. The proposed approach consists of a primary filter based on the Fisher criterion, which reduces the initial set of genes and hence the search space and time complexity. Then, a wrapper approach based on cellular learning automata (CLA) optimized with the ant colony method (ACO) is used to find the set of features that improves classification accuracy. CLA is applied due to its capability to learn and model complicated relationships. The features selected in the last phase are evaluated using the ROC curve, and the most effective yet smallest feature subset is determined. The classifiers evaluated in the proposed framework are K-nearest neighbor, support vector machine and naïve Bayes. The proposed approach is evaluated on 4 microarray datasets. The evaluations confirm that the proposed approach can find the smallest subset of genes while approaching the maximum accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
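
    For reference, the Fisher criterion used in such filter-style gene ranking is commonly written for a two-class problem as

$$
F(g) \;=\; \frac{\left(\mu_{g,1} - \mu_{g,2}\right)^{2}}{\sigma_{g,1}^{2} + \sigma_{g,2}^{2}},
$$

    where μ and σ² are the class-conditional mean and variance of gene g; genes with the largest F(g) pass the primary filter (the exact form used in the paper may differ).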

  13. Design and implementation of an efficient single layer five input majority voter gate in quantum-dot cellular automata.

    PubMed

    Bahar, Ali Newaz; Waheed, Sajjad

    2016-01-01

    The fundamental logic element of a quantum-dot cellular automata (QCA) circuit is the majority voter gate (MV), and the efficiency of a QCA circuit depends on the efficiency of its MVs. This paper presents an efficient single-layer five-input majority voter gate (MV5). The structure of the proposed MV5 is simple and easy to incorporate in any logic circuit; it reduces the number of cells and uses conventional QCA cells. Using the MV5, a multilayer 1-bit full adder (FA) is designed. The functional accuracy of the proposed MV5 and FA is confirmed with QCADesigner, a well-known QCA layout design and verification tool. Furthermore, the power dissipation of the proposed circuits is estimated, showing that they dissipate an extremely small amount of energy and are suitable for reversible computing. The simulation outcomes demonstrate the superiority of the proposed circuit.
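
    At the logic level, an MV5 outputs 1 exactly when at least three of its five inputs are 1, and tying inputs specializes it to AND or OR. The snippet below checks that behaviour and the standard majority-gate full-adder identity by exhaustive truth tables; it says nothing about the QCA cell layout, which the paper verifies in QCADesigner.

```python
# Logic-level sketch of a five-input majority voter (MV5) and the standard
# majority-gate construction of a 1-bit full adder. This checks truth tables
# only; the QCA cell layout itself is what tools such as QCADesigner verify.
from itertools import product

def mv3(a, b, c):
    return int(a + b + c >= 2)

def mv5(a, b, c, d, e):
    return int(a + b + c + d + e >= 3)

# Tying inputs specializes the gate: MV5(a,b,c,0,0) = AND(a,b,c) and MV5(a,b,c,1,1) = OR(a,b,c)
assert all(mv5(a, b, c, 0, 0) == (a & b & c) for a, b, c in product((0, 1), repeat=3))
assert all(mv5(a, b, c, 1, 1) == (a | b | c) for a, b, c in product((0, 1), repeat=3))

# Majority-based full adder: carry = MV3(a, b, cin), sum = MV5(a, b, cin, NOT carry, NOT carry)
for a, b, cin in product((0, 1), repeat=3):
    carry = mv3(a, b, cin)
    s = mv5(a, b, cin, 1 - carry, 1 - carry)
    assert a + b + cin == 2 * carry + s
print("MV5 truth table and majority-gate full adder verified")
```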

  14. 38 CFR 20.1 - Rule 1. Purpose and construction of Rules of Practice.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Rule 1. Purpose and construction of Rules of Practice. 20.1 Section 20.1 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) BOARD OF VETERANS' APPEALS: RULES OF PRACTICE General § 20.1 Rule 1. Purpose...

  15. 38 CFR 20.1 - Rule 1. Purpose and construction of Rules of Practice.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2011-07-01 2011-07-01 false Rule 1. Purpose and construction of Rules of Practice. 20.1 Section 20.1 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) BOARD OF VETERANS' APPEALS: RULES OF PRACTICE General § 20.1 Rule 1. Purpose...

  16. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems

    PubMed Central

    Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.

    2013-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887
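
    The toy sketch below illustrates the coarse-graining point: a single binding rule written in terms of local sites implies one reaction for every combination of the molecules' other states, and all implied reactions inherit the rule's rate constant. The molecule names and the ad hoc string notation are invented for the example and do not follow any particular rule language such as BNGL.

```python
# Toy illustration of rule-based coarse graining: one binding rule, written in
# terms of local sites, implies a reaction for every combination of the other
# (here, phosphorylation) states of A and B, and every implied reaction inherits
# the same rate constant. Names and notation are invented for this example.
from itertools import product

A_states = ["A(b,Y~U)", "A(b,Y~P)"]   # A has a free binding site b and a site Y that is U or P
B_states = ["B(a,Z~U)", "B(a,Z~P)"]   # B has a free binding site a and a site Z that is U or P
rule = {"pattern": "A(b) + B(a) -> A(b!1).B(a!1)", "rate": 1.0e6}   # rule ignores Y and Z

reactions = [(a, b, a.replace("b", "b!1") + "." + b.replace("a", "a!1"), rule["rate"])
             for a, b in product(A_states, B_states)]

for lhs_a, lhs_b, bound_complex, k in reactions:
    print(f"{lhs_a} + {lhs_b} -> {bound_complex}    k = {k:g}")
print(len(reactions), "reactions implied by a single rule")
```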

  17. Linear elastic fracture mechanics primer

    NASA Technical Reports Server (NTRS)

    Wilson, Christopher D.

    1992-01-01

    This primer is intended to remove the black-box perception of fracture mechanics computer software by structural engineers. The fundamental concepts of linear elastic fracture mechanics are presented with emphasis on the practical application of fracture mechanics to real problems. Numerous rules of thumb are provided. Recommended texts for additional reading and a discussion of the significance of fracture mechanics in structural design are given. Griffith's criterion for crack extension, Irwin's elastic stress field near the crack tip, and the influence of small-scale plasticity are discussed. Common stress intensity factor solutions and methods for determining them are included. Fracture toughness and subcritical crack growth are discussed. The application of fracture mechanics to damage tolerance and fracture control is discussed. Several example problems and a practice set of problems are given.
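
    For orientation, the standard textbook relations that such a primer builds on can be summarized as follows (written here in generic notation, which may differ from the primer's):

```latex
% Standard linear elastic fracture mechanics relations (generic notation, possibly
% differing from the primer's): the stress intensity factor for a through-crack of
% length 2a in a wide plate under remote tension sigma, and the fracture criterion.
\[
  K_I = \sigma \sqrt{\pi a}, \qquad \text{fracture is predicted when } K_I \ge K_{Ic}.
\]
% Griffith's criterion for an ideally brittle material relates the fracture stress
% to the elastic modulus E and surface energy \gamma_s:
\[
  \sigma_f = \sqrt{\frac{2 E \gamma_s}{\pi a}}.
\]
```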

  18. Strategy as simple rules.

    PubMed

    Eisenhardt, K M; Sull, D N

    2001-01-01

    The success of Yahoo!, eBay, Enron, and other companies that have become adept at morphing to meet the demands of changing markets can't be explained using traditional thinking about competitive strategy. These companies have succeeded by pursuing constantly evolving strategies in market spaces that were considered unattractive according to traditional measures. In this article--the third in an HBR series by Kathleen Eisenhardt and Donald Sull on strategy in the new economy--the authors ask, what are the sources of competitive advantage in high-velocity markets? The secret, they say, is strategy as simple rules. The companies know that the greatest opportunities for competitive advantage lie in market confusion, but they recognize the need for a few crucial strategic processes and a few simple rules. In traditional strategy, advantage comes from exploiting resources or stable market positions. In strategy as simple rules, advantage comes from successfully seizing fleeting opportunities. Key strategic processes, such as product innovation, partnering, or spinout creation, place the company where the flow of opportunities is greatest. Simple rules then provide the guidelines within which managers can pursue such opportunities. Simple rules, which grow out of experience, fall into five broad categories: how-to rules, boundary conditions, priority rules, timing rules, and exit rules. Companies with simple-rules strategies must follow the rules religiously and avoid the temptation to change them too frequently. A consistent strategy helps managers sort through opportunities and gain short-term advantage by exploiting the attractive ones. In stable markets, managers rely on complicated strategies built on detailed predictions of the future. But when business is complicated, strategy should be simple.

  19. Approximate probabilistic cellular automata for the dynamics of single-species populations under discrete logisticlike growth with and without weak Allee effects.

    PubMed

    Mendonça, J Ricardo G; Gevorgyan, Yeva

    2017-05-01

    We investigate one-dimensional elementary probabilistic cellular automata (PCA) whose dynamics in first-order mean-field approximation yields discrete logisticlike growth models for a single-species unstructured population with nonoverlapping generations. Beginning with a general six-parameter model, we find constraints on the transition probabilities of the PCA that guarantee that the ensuing approximations make sense in terms of population dynamics and classify the valid combinations thereof. Several possible models display a negative cubic term that can be interpreted as a weak Allee factor. We also investigate the conditions under which a one-parameter PCA derived from the more general six-parameter model can generate valid population growth dynamics. Numerical simulations illustrate the behavior of some of the PCA found.
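
    The sketch below shows the flavour of the construction with a one-parameter PCA whose first-order mean-field map is quadratic, i.e. logistic-like. The particular transition probability (a cell is occupied at the next generation with probability p when at least one of its two neighbours is occupied) is an illustrative assumption, not one of the models classified in the paper, and the mean-field map neglects the spatial correlations present in the simulation.

```python
# Sketch of a one-parameter probabilistic CA and its first-order mean-field map.
# Transition rule (an illustrative assumption, not the paper's six-parameter family):
# a cell is occupied at the next generation with probability p if at least one of
# its two neighbours is currently occupied, independently of its own state.
import numpy as np

def step(cells, p, rng):
    left, right = np.roll(cells, 1), np.roll(cells, -1)
    neighbour_occupied = (left | right).astype(bool)
    return (neighbour_occupied & (rng.random(cells.size) < p)).astype(np.uint8)

def mean_field(rho, p):
    # rho' = p * (1 - (1 - rho)^2) = 2*p*rho - p*rho**2, a logistic-like map;
    # it neglects the spatial correlations that build up in the simulation
    return p * (1.0 - (1.0 - rho) ** 2)

rng = np.random.default_rng(1)
p, n, T = 0.8, 100_000, 30
cells = (rng.random(n) < 0.05).astype(np.uint8)
rho = 0.05
for _ in range(T):
    cells, rho = step(cells, p, rng), mean_field(rho, p)
print(f"simulated density {cells.mean():.3f}  vs  mean-field prediction {rho:.3f}")
```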

  20. 49 CFR 222.41 - How does this rule affect Pre-Rule Quiet Zones and Pre-Rule Partial Quiet Zones?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-Rule Quiet Zone may be established by automatic approval and remain in effect, subject to § 222.51, if... Zone may be established by automatic approval and remain in effect, subject to § 222.51, if the Pre... 49 Transportation 4 2011-10-01 2011-10-01 false How does this rule affect Pre-Rule Quiet Zones and...

  1. 49 CFR 222.41 - How does this rule affect Pre-Rule Quiet Zones and Pre-Rule Partial Quiet Zones?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...-Rule Quiet Zone may be established by automatic approval and remain in effect, subject to § 222.51, if... Zone may be established by automatic approval and remain in effect, subject to § 222.51, if the Pre... 49 Transportation 4 2013-10-01 2013-10-01 false How does this rule affect Pre-Rule Quiet Zones and...

  2. 49 CFR 222.41 - How does this rule affect Pre-Rule Quiet Zones and Pre-Rule Partial Quiet Zones?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-Rule Quiet Zone may be established by automatic approval and remain in effect, subject to § 222.51, if... Zone may be established by automatic approval and remain in effect, subject to § 222.51, if the Pre... 49 Transportation 4 2010-10-01 2010-10-01 false How does this rule affect Pre-Rule Quiet Zones and...

  3. 49 CFR 222.41 - How does this rule affect Pre-Rule Quiet Zones and Pre-Rule Partial Quiet Zones?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...-Rule Quiet Zone may be established by automatic approval and remain in effect, subject to § 222.51, if... Zone may be established by automatic approval and remain in effect, subject to § 222.51, if the Pre... 49 Transportation 4 2014-10-01 2014-10-01 false How does this rule affect Pre-Rule Quiet Zones and...

  4. 49 CFR 222.41 - How does this rule affect Pre-Rule Quiet Zones and Pre-Rule Partial Quiet Zones?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...-Rule Quiet Zone may be established by automatic approval and remain in effect, subject to § 222.51, if... Zone may be established by automatic approval and remain in effect, subject to § 222.51, if the Pre... 49 Transportation 4 2012-10-01 2012-10-01 false How does this rule affect Pre-Rule Quiet Zones and...

  5. Using the Chain Rule as the Key Link in Deriving the General Rules for Differentiation

    ERIC Educational Resources Information Center

    Sprows, David

    2011-01-01

    The standard approach to the general rules for differentiation is to first derive the power, product, and quotient rules and then derive the chain rule. In this short article we give an approach to these rules which uses the chain rule as the main tool in deriving the power, product, and quotient rules in a manner which is more student-friendly…
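
    One way such a derivation can be arranged (possibly differing in detail from the article's route) is to obtain the product rule from the chain rule applied to squares:

```latex
% Chain rule applied to a square (illustrative; the article's exact route may differ):
\[
  \frac{d}{dx}\,[u(x)]^{2} = 2\,u(x)\,u'(x).
\]
% Writing a product as a difference of squares,
\[
  f g = \tfrac{1}{4}\left[(f+g)^{2} - (f-g)^{2}\right],
\]
% and differentiating each square with the chain rule gives the product rule:
\[
  (fg)' = \tfrac{1}{4}\left[2(f+g)(f'+g') - 2(f-g)(f'-g')\right] = f'g + fg'.
\]
```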

  6. Does the cost function matter in Bayes decision rule?

    PubMed

    Schlüter, Ralf; Nussbaum-Thom, Markus; Ney, Hermann

    2012-02-01

    In many tasks in pattern recognition, such as automatic speech recognition (ASR), optical character recognition (OCR), part-of-speech (POS) tagging, and other string recognition tasks, we are faced with a well-known inconsistency: The Bayes decision rule is usually used to minimize string (symbol sequence) error, whereas, in practice, we want to minimize symbol (word, character, tag, etc.) error. When comparing different recognition systems, we do indeed use symbol error rate as an evaluation measure. The topic of this work is to analyze the relation between string (i.e., 0-1) and symbol error (i.e., metric, integer valued) cost functions in the Bayes decision rule, for which fundamental analytic results are derived. Simple conditions are derived for which the Bayes decision rule with integer-valued metric cost function and with 0-1 cost gives the same decisions or leads to classes with limited cost. The corresponding conditions can be tested with complexity linear in the number of classes. The results obtained do not make any assumption w.r.t. the structure of the underlying distributions or the classification problem. Nevertheless, the general analytic results are analyzed via simulations of string recognition problems with Levenshtein (edit) distance cost function. The results support earlier findings that considerable improvements are to be expected when initial error rates are high.
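
    The toy example below contrasts the two cost functions on a hand-made posterior over word sequences: the 0-1 cost Bayes rule returns the MAP sequence, while the edit-distance cost rule minimizes the expected Levenshtein distance, and the two can disagree. The candidate strings and posterior values are invented for illustration.

```python
# Toy contrast between the Bayes decision rule under 0-1 (sequence-level) cost and
# under a word-level edit-distance cost. The candidate sequences and posterior
# probabilities are invented so that the two rules disagree.
def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

posterior = {                     # hypothetical posterior over word sequences
    "see you later": 0.4,
    "be right back": 0.3,
    "be right there": 0.3,
}

def expected_edit_cost(w):
    return sum(p * levenshtein(w.split(), v.split()) for v, p in posterior.items())

map_decision = max(posterior, key=posterior.get)            # 0-1 cost: pick the MAP sequence
min_risk_decision = min(posterior, key=expected_edit_cost)  # metric cost: minimize expected edits

print("0-1 cost decision:          ", map_decision)
print("edit-distance cost decision:", min_risk_decision)
```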

  7. Does Life Resist Asynchrony?

    NASA Astrophysics Data System (ADS)

    Fatès, Nazim

    Undoubtedly, Conway's Game of Life — or simply Life — is one of the most amazing inventions in the field of cellular automata. Forty years after its discovery, the model still fascinates researchers as if it were an inexhaustible source of puzzles. One of the most intriguing questions is to determine what makes this rule so particular among the quasi-infinite set of rules one can search. In this chapter we analyse how the Game of Life is affected by the presence of two structural perturbations: a change in the synchrony of the updates and a modification of the links between the cells.
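
    A common way to introduce such an asynchrony perturbation is the alpha-asynchronous update, sketched below: each cell applies Conway's rule with probability alpha at every step and otherwise keeps its state (alpha = 1 recovers synchronous Life). The grid size, density and alpha values are arbitrary choices, and the chapter's exact updating protocol may differ.

```python
# Alpha-asynchronous Game of Life: each cell applies Conway's rule with probability
# alpha at every step and keeps its current state otherwise; alpha = 1 recovers the
# usual synchronous Life. Grid size, density and alpha values are arbitrary choices.
import numpy as np

def life_step(grid, alpha, rng):
    # count the eight neighbours with periodic boundary conditions
    n = sum(np.roll(np.roll(grid, i, 0), j, 1)
            for i in (-1, 0, 1) for j in (-1, 0, 1) if (i, j) != (0, 0))
    synchronous = ((n == 3) | ((grid == 1) & (n == 2))).astype(np.uint8)
    update = rng.random(grid.shape) < alpha
    return np.where(update, synchronous, grid)

rng = np.random.default_rng(0)
initial = (rng.random((64, 64)) < 0.35).astype(np.uint8)
for alpha in (1.0, 0.5):
    g = initial.copy()
    for _ in range(200):
        g = life_step(g, alpha, rng)
    print(f"alpha = {alpha}: density after 200 steps = {g.mean():.3f}")
```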

  8. Phase change cellular automata modeling of GeTe, GaSb and SnSe stacked chalcogenide films

    NASA Astrophysics Data System (ADS)

    Mihai, C.; Velea, A.

    2018-06-01

    Data storage needs are increasing at a rapid pace across all economic sectors, so the need for new memory technologies with adequate capabilities is also high. Phase change memories (PCMs) are a leading contender in the emerging race for non-volatile memories due to their fast operation speed, high scalability, good reliability and low power consumption. However, in order to meet the present and future storage demands, PCM technologies must further increase the storage density. Here, we employ a probabilistic cellular automata approach to explore the multi-step threshold switching from the reset (off) to the set (on) state in chalcogenide stacked structures. Simulations have shown that in order to obtain multi-step switching with high contrast among different resistance states, the stacked structure needs to contain materials with a large difference among their crystallization temperatures and careful tuning of strata thicknesses. The crystallization dynamics can be controlled through the external energy pulses applied to the system, in such a way that a balance between nucleation and growth in phase change behavior can be achieved, optimized for PCMs.

  9. Micro-simulation of vehicle conflicts involving right-turn vehicles at signalized intersections based on cellular automata.

    PubMed

    Chai, C; Wong, Y D

    2014-02-01

    At intersections, vehicles coming from different directions conflict with each other. Improper geometric design and signal settings at a signalized intersection increase the occurrence of conflicts between road users and reduce the level of safety. This study established a cellular automata (CA) model to simulate vehicular interactions involving right-turn vehicles (similar to left-turn vehicles in the US). Through various simulation scenarios for four case cross-intersections, the relationships between conflict occurrences involving right-turn vehicles and both traffic volume and right-turn movement control strategies are analyzed. The impacts of traffic volume, permissive right turns compared to a red-amber-green (RAG) arrow, a shared straight-through and right-turn lane, as well as signal settings are estimated from the simulation results. The simulation model is found to provide a reasonable assessment of conflicts, based on comparison with an existing simulation approach and with observed accidents. Through the proposed approach, prediction models for the occurrence and severity of vehicle conflicts can be developed for various geometric layouts and traffic control strategies. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Evaluation of the dispersion effect in through movement bicycles at signalized intersection via cellular automata simulation

    NASA Astrophysics Data System (ADS)

    Jiang, Hang; Ma, Yongjian; Jiang, Lin; Chen, Guozhou; Wang, Dongwei

    2018-05-01

    At signalized intersection areas, bicycle traffic presents a dispersion feature which may influence the movements of vehicles during peak period. The primary objective of this study is to simulate the dispersion effect in through-movement bicycle traffic at intersection areas and evaluate its influence on through-movement traffic. A cellular automata (CA) model is developed and validated to simulate the operations of through-movement bicycle traffic departing from two types of intersection approaches. Simulation results show that bicycles benefit from the dispersion effect when they depart from the approach with an exclusive right-turn vehicle lane. But when bicycles travel from the approach with a shared right-turn and through vehicle lane, the dispersion effect will result in friction interference and block interference on through-movement vehicles. Bicycle interferences reduce the vehicle speed and increase the delay of through-movement vehicles. The policy implications in regard to the dispersion effect from two types of approaches are discussed to improve the performance of through-movement traffic operations at signalized intersections.

  11. Analysis of Architectural Building Design Influences on Fire Spread in Densely Urban Settlement using Cellular Automata

    NASA Astrophysics Data System (ADS)

    Tambunan, L.; Salamah, H.; Asriana, N.

    2017-03-01

    This study aims to determine the influence of architectural design on the risk of fire spread in a dense urban settlement area. Cellular Automata (CA) are used to analyse the fire spread pattern, speed, and extent of damage. Four cell types represent the buildings, streets, and fields characteristic of the simulated area, together with their flammability levels and fire spread capabilities. Two fire scenarios are used to model the spread of fire: (1) fire originating in a building with mostly concrete and wood materials, and (2) fire originating in a building with mostly wood materials. Building shape, building distance, road width, and total area of wall openings are held constant, while wind is ignored. The results show that fire spreads faster in building areas dominated by wood than by concrete. A significant amount of combustible building material, the absence of distance between buildings, narrow streets and limited open fields are factors that influence fire spread speed and pattern, as well as the extent of damage, when fire occurs in a dense urban settlement area.

  12. Rules, culture, and fitness

    PubMed Central

    Baum, William M.

    1995-01-01

    Behavior analysis risks intellectual isolation unless it integrates its explanations with evolutionary theory. Rule-governed behavior is an example of a topic that requires an evolutionary perspective for a full understanding. A rule may be defined as a verbal discriminative stimulus produced by the behavior of a speaker under the stimulus control of a long-term contingency between the behavior and fitness. As a discriminative stimulus, the rule strengthens listener behavior that is reinforced in the short run by socially mediated contingencies, but which also enters into the long-term contingency that enhances the listener's fitness. The long-term contingency constitutes the global context for the speaker's giving the rule. When a rule is said to be “internalized,” the listener's behavior has switched from short- to long-term control. The fitness-enhancing consequences of long-term contingencies are health, resources, relationships, or reproduction. This view ties rules both to evolutionary theory and to culture. Stating a rule is a cultural practice. The practice strengthens, with short-term reinforcement, behavior that usually enhances fitness in the long run. The practice evolves because of its effect on fitness. The standard definition of a rule as a verbal statement that points to a contingency fails to distinguish between a rule and a bargain (“If you'll do X, then I'll do Y”), which signifies only a single short-term contingency that provides mutual reinforcement for speaker and listener. In contrast, the giving and following of a rule (“Dress warmly; it's cold outside”) can be understood only by reference also to a contingency providing long-term enhancement of the listener's fitness or the fitness of the listener's genes. Such a perspective may change the way both behavior analysts and evolutionary biologists think about rule-governed behavior. PMID:22478201

  13. Court Rules - Alaska Court System

    Science.gov Websites

    Alaska Court System website listing of court rule sets (Child in Need of Aid, Civil Procedure, Code of Judicial Conduct, Criminal Procedure, Delinquency), including the rules' standards for issuing summons and warrants, and proposed changes to the CINA/Delinquency Rules: amending CINA Rule 2, adding new CINA Rule 3.1 (consolidation in sibling CINA cases), and a new Delinquency Rule.

  14. Continuous variables logic via coupled automata using a DNAzyme cascade with feedback.

    PubMed

    Lilienthal, S; Klein, M; Orbach, R; Willner, I; Remacle, F; Levine, R D

    2017-03-01

    The concentration of molecules can be changed by chemical reactions and thereby offer a continuous readout. Yet computer architecture is cast in textbooks in terms of binary-valued Boolean variables. To enable reactive chemical systems to compute, we show how, using the Cox interpretation of probability theory, one can transcribe the equations of chemical kinetics as a sequence of coupled logic gates operating on continuous variables. It is discussed how the distinct chemical identity of a molecule allows us to create a common language for chemical kinetics and Boolean logic. Specifically, the logic AND operation is shown to be equivalent to a bimolecular process. The logic XOR operation represents chemical processes that take place concurrently. The values of the rate constants enter the logic scheme as inputs. By designing a reaction scheme with feedback, we endow the logic gates with a built-in memory because their output then depends on the input and also on the present state of the system. Technically such a logic machine is an automaton. We report an experimental realization of three such coupled automata using a DNAzyme multilayer signaling cascade. A simple model verifies analytically that our experimental scheme provides an integrator generating a power series that is third order in time. The model identifies two parameters that govern the kinetics and shows how the initial concentrations of the substrates are the coefficients in the power series.

  15. Resolution of Infinite-Loop in Hyperincursive and Nonlocal Cellular Automata: Introduction to Slime Mold Computing

    NASA Astrophysics Data System (ADS)

    Aono, Masashi; Gunji, Yukio-Pegio

    2004-08-01

    How can non-algorithmic/non-deterministic computational syntax be computed? "The hyperincursive system" introduced by Dubois is an anticipatory system embracing the contradiction/uncertainty. Although it may provide a novel viewpoint for the understanding of complex systems, conventional digital computers cannot run faithfully as the hyperincursive computational syntax specifies, in a strict sense. Then is it an imaginary story? In this paper we try to argue that it is not. We show that a model of complex systems "Elementary Conflictable Cellular Automata (ECCA)" proposed by Aono and Gunji is embracing the hyperincursivity and the nonlocality. ECCA is based on locality-only type settings basically as well as other CA models, and/but at the same time, each cell is required to refer to globality-dominant regularity. Due to this contradictory locality-globality loop, the time evolution equation specifies that the system reaches the deadlock/infinite-loop. However, we show that there is a possibility of the resolution of these problems if the computing system has parallel and/but non-distributed property like an amoeboid organism. This paper is an introduction to "the slime mold computing" that is an attempt to cultivate an unconventional notion of computation.

  16. Validating Cellular Automata Lava Flow Emplacement Algorithms with Standard Benchmarks

    NASA Astrophysics Data System (ADS)

    Richardson, J. A.; Connor, L.; Charbonnier, S. J.; Connor, C.; Gallant, E.

    2015-12-01

    A major existing need in assessing lava flow simulators is a common set of validation benchmark tests. We propose three levels of benchmarks which test model output against increasingly complex standards. First, simulated lava flows should be morphologically identical, given changes in parameter space that should be inconsequential, such as slope direction. Second, lava flows simulated in simple parameter spaces can be tested against analytical solutions or empirical relationships seen in Bingham fluids. For instance, a lava flow simulated on a flat surface should produce a circular outline. Third, lava flows simulated over real world topography can be compared to recent real world lava flows, such as those at Tolbachik, Russia, and Fogo, Cape Verde. Success or failure of emplacement algorithms in these validation benchmarks can be determined using a Bayesian approach, which directly tests the ability of an emplacement algorithm to correctly forecast lava inundation. Here we focus on two posterior metrics, P(A|B) and P(¬A|¬B), which describe the positive and negative predictive value of flow algorithms. This is an improvement on less direct statistics such as model sensitivity and the Jaccard fitness coefficient. We have performed these validation benchmarks on a new, modular lava flow emplacement simulator that we have developed. This simulator, which we call MOLASSES, follows a Cellular Automata (CA) method. The code is developed in several interchangeable modules, which enables quick modification of the distribution algorithm from cell locations to their neighbors. By assessing several different distribution schemes with the benchmark tests, we have improved the performance of MOLASSES to correctly match early stages of the 2012-3 Tolbachik Flow, Kamchatka, Russia, to 80%. We also can evaluate model performance given uncertain input parameters using a Monte Carlo setup. This illuminates sensitivity to model uncertainty.

  17. Bennett clocking of quantum-dot cellular automata and the limits to binary logic scaling.

    PubMed

    Lent, Craig S; Liu, Mo; Lu, Yuhui

    2006-08-28

    We examine power dissipation in different clocking schemes for molecular quantum-dot cellular automata (QCA) circuits. 'Landauer clocking' involves the adiabatic transition of a molecular cell from the null state to an active state carrying data. Cell layout creates devices which allow data in cells to interact and thereby perform useful computation. We perform direct solutions of the equation of motion for the system in contact with the thermal environment and see that Landauer's Principle applies: one must dissipate an energy of at least kBT per bit only when the information is erased. The ideas of Bennett can be applied to keep copies of the bit information by echoing inputs to outputs, thus embedding any logically irreversible circuit in a logically reversible circuit, at the cost of added circuit complexity. A promising alternative which we term 'Bennett clocking' requires only altering the timing of the clocking signals so that bit information is simply held in place by the clock until a computational block is complete, then erased in the reverse order of computation. This approach results in ultralow power dissipation without additional circuit complexity. These results offer a concrete example in which to consider recent claims regarding the fundamental limits of binary logic scaling.

  18. Modelling land use/cover changes with markov-cellular automata in Komering Watershed, South Sumatera

    NASA Astrophysics Data System (ADS)

    Kusratmoko, E.; Albertus, S. D. Y.; Supriatna

    2017-01-01

    This research aims to develop a model that can represent and simulate the spatial distribution pattern of land use change in the Komering watershed. The Komering watershed is one of nine sub-basins of the Musi river basin, is located in the southern part of Sumatra island, and has an area of 8060.62 km2. Land use change simulations were achieved through Markov-cellular automata (CA) methodologies. Slope, elevation, distance from roads, distance from rivers, distance from sub-district capitals, and distance from settlement areas were the driving factors used in this research. The land use prediction for 2030 shows forest acreage decreasing by up to 3.37%, agricultural land by up to 2.13%, and open land by up to 0.13%. On the other hand, settlement area increases by up to 0.07% and plantation land by up to 5.56%. Based on the predictive result, land use unconformity with the RTRW in the Komering watershed is 18.62% and land use conformity is 58.27%. Under a scenario in which forest in protected areas and agricultural land are maintained, land use conformity increases to 60.41% and unconformity in the Komering watershed is reduced to 17.23%.

  19. Bennett clocking of quantum-dot cellular automata and the limits to binary logic scaling

    NASA Astrophysics Data System (ADS)

    Lent, Craig S.; Liu, Mo; Lu, Yuhui

    2006-08-01

    We examine power dissipation in different clocking schemes for molecular quantum-dot cellular automata (QCA) circuits. 'Landauer clocking' involves the adiabatic transition of a molecular cell from the null state to an active state carrying data. Cell layout creates devices which allow data in cells to interact and thereby perform useful computation. We perform direct solutions of the equation of motion for the system in contact with the thermal environment and see that Landauer's Principle applies: one must dissipate an energy of at least kBT per bit only when the information is erased. The ideas of Bennett can be applied to keep copies of the bit information by echoing inputs to outputs, thus embedding any logically irreversible circuit in a logically reversible circuit, at the cost of added circuit complexity. A promising alternative which we term 'Bennett clocking' requires only altering the timing of the clocking signals so that bit information is simply held in place by the clock until a computational block is complete, then erased in the reverse order of computation. This approach results in ultralow power dissipation without additional circuit complexity. These results offer a concrete example in which to consider recent claims regarding the fundamental limits of binary logic scaling.

  20. A novel power-efficient high-speed clock management unit using quantum-dot cellular automata

    NASA Astrophysics Data System (ADS)

    Abutaleb, M. M.

    2017-04-01

    Quantum-dot cellular automata (QCA) is one of the most attractive alternatives for complementary metal-oxide semiconductor technology. QCA widely supports a new paradigm in the field of nanotechnology that has the potential for high density, low power, and high speed. The clock manager is an essential building block in new microwave and radio frequency integrated circuits. This paper describes a novel QCA-based clock management unit (CMU) that provides innovative clocking capabilities. The proposed CMU is achieved by utilizing edge-triggered D-type flip-flops (D-FFs) in the design of the frequency synthesizer and phase splitter. The edge-triggered D-FF structures proposed in this paper are successfully implemented and simulated in QCA with the least complexity and power dissipation compared to earlier structures. The frequency synthesizer is used to generate new clock frequencies from the reference clock frequency based on a combination of power-of-two frequency dividers. The phase splitter is integrated with the frequency synthesizer to generate four clock signals that are 90° out of phase with each other. This paper demonstrates that the proposed QCA CMU structure has superior performance. Furthermore, the proposed CMU is straightforwardly scalable due to the use of a modular component architecture.

  1. Stressed out and overcommitted! The relationships between time demands and family rules and parents’ and their child’s weight status

    PubMed Central

    Hearst, Mary O.; Sevcik, Sarah; Fulkerson, Jayne A.; Pasch, Keryn E.; Harnack, Lisa J.; Lytle, Leslie A.

    2013-01-01

    Objective To determine the relationship between parent time demands and presence and enforcement of family rules and parent/child dyad weight status. Methods Dyads of one child/parent per family (n=681 dyads), Twin Cities, Minnesota, 2007–2008 had measured height/weight and a survey of demographics, time demands and family rules-related questions. Parent/child dyads were classified into four healthy weight/overweight categories. Multivariate linear associations were analyzed with SAS, testing for interaction by work status and family composition (p<0.10). Results In adjusted models, lack of family rules and difficulty with rule enforcement were statistically lower in dyads in which the parent/child were both healthy weight compared to dyads in which the parent/child were both overweight (Difference in family rules scores=0.49, p=0.03; difference in rule enforcement scores=1.09, p<0.01). Of parents who worked full-time, healthy weight dyads reported lower time demands than other dyads (Difference in time demands scores=1.44, p=0.01). Conclusions Family experiences of time demands and use of family rules are related to the weight status of parents and children within families. PMID:22228775

  2. Modulation/demodulation techniques for satellite communications. Part 2: Advanced techniques. The linear channel

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1982-01-01

    A theory is presented for deducing and predicting the performance of transmitter/receivers for bandwidth efficient modulations suitable for use on the linear satellite channel. The underlying principle used is the development of receiver structures based on the maximum-likelihood decision rule. The performance prediction tools, e.g., channel cutoff rate and bit error probability transfer function bounds, are applied to these modulation/demodulation techniques.

  3. Requiem for the max rule?

    PubMed Central

    Ma, Wei Ji; Shen, Shan; Dziugaite, Gintare; van den Berg, Ronald

    2015-01-01

    In tasks such as visual search and change detection, a key question is how observers integrate noisy measurements from multiple locations to make a decision. Decision rules proposed to model this process have fallen into two categories: Bayes-optimal (ideal observer) rules and ad-hoc rules. Among the latter, the maximum-of-outputs (max) rule has been most prominent. Reviewing recent work and performing new model comparisons across a range of paradigms, we find that in all cases except for one, the optimal rule describes human data as well as or better than every max rule either previously proposed or newly introduced here. This casts doubt on the utility of the max rule for understanding perceptual decision-making. PMID:25584425

  4. Requiem for the max rule?

    PubMed

    Ma, Wei Ji; Shen, Shan; Dziugaite, Gintare; van den Berg, Ronald

    2015-11-01

    In tasks such as visual search and change detection, a key question is how observers integrate noisy measurements from multiple locations to make a decision. Decision rules proposed to model this process have fallen into two categories: Bayes-optimal (ideal observer) rules and ad-hoc rules. Among the latter, the maximum-of-outputs (max) rule has been the most prominent. Reviewing recent work and performing new model comparisons across a range of paradigms, we find that in all cases except for one, the optimal rule describes human data as well as or better than every max rule either previously proposed or newly introduced here. This casts doubt on the utility of the max rule for understanding perceptual decision-making. Copyright © 2015 Elsevier Ltd. All rights reserved.
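
    The sketch below is a toy version of this comparison for detecting a single target of known strength among N locations corrupted by unit-variance Gaussian noise: the max rule thresholds the largest measurement, whereas the Bayes-optimal rule thresholds the average likelihood ratio across locations. The generative assumptions (equal priors, known target strength) are made up for illustration and are far simpler than the paradigms reviewed in the paper.

```python
# Toy comparison of the max rule and the Bayes-optimal rule for detecting one target
# of known strength d among N locations with unit-variance Gaussian noise and equal
# priors. The generative assumptions are invented and much simpler than the reviewed paradigms.
import numpy as np

rng = np.random.default_rng(0)
N, d, trials = 4, 1.5, 200_000

present = rng.integers(0, 2, trials).astype(bool)
x = rng.normal(size=(trials, N))
loc = rng.integers(0, N, trials)
x[np.arange(trials), loc] += d * present          # add the target to one location when present

# Bayes-optimal rule: respond "present" when the average likelihood ratio exceeds 1
avg_lr = np.exp(d * x - d ** 2 / 2).mean(axis=1)
optimal_correct = ((avg_lr > 1) == present).mean()

# Max rule: respond "present" when the largest measurement exceeds a criterion;
# the criterion is scanned and the best one kept, which favours the max rule
best_max_correct = max(((x.max(axis=1) > c) == present).mean()
                       for c in np.linspace(-1.0, 4.0, 101))

print(f"optimal rule: {optimal_correct:.3f} correct,  best max rule: {best_max_correct:.3f} correct")
```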

  5. Rule groupings in expert systems using nearest neighbour decision rules, and convex hulls

    NASA Technical Reports Server (NTRS)

    Anastasiadis, Stergios

    1991-01-01

    Expert system shells are lacking in many areas of software engineering. Large rule-based systems are not semantically comprehensible, are difficult to debug, and are impossible to modify or validate. Partitioning a set of CLIPS (C Language Integrated Production System) rules into groups that reflect the underlying semantic subdomains of the problem adequately addresses these concerns. Techniques are introduced to structure a CLIPS rule base into groups of rules that inherently share common semantic information. The concepts involved are imported from the fields of AI, pattern recognition, and statistical inference. The techniques focus on feature selection, classification, and a criterion, based on Bayesian decision theory, for how 'good' a classification technique is. A variety of distance metrics for measuring the 'closeness' of CLIPS rules are discussed, and various nearest neighbor classification algorithms based on these metrics are described.

  6. Novice Rules for Projectile Motion.

    ERIC Educational Resources Information Center

    Maloney, David P.

    1988-01-01

    Investigates several aspects of undergraduate students' rules for projectile motion including general patterns; rules for questions about time, distance, solids and liquids; and changes in rules when asked to ignore air resistance. Reports approach differences by sex and high school physics experience, and that novice rules are situation…

  7. WellnessRules: A Web 3.0 Case Study in RuleML-Based Prolog-N3 Profile Interoperation

    NASA Astrophysics Data System (ADS)

    Boley, Harold; Osmun, Taylor Michael; Craig, Benjamin Larry

    An interoperation study, WellnessRules, is described, where rules about wellness opportunities are created by participants in rule languages such as Prolog and N3, and translated within a wellness community using RuleML/XML. The wellness rules are centered around participants, as profiles, encoding knowledge about their activities conditional on the season, the time-of-day, the weather, etc. This distributed knowledge base extends FOAF profiles with a vocabulary and rules about wellness group networking. The communication between participants is organized through Rule Responder, permitting wellness-profile translation and distributed querying across engines. WellnessRules interoperates between rules and queries in the relational (Datalog) paradigm of the pure-Prolog subset of POSL and in the frame (F-logic) paradigm of N3. An evaluation of Rule Responder instantiated for WellnessRules revealed acceptable Web response times.

  8. Rule-governed behavior: teaching a preliminary repertoire of rule-following to children with autism.

    PubMed

    Tarbox, Jonathan; Zuckerman, Carrie K; Bishop, Michele R; Olive, Melissa L; O'Hora, Denis P

    2011-01-01

    Rule-governed behavior is generally considered an integral component of complex verbal repertoires but has rarely been the subject of empirical research. In particular, little or no previous research has attempted to establish rule-governed behavior in individuals who do not already display the repertoire. This study consists of two experiments that evaluated multiple exemplar training procedures for teaching a simple component skill, which may be necessary for developing a repertoire of rule-governed behavior. In both experiments, children with autism were taught to respond to simple rules that specified antecedents and the behaviors that should occur in their presence. In the first study, participants were taught to respond to rules containing "if/then" statements, where the antecedent was specified before the behavior. The second experiment was a replication and extension of the first. It involved a variation on the manner in which rules were presented. Both experiments eventually demonstrated generalization to novel rules for all participants; however variations to the standard procedure were required for several participants. Results suggest that rule-following can be analyzed and taught as generalized operant behavior and implications for future research are discussed.

  9. Learning of Rule Ensembles for Multiple Attribute Ranking Problems

    NASA Astrophysics Data System (ADS)

    Dembczyński, Krzysztof; Kotłowski, Wojciech; Słowiński, Roman; Szeląg, Marcin

    In this paper, we consider the multiple attribute ranking problem from a Machine Learning perspective. We propose two approaches to statistical learning of an ensemble of decision rules from decision examples provided by the Decision Maker in terms of pairwise comparisons of some objects. The first approach consists in learning a preference function defining a binary preference relation for a pair of objects. The result of application of this function on all pairs of objects to be ranked is then exploited using the Net Flow Score procedure, giving a linear ranking of objects. The second approach consists in learning a utility function for single objects. The utility function also gives a linear ranking of objects. In both approaches, the learning is based on the boosting technique. The presented approaches to Preference Learning share good properties of the decision rule preference model and have good performance in the massive-data learning problems. As Preference Learning and Multiple Attribute Decision Aiding share many concepts and methodological issues, in the introduction, we review some aspects bridging these two fields. To illustrate the two approaches proposed in this paper, we solve with them a toy example concerning the ranking of a set of cars evaluated by multiple attributes. Then, we perform a large data experiment on real data sets. The first data set concerns credit rating. Since recent research in the field of Preference Learning is motivated by the increasing role of modeling preferences in recommender systems and information retrieval, we chose two other massive data sets from this area - one comes from movie recommender system MovieLens, and the other concerns ranking of text documents from 20 Newsgroups data set.
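
    The Net Flow Score exploitation step mentioned above can be sketched as follows: given a (learned) pairwise preference function, each object is scored by how strongly it is preferred to the others minus how strongly they are preferred to it, and the objects are ranked by decreasing score. The preference values in the example are invented, standing in for the output of the boosted rule ensemble.

```python
# Net Flow Score exploitation step: given a (learned) pairwise preference function
# Pref(a, b) in [0, 1], score each object by sum_b [Pref(a, b) - Pref(b, a)] and rank
# by decreasing score. The preference values below are invented, standing in for the
# output of a learned rule ensemble.
objects = ["car_A", "car_B", "car_C", "car_D"]
pref = {
    ("car_A", "car_B"): 0.8, ("car_B", "car_A"): 0.2,
    ("car_A", "car_C"): 0.6, ("car_C", "car_A"): 0.4,
    ("car_A", "car_D"): 0.3, ("car_D", "car_A"): 0.7,
    ("car_B", "car_C"): 0.5, ("car_C", "car_B"): 0.5,
    ("car_B", "car_D"): 0.4, ("car_D", "car_B"): 0.6,
    ("car_C", "car_D"): 0.9, ("car_D", "car_C"): 0.1,
}

def net_flow_score(a):
    return sum(pref[(a, b)] - pref[(b, a)] for b in objects if b != a)

for obj in sorted(objects, key=net_flow_score, reverse=True):
    print(f"{obj}: net flow score = {net_flow_score(obj):+.2f}")
```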

  10. Algorithmic Trading with Developmental and Linear Genetic Programming

    NASA Astrophysics Data System (ADS)

    Wilson, Garnett; Banzhaf, Wolfgang

    A developmental co-evolutionary genetic programming approach (PAM DGP) and a standard linear genetic programming (LGP) stock trading system are applied to a number of stocks across market sectors. Both GP techniques were found to be robust to market fluctuations and reactive to opportunities associated with stock price rise and fall, with PAM DGP generating notably greater profit in some stock trend scenarios. Both algorithms were very accurate at buying to achieve profit and selling to protect assets, while exhibiting both moderate trading activity and the ability to maximize or minimize investment as appropriate. The content of the trading rules produced by both algorithms is also examined in relation to stock price trend scenarios.

  11. Classifying elementary cellular automata using compressibility, diversity and sensitivity measures

    NASA Astrophysics Data System (ADS)

    Ninagawa, Shigeru; Adamatzky, Andrew

    2014-10-01

    An elementary cellular automaton (ECA) is a one-dimensional, synchronous, binary automaton, where each cell update depends on its own state and states of its two closest neighbors. We attempt to uncover correlations between the following measures of ECA behavior: compressibility, sensitivity and diversity. The compressibility of ECA configurations is calculated using the Lempel-Ziv (LZ) compression algorithm LZ78. The sensitivity of ECA rules to initial conditions and perturbations is evaluated using Derrida coefficients. The generative morphological diversity shows how many different neighborhood states are produced from a single nonquiescent cell. We found no significant correlation between sensitivity and compressibility. There is a substantial correlation between generative diversity and compressibility. Using sensitivity, compressibility and diversity, we uncover and characterize novel groupings of rules.
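
    A miniature version of the compressibility measurement is sketched below: evolve an ECA from a random initial condition and count LZ78 dictionary phrases in the final configuration, fewer phrases indicating a more compressible (more ordered) pattern. The lattice size, run length and the use of a raw phrase count rather than a normalized compression ratio are simplifications relative to the paper.

```python
# Miniature version of the compressibility measurement: evolve an elementary CA from
# a random initial condition and count LZ78 phrases in the final configuration; fewer
# phrases means a more compressible (more ordered) pattern. Lattice size, run length
# and the raw phrase count (instead of a normalized ratio) are simplifications.
import random

def eca_step(cells, rule):
    table = [(rule >> i) & 1 for i in range(8)]   # Wolfram numbering of neighbourhoods
    n = len(cells)
    return [table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)]

def lz78_phrase_count(bits):
    dictionary, phrase, count = set(), "", 0
    for b in bits:
        phrase += str(b)
        if phrase not in dictionary:
            dictionary.add(phrase)
            count += 1
            phrase = ""
    return count + (1 if phrase else 0)

random.seed(0)
init = [random.randint(0, 1) for _ in range(2000)]
for rule in (0, 90, 110, 30):                     # quiescent, additive, complex, chaotic examples
    cells = init[:]
    for _ in range(500):
        cells = eca_step(cells, rule)
    print(f"rule {rule:3d}: LZ78 phrases in final row = {lz78_phrase_count(cells)}")
```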

  12. Analysis of Rules for Islamic Inheritance Law in Indonesia Using Hybrid Rule Based Learning

    NASA Astrophysics Data System (ADS)

    Khosyi'ah, S.; Irfan, M.; Maylawati, D. S.; Mukhlas, O. S.

    2018-01-01

    Along with the development of human civilization in Indonesia, changes and reforms of Islamic inheritance law to conform to local conditions and culture cannot be denied. The distribution of inheritance in Indonesia can be carried out automatically by storing the rules of Islamic inheritance law in an expert system. In this study, we analyze the knowledge of experts in Islamic inheritance in Indonesia and represent it in the form of rules using rule-based Forward Chaining (FC) and Davis-Putnam-Logemann-Loveland (DPLL) algorithms. By hybridizing the FC and DPLL algorithms, the rules of Islamic inheritance law in Indonesia are clearly defined and measured. The rules were conceptually validated by experts in Islamic law and informatics. The results revealed that, in general, all rules were ready for use in an expert system.

  13. 29 CFR 2200.2 - Scope of rules; applicability of Federal Rules of Civil Procedure; construction.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...; applicability of Federal Rules of Civil Procedure; construction. (a) Scope. These rules shall govern all proceedings before the Commission and its Judges. (b) Applicability of Federal Rules of Civil Procedure. In the absence of a specific provision, procedure shall be in accordance with the Federal Rules of Civil...

  14. 29 CFR 2200.2 - Scope of rules; applicability of Federal Rules of Civil Procedure; construction.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...; applicability of Federal Rules of Civil Procedure; construction. (a) Scope. These rules shall govern all proceedings before the Commission and its Judges. (b) Applicability of Federal Rules of Civil Procedure. In the absence of a specific provision, procedure shall be in accordance with the Federal Rules of Civil...

  15. A Better Budget Rule

    ERIC Educational Resources Information Center

    Dothan, Michael; Thompson, Fred

    2009-01-01

    Debt limits, interest coverage ratios, one-off balanced budget requirements, pay-as-you-go rules, and tax and expenditure limits are among the most important fiscal rules for constraining intertemporal transfers. There is considerable evidence that the least costly and most effective of such rules are those that focus directly on the rate of…

  16. Challenges for Rule Systems on the Web

    NASA Astrophysics Data System (ADS)

    Hu, Yuh-Jong; Yeh, Ching-Long; Laun, Wolfgang

    The RuleML Challenge started in 2007 with the objective of highlighting implementation issues in the management, integration, interoperation and interchange of rules in an open distributed environment, such as the Web. Rules are usually classified into three types: deductive rules, normative rules, and reactive rules. Reactive rules are further classified as ECA rules and production rules. The study of combining rules and ontologies traces back to earlier active rule systems for relational and object-oriented (OO) databases. Recently, this issue has become one of the most important research problems in the Semantic Web. Once we consider a computer-executable policy as a declarative set of rules and ontologies that guides the behavior of entities within a system, we have a flexible way to implement real-world policies without rewriting the computer code, as we did before. Fortunately, we have de facto rule markup languages, such as RuleML or RIF, to achieve the portability and interchange of rules between different rule systems. Otherwise, executing real-life rule-based applications on the Web is almost impossible. Several commercial or open-source rule engines are available for rule-based applications. However, we still need a standard rule language and benchmark, not only to compare rule systems but also to measure progress in the field. Finally, a number of real-life rule-based use cases are investigated to demonstrate the applicability of current rule systems on the Web.

  17. Scarp degraded by linear diffusion: inverse solution for age.

    USGS Publications Warehouse

    Andrews, D.J.; Hanks, T.C.

    1985-01-01

    Under the assumption that landforms unaffected by drainage channels are degraded according to the linear diffusion equation, a procedure is developed to invert a scarp profile to find its 'diffusion age'. The inverse procedure applied to synthetic data yields the following rules of thumb. Evidence of initial scarp shape has been lost when apparent age reaches twice its initial value. A scarp that appears to have been formed by one event may have been formed by two with an interval between them as large as apparent age. The simplicity of scarp profile measurement and this inversion makes profile analysis attractive. -from Authors
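
    The forward model behind such an inversion is the classical linear-diffusion solution sketched below: an initially steep scarp of half-height H relaxes toward an error-function profile, and only the product of diffusivity and time (the 'diffusion age' kt) can be recovered from the shape, for instance from the maximum slope at the scarp midpoint. The code is a generic illustration of this standard result, not the authors' inversion procedure, and the optional far-field slope term b is a simplification.

```python
# Forward model underlying the inversion (the standard linear-diffusion result, not the
# authors' code): a scarp that starts as a step of total height 2*H relaxes toward an
# error-function profile, and only the product kappa*t (the "diffusion age" kt) can be
# recovered from the profile shape; b is an optional far-field slope.
import math

def scarp_elevation(x, kt, H=1.0, b=0.0):
    """Elevation at horizontal distance x from the scarp midpoint after diffusion age kt."""
    return H * math.erf(x / (2.0 * math.sqrt(kt))) + b * x

def midpoint_slope(kt, H=1.0, b=0.0):
    # slope at x = 0 is H / sqrt(pi * kt) + b, so a measured maximum slope can be
    # inverted directly for the diffusion age kt
    return H / math.sqrt(math.pi * kt) + b

for kt in (1.0, 10.0, 100.0):                     # arbitrary units of length squared
    profile = [round(scarp_elevation(x, kt), 3) for x in (-4, -2, 0, 2, 4)]
    print(f"kt = {kt:6.1f}   midpoint slope = {midpoint_slope(kt):.3f}   profile = {profile}")
```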

  18. Communicating rules in recreation areas

    Treesearch

    Terence L. Ross; George H. Moeller

    1974-01-01

    Five hundred fifty-eight campers were surveyed on the Allegheny National Forest to determine their knowledge of rules governing recreation behavior. Most of them were uninformed about the rules. Results of the study suggest that previous camping experience, age, camping style, and residence significantly affect knowledge of rules. Campers who received rule brochures or...

  19. Project RAMA: Reconstructing Asteroids Into Mechanical Automata

    NASA Technical Reports Server (NTRS)

    Dunn, Jason; Fagin, Max; Snyder, Michael; Joyce, Eric

    2017-01-01

    Many interesting ideas have been conceived for building space-based infrastructure in cislunar space. From O'Neill's space colonies, to solar power satellite farms, and even prospecting retrieved near earth asteroids. In all the scenarios, one thing remained fixed - the need for space resources at the outpost. To satisfy this need, O'Neill suggested an electromagnetic railgun to deliver resources from the lunar surface, while NASA's Asteroid Redirect Mission called for a solar electric tug to deliver asteroid materials from interplanetary space. At Made In Space, we propose an entirely new concept. One which is scalable, cost effective, and ensures that the abundant material wealth of the inner solar system becomes readily available to humankind in a nearly automated fashion. We propose the RAMA architecture, which turns asteroids into self-contained spacecraft capable of moving themselves back to cislunar space. The RAMA architecture is just as capable of transporting conventional-sized asteroids on the 10-meter length scale as transporting asteroids 100 meters or larger, making it the most versatile asteroid retrieval architecture in terms of retrieved-mass capability. This report describes the results of the Phase I study funded by the NASA NIAC program for Made In Space to establish the concept feasibility of using space manufacturing to convert asteroids into autonomous, mechanical spacecraft. Project RAMA, Reconstituting Asteroids into Mechanical Automata, is designed to leverage the future advances of additive manufacturing (AM), in-situ resource utilization (ISRU) and in-situ manufacturing (ISM) to realize enormous efficiencies in repeated asteroid redirect missions. A team of engineers at Made In Space performed the study work with consultation from the asteroid mining industry, academia, and NASA. Previous studies for asteroid retrieval have been constrained to studying only asteroids that are both large enough to be discovered, and small enough to be

  20. Effect of antimony (Sb) addition on the linear and non-linear optical properties of amorphous Ge-Te-Sb thin films

    NASA Astrophysics Data System (ADS)

    Kumar, P.; Kaur, J.; Tripathi, S. K.; Sharma, I.

    2017-12-01

    Non-crystalline thin films of Ge20Te80-xSbx (x = 0, 2, 4, 6, 10) systems were deposited on glass substrates using the thermal evaporation technique. The optical coefficients were accurately determined from transmission spectra using the Swanepoel envelope method in the spectral region of 400-1600 nm. The refractive index was found to increase from 2.38 to 2.62 with the corresponding increase in Sb content over the entire spectral range. The dispersion of the refractive index was discussed in terms of the single-oscillator Wemple-DiDomenico model. The Tauc relation for the allowed indirect transition showed a decrease in the optical band gap. To explore non-linearity, the spectral dependence of the third-order susceptibility of a-Ge-Te-Sb thin films was evaluated from the change of the index of refraction using Miller's rule. Susceptibility values were found to increase rapidly from 10^-13 to 10^-12 esu, with the red shift in the absorption edge. The non-linear refractive index was calculated by the Fourier and Snitzer formula. The values were of the order of 10^-12 esu. At the telecommunication wavelength, these non-linear refractive index values are three orders of magnitude higher than that of silica glass. The dielectric constant and optical conductivity are also reported. The prepared Sb-doped thin films on glass substrates, with their observed improved functional properties, show promise for application in nonlinear optical devices and might be used for high-speed communication fibers. The non-linear parameters showed good agreement with the values given in the literature.
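
    The generalized Miller relation typically used in such estimates can be written as below; the numerical constant A and the esu unit conventions are quoted from common practice in the chalcogenide literature and may differ from the exact expressions the authors use.

```latex
% Generalized Miller relation in the form commonly applied to chalcogenide films
% (the constant A and the esu conventions are quoted from common practice and may
% differ from the authors' exact expressions):
\[
  \chi^{(1)} = \frac{n^{2} - 1}{4\pi}, \qquad
  \chi^{(3)} = A\,\bigl(\chi^{(1)}\bigr)^{4}, \quad A \approx 1.7\times 10^{-10}\ \mathrm{esu},
\]
% with the non-linear refractive index then estimated from
\[
  n_2 = \frac{12\pi\,\chi^{(3)}}{n}.
\]
```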

  1. Tests of ecogeographical relationships in a non-native species: what rules avian morphology?

    PubMed

    Cardilini, Adam P A; Buchanan, Katherine L; Sherman, Craig D H; Cassey, Phillip; Symonds, Matthew R E

    2016-07-01

    The capacity of non-native species to undergo rapid adaptive change provides opportunities to research contemporary evolution through natural experiments. This capacity is particularly true when considering ecogeographical rules, to which non-native species have been shown to conform within relatively short periods of time. Ecogeographical rules explain predictable spatial patterns of morphology, physiology, life history and behaviour. We tested whether Australian populations of non-native starling, Sturnus vulgaris, introduced to the country approximately 150 years ago, exhibited predicted environmental clines in body size, appendage size and heart size (Bergmann's, Allen's and Hesse's rules, respectively). Adult starlings (n = 411) were collected from 28 localities from across eastern Australia from 2011 to 2012. Linear models were constructed to examine the relationships between morphology and local environment. Patterns of variation in body mass and bill surface area were consistent with Bergmann's and Allen's rules, respectively (small body size and larger bill size in warmer climates), with maximum summer temperature being a strongly weighted predictor of both variables. In the only intraspecific test of Hesse's rule in birds to date, we found no evidence to support the idea that relative heart size will be larger in individuals which live in colder climates. Our study does provide evidence that maximum temperature is a strong driver of morphological adaptation for starlings in Australia. The changes in morphology presented here demonstrate the potential for avian species to make rapid adaptive changes in relation to a changing climate to ameliorate the effects of heat stress.

  2. Organisational Rules in Schools: Teachers' Opinions about Functions of Rules, Rule-Following and Breaking Behaviours in Relation to Their Locus of Control

    ERIC Educational Resources Information Center

    Demirkasimoglu, Nihan; Aydin, Inayet; Erdogan, Cetin; Akin, Ugur

    2012-01-01

    The main aim of this research is to examine teachers' opinions about functions of school rules, reasons for rule-breaking and results of rule-breaking in relation to their locus of control, gender, age, seniority and branch. 350 public elementary school teachers in Ankara are included in the correlational survey model study. According to the…

  3. CATS - A process-based model for turbulent turbidite systems at the reservoir scale

    NASA Astrophysics Data System (ADS)

    Teles, Vanessa; Chauveau, Benoît; Joseph, Philippe; Weill, Pierre; Maktouf, Fakher

    2016-09-01

    The Cellular Automata for Turbidite systems (CATS) model is intended to simulate the fine architecture and facies distribution of turbidite reservoirs with a multi-event and process-based approach. The main processes of low-density turbulent turbidity flow are modeled: downslope sediment-laden flow, entrainment of ambient water, and erosion and deposition of several distinct lithologies. This numerical model, derived from Salles (2006) and Salles et al. (2007), proposes a new approach based on the Rouse concentration profile to account for the flow capacity to carry the sediment load in suspension. In CATS, the flow distribution on a given topography is modeled with local rules between neighboring cells (cellular automata) based on potential and kinetic energy balance and diffusion concepts. Input parameters are the initial flow parameters and a 3D topography at depositional time. An overview of CATS capabilities in different contexts is presented and discussed.
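
    The local flow-distribution idea can be pictured with a deliberately simplified cellular-automaton sketch: each cell passes part of its flow to lower neighbours in proportion to the drop of the free surface. This is only a toy stand-in for the energy-balance and Rouse-profile rules of CATS; the grid, the retention fraction frac, and the function names are invented for illustration.

      import numpy as np

      def spread_flow(topo, flow, frac=0.5):
          """One toy CA step: each cell keeps a fraction (1 - frac) of its flow
          and sends the rest to lower von Neumann neighbours, split in
          proportion to the drop of the free surface (topo + flow)."""
          h = topo + flow
          new_flow = flow * (1.0 - frac)
          nx, ny = topo.shape
          for i in range(nx):
              for j in range(ny):
                  drops = {}
                  for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                      ni, nj = i + di, j + dj
                      if 0 <= ni < nx and 0 <= nj < ny and h[i, j] > h[ni, nj]:
                          drops[(ni, nj)] = h[i, j] - h[ni, nj]
                  total = sum(drops.values())
                  if total > 0:
                      for (ni, nj), d in drops.items():
                          new_flow[ni, nj] += frac * flow[i, j] * d / total
                  else:
                      new_flow[i, j] += frac * flow[i, j]  # local pit: nothing leaves
          return new_flow

      # usage: a point release of sediment-laden flow on a uniform slope
      topo = np.tile(np.linspace(1.0, 0.0, 20), (20, 1))
      flow = np.zeros_like(topo)
      flow[10, 0] = 1.0
      for _ in range(30):
          flow = spread_flow(topo, flow)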

  4. Consensus for linear multi-agent system with intermittent information transmissions using the time-scale theory

    NASA Astrophysics Data System (ADS)

    Taousser, Fatima; Defoort, Michael; Djemai, Mohamed

    2016-01-01

    This paper investigates the consensus problem for a linear multi-agent system with a fixed communication topology in the presence of intermittent communication, using the time-scale theory. Since each agent can only obtain relative local information intermittently, the proposed consensus algorithm is based on a discontinuous local interaction rule. The interaction among agents occurs over a disjoint set of continuous-time intervals. Due to the intermittent information transmissions, the closed-loop multi-agent system can be represented using mixed linear continuous-time and linear discrete-time models. The time-scale theory provides a powerful tool to combine the continuous-time and discrete-time cases and to study the consensus protocol under a unified framework. Using this theory, conditions are derived to achieve exponential consensus under intermittent information transmissions. Simulations are performed to validate the theoretical results.
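
    A minimal numerical illustration of consensus under intermittent communication is sketched below. It assumes single-integrator agents on a fixed undirected line graph with a periodic on/off communication schedule, which is a simplification of the general linear agents and time-scale analysis of the paper; the Laplacian, duty cycle and step size are invented for the example.

      import numpy as np

      # Laplacian of a fixed line graph with four agents (assumed topology)
      L = np.array([[ 1, -1,  0,  0],
                    [-1,  2, -1,  0],
                    [ 0, -1,  2, -1],
                    [ 0,  0, -1,  1]], dtype=float)

      x = np.array([4.0, 1.0, -2.0, 7.0])   # initial agent states
      dt, t_on, t_off = 0.01, 0.6, 0.4      # communication active 60% of each cycle

      t = 0.0
      while t < 20.0:
          if (t % (t_on + t_off)) < t_on:   # agents exchange information
              x = x - dt * (L @ x)          # Euler step of xdot = -L x
          # else: no information is received and the states simply hold
          t += dt

      print(x)   # states approach the average of the initial conditions (2.5 here)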

  5. Study of The Non-linear Uv Dosimetry In Simulated Extraterrestrial Conditions

    NASA Astrophysics Data System (ADS)

    Berces, A.; Kerekgyarto, T.; Ronto, G.; Lammer, H.; Kargl, G.; Komle, N. I.

    In UV biological dosimetry the UV dose scale is additive, starting at a value of zero according to the definition of the CIE (Technical Report TC-6-18). The biological dose can be defined by a measured end-effect. In our dosimeters (phage T7 and uracil dosimeter) exposed to natural (terrestrial) UV radiation, the proportion of pyrimidine photoproducts among the total photoproducts is smaller than 10 and the linear correlation between the biological and physical dose is higher than 0.9. According to the experimental data this linear relationship is often not valid. We observed that UV radiation did not only induce dimerisation, but that shorter wavelengths also caused monomerisation of pyrimidine dimers. By performing the irradiation in an oxygen-free environment and using a deuterium lamp as the UV source, we could increase monomerisation against dimerisation; thus the DNA-based dosimetry's additivity rule is not fulfilled under these conditions. In this study we will demonstrate those non-linear experiments which constitute the basis of our biological experiments on the International Space Station.

  6. Modeling and predicting urban growth pattern of the Tokyo metropolitan area based on cellular automata

    NASA Astrophysics Data System (ADS)

    Zhao, Yaolong; Zhao, Junsan; Murayama, Yuji

    2008-10-01

    The period of high economic growth in Japan which began in the latter half of the 1950s led to a massive migration of population from rural regions to the Tokyo metropolitan area. This phenomenon brought about rapid urban growth and urban structure changes in this area. The purpose of this study is to establish a constrained CA (Cellular Automata) model with GIS (Geographical Information Systems) to simulate the urban growth pattern in the Tokyo metropolitan area, towards predicting urban form and landscape for the near future. Urban land-use is classified into multiple categories to interpret the effect of interaction among land-use categories in the spatial process of urban growth. Driving factors of the urban growth pattern, such as land condition, railway network, land-use zoning, random perturbation, and neighborhood interaction, are explored and integrated into this model. These driving factors are calibrated based on exploratory spatial data analysis (ESDA), spatial statistics, logistic regression, and a "trial and error" approach. The simulation is assessed at both macro and micro classification levels in three ways: a visual approach, fractal dimension, and spatial metrics. Results indicate that this model provides an effective prototype to simulate and predict the urban growth pattern of the Tokyo metropolitan area.
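
    The neighbourhood-driven growth rule can be sketched as a toy constrained CA in which a non-urban cell converts with a probability that combines the urban density of its 3×3 neighbourhood, a static suitability surface (a stand-in for land condition, accessibility and zoning collapsed into one map) and a mild random perturbation. The weights, layer names and grid below are illustrative assumptions, not the calibrated values of the study.

      import numpy as np

      rng = np.random.default_rng(0)

      def urban_ca_step(urban, suitability, beta=1.0):
          """One toy step of a constrained urban-growth CA: conversion
          probability = beta * (3x3 urban density) * suitability * noise."""
          nx, ny = urban.shape
          padded = np.pad(urban, 1)
          density = np.zeros(urban.shape, dtype=float)
          for di in (-1, 0, 1):
              for dj in (-1, 0, 1):
                  density += padded[1 + di:1 + di + nx, 1 + dj:1 + dj + ny]
          density = (density - urban) / 8.0            # exclude the centre cell
          noise = 0.9 + 0.2 * rng.random(urban.shape)  # mild random perturbation
          p = np.clip(beta * density * suitability * noise, 0.0, 1.0)
          grow = (urban == 0) & (rng.random(urban.shape) < p)
          return np.where(grow, 1, urban)

      # usage: a single urban seed on a suitability gradient
      urban = np.zeros((50, 50), dtype=int)
      urban[25, 25] = 1
      suitability = np.tile(np.linspace(0.2, 0.9, 50), (50, 1))
      for _ in range(40):
          urban = urban_ca_step(urban, suitability)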

  7. Modelling urban growth in the Indo-Gangetic plain using nighttime OLS data and cellular automata

    NASA Astrophysics Data System (ADS)

    Roy Chowdhury, P. K.; Maithani, Sandeep

    2014-12-01

    The present study demonstrates the applicability of the Operational Linescan System (OLS) sensor to modelling urban growth at the regional level. The nighttime OLS data provide an easy, inexpensive way to map urban areas at a regional scale, requiring a very small volume of data. A cellular automata (CA) model was developed for simulating urban growth in the Indo-Gangetic plain, using OLS-derived maps as input. In the proposed CA model, urban growth was expressed in terms of causative factors such as economy, topography, accessibility and urban infrastructure. The model was calibrated and validated based on OLS data of the years 2003 and 2008, respectively, using spatial metrics, and the urban growth was subsequently predicted for the year 2020. The model predicted high urban growth in the north-western part of the study area; in the south-eastern part, growth would be concentrated around two cities, Kolkata and Howrah, while in the middle portion of the study area, i.e., Jharkhand, Bihar and eastern Uttar Pradesh, urban growth was predicted in the form of clusters, mostly around the present big cities. These results will not only provide an input to urban planning but can also be utilized in hydrological and ecological modelling, which require an estimate of future built-up areas, especially at the regional level.

  8. Feynman rules for the Standard Model Effective Field Theory in Rξ-gauges

    NASA Astrophysics Data System (ADS)

    Dedes, A.; Materkowska, W.; Paraskevas, M.; Rosiek, J.; Suxho, K.

    2017-06-01

    We assume that New Physics effects are parametrized within the Standard Model Effective Field Theory (SMEFT) written in a complete basis of gauge invariant operators up to dimension 6, commonly referred to as "Warsaw basis". We discuss all steps necessary to obtain a consistent transition to the spontaneously broken theory and several other important aspects, including the BRST-invariance of the SMEFT action for linear Rξ-gauges. The final theory is expressed in a basis characterized by SM-like propagators for all physical and unphysical fields. The effect of the non-renormalizable operators appears explicitly in triple or higher multiplicity vertices. In this mass basis we derive the complete set of Feynman rules, without resorting to any simplifying assumptions such as baryon-, lepton-number or CP conservation. As it turns out, for most SMEFT vertices the expressions are reasonably short, with a noticeable exception of those involving 4, 5 and 6 gluons. We have also supplemented our set of Feynman rules, given in an appendix here, with a publicly available Mathematica code working with the FeynRules package and producing output which can be integrated with other symbolic algebra or numerical codes for automatic SMEFT amplitude calculations.

  9. The Cellular Automata for modelling of spreading of lava flow on the earth surface

    NASA Astrophysics Data System (ADS)

    Jarna, A.

    2012-12-01

    Volcanic risk assessment is a very important scientific, political and economic issue in densely populated areas close to active volcanoes. The development of effective tools for early prediction of a potential volcanic hazard and for the management of crises is paramount. However, to date, volcanic hazard maps represent the most appropriate way to illustrate the geographical area that can potentially be affected by a volcanic event. Volcanic hazard maps are usually produced by mapping out old volcanic deposits; however, dynamic lava flow simulation is gaining popularity and can give crucial information to corroborate other methodologies. The methodology used here for the generation of volcanic hazard maps is based on numerical simulation of eruptive processes by the principle of Cellular Automata (CA). The Python script is integrated into ArcToolbox in ArcMap (ESRI), and the user can select several input and output parameters which influence surface morphology, size and shape of the flow, flow thickness, flow velocity and length of lava flows. Once the input parameters are selected, the software computes and generates hazard maps on the fly. The results can be exported to Google Maps (.kml format) for visualization. Data from a real lava flow are used to validate the simulation code. A comparison of the simulation results with real lava flows mapped out from satellite images will be presented.

  10. 19 CFR 102.11 - General rules.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    19 Customs Duties 1 2010-04-01 2010-04-01 false General rules. 102.11 Section 102.11 Customs... RULES OF ORIGIN Rules of Origin § 102.11 General rules. The following rules shall apply for purposes of determining the country of origin of imported goods other than textile and apparel products covered by § 102...

  11. Weakly coupled map lattice models for multicellular patterning and collective normalization of abnormal single-cell states

    NASA Astrophysics Data System (ADS)

    García-Morales, Vladimir; Manzanares, José A.; Mafe, Salvador

    2017-04-01

    We present a weakly coupled map lattice model for patterning that explores the effects exerted by weakening the local dynamic rules on model biological and artificial networks composed of two-state building blocks (cells). To this end, we use two cellular automata models based on (i) a smooth majority rule (model I) and (ii) a set of rules similar to those of Conway's Game of Life (model II). The normal and abnormal cell states evolve according to local rules that are modulated by a parameter κ. This parameter quantifies the effective weakening of the prescribed rules due to the limited coupling of each cell to its neighborhood and can be experimentally controlled by appropriate external agents. The emergent spatiotemporal maps of single-cell states should be of significance for positional information processes as well as for intercellular communication in tumorigenesis, where the collective normalization of abnormal single-cell states by a predominantly normal neighborhood may be crucial.
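
    A schematic version of model I can be written as a two-state CA in which each cell adopts the majority state of its 3×3 neighbourhood only with probability κ and otherwise keeps its own state. This is a toy reading of the "weakened local rule" idea, not the authors' exact coupled map; the lattice size, patch geometry and κ value are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(1)

      def weakened_majority_step(cells, kappa):
          """Two-state CA: with probability kappa a cell adopts the majority
          state of its 3x3 neighbourhood (self included), otherwise it keeps
          its current state."""
          nx, ny = cells.shape
          padded = np.pad(cells, 1, mode="wrap")
          votes = np.zeros(cells.shape, dtype=int)
          for di in (-1, 0, 1):
              for dj in (-1, 0, 1):
                  votes += padded[1 + di:1 + di + nx, 1 + dj:1 + dj + ny]
          majority = (votes >= 5).astype(int)       # majority of the 9 cells
          coupled = rng.random(cells.shape) < kappa
          return np.where(coupled, majority, cells)

      # usage: a patch of abnormal cells (state 1) inside a normal tissue (state 0)
      cells = np.zeros((60, 60), dtype=int)
      cells[25:35, 25:35] = 1
      for _ in range(50):
          cells = weakened_majority_step(cells, kappa=0.8)
      print(cells.sum())   # track how the abnormal patch evolves with kappa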

  12. Weakly coupled map lattice models for multicellular patterning and collective normalization of abnormal single-cell states.

    PubMed

    García-Morales, Vladimir; Manzanares, José A; Mafe, Salvador

    2017-04-01

    We present a weakly coupled map lattice model for patterning that explores the effects exerted by weakening the local dynamic rules on model biological and artificial networks composed of two-state building blocks (cells). To this end, we use two cellular automata models based on (i) a smooth majority rule (model I) and (ii) a set of rules similar to those of Conway's Game of Life (model II). The normal and abnormal cell states evolve according to local rules that are modulated by a parameter κ. This parameter quantifies the effective weakening of the prescribed rules due to the limited coupling of each cell to its neighborhood and can be experimentally controlled by appropriate external agents. The emergent spatiotemporal maps of single-cell states should be of significance for positional information processes as well as for intercellular communication in tumorigenesis, where the collective normalization of abnormal single-cell states by a predominantly normal neighborhood may be crucial.

  13. Following the Rules.

    PubMed

    Katz, Anne

    2016-05-01

    I am getting better at following the rules as I grow older, although I still bristle at many of them. I was a typical rebellious teenager; no one understood me, David Bowie was my idol, and, one day, my generation was going to change the world. Now I really want people to understand me: David Bowie remains one of my favorite singers and, yes, my generation has changed the world, and not necessarily for the better. Growing up means that you have to make the rules, not just follow those set by others, and, at times, having rules makes a lot of sense.

  14. Compensatory Mitigation Rule Q&A

    EPA Pesticide Factsheets

    What is compensatory mitigation? How is compensatory mitigation accomplished? What does this final rule do? What are the most significant changes required by this rule compared to previous mitigation practices? What are the goals of the final rule?

  15. Rules based process window OPC

    NASA Astrophysics Data System (ADS)

    O'Brien, Sean; Soper, Robert; Best, Shane; Mason, Mark

    2008-03-01

    As a preliminary step towards Model-Based Process Window OPC, we have analyzed the impact of correcting post-OPC layouts using rules-based methods. Image processing on the Brion Tachyon was used to identify sites where the OPC model/recipe failed to generate an acceptable solution. A set of rules for 65nm active and poly was generated by classifying these failure sites. The rules were based upon segment runlengths, figure spaces, and adjacent figure widths. 2.1 million sites for active were corrected in a small chip (comparing the pre and post rules-based operations), and 59 million were found at poly. Tachyon analysis of the final reticle layout found weak-margin sites distinct from those sites repaired by rules-based corrections. For the active layer, more than 75% of the sites corrected by rules would have printed without a defect, indicating that most rules-based cleanups degrade the lithographic pattern. Some sites were missed by the rules-based cleanups due to either bugs in the DRC software or gaps in the rules table. In the end, dramatic changes to the reticle prevented catastrophic lithography errors, but this method is far too blunt. A more subtle model-based procedure is needed, changing only those sites which have unsatisfactory lithographic margin.

  16. Frequency selection rule for high definition and high frame rate Lissajous scanning.

    PubMed

    Hwang, Kyungmin; Seo, Yeong-Hyeon; Ahn, Jinhyo; Kim, Pilhan; Jeong, Ki-Hun

    2017-10-26

    Lissajous microscanners are very attractive for compact laser scanning applications such as endomicroscopy or projection displays owing to their high mechanical stability and low operating voltages. The scanning frequencies serve as a critical factor for determining the scanning image quality. Here we report a selection rule for scanning frequencies that realizes high definition and high frame-rate (HDHF) full-repeated Lissajous scanning imaging. The fill factor (FF) monotonically increases with the total lobe number of a Lissajous curve, i.e., the sum of the scanning frequencies divided by their greatest common divisor (GCD). The number of frames per second (FPS), also called the pattern repetition rate or frame rate, increases linearly with the GCD. HDHF Lissajous scanning is achieved at the bi-axial scanning frequencies whose GCD is maximal among the sets of scanning frequencies satisfying the total lobe number for a target FF. Based on this selection rule, the experimental results clearly demonstrate that conventional Lissajous scanners can substantially increase both FF and FPS by slightly modulating the scanning frequencies near resonance, within the resonance bandwidth of the scanner. This selection rule provides a new guideline for HDHF Lissajous scanning in compact laser scanning systems.
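
    For integer scan frequencies the two quantities in the selection rule are easy to compute: the total lobe number is (fx + fy)/GCD(fx, fy) and the frame rate equals the GCD. The brute-force sketch below picks the frequency pair that maximizes the frame rate for a target lobe number; the resonance frequencies, bandwidth and target used in the example are made up, and this is an illustration of the rule, not the authors' published search procedure.

      from math import gcd

      def lissajous_metrics(fx: int, fy: int):
          """Total lobe number and frame rate for integer scan frequencies (Hz).
          Lobe number N = (fx + fy) / GCD drives the fill factor; the frame
          (pattern-repetition) rate equals the GCD itself."""
          g = gcd(fx, fy)
          return (fx + fy) // g, g

      def best_pair(fx0, fy0, bw, target_lobes):
          """Search pairs within +/- bw Hz of the resonances (fx0, fy0) that
          reach at least target_lobes lobes and keep the pair whose GCD
          (i.e. frame rate) is largest."""
          best = None
          for fx in range(fx0 - bw, fx0 + bw + 1):
              for fy in range(fy0 - bw, fy0 + bw + 1):
                  lobes, fps = lissajous_metrics(fx, fy)
                  if lobes >= target_lobes and (best is None or fps > best[0]):
                      best = (fps, fx, fy, lobes)
          return best

      # usage with hypothetical bi-axial resonances and bandwidth
      print(best_pair(5000, 7100, 50, 120))   # (frame rate, fx, fy, lobe number)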

  17. Simulation of Corrosion Process for Structure with the Cellular Automata Method

    NASA Astrophysics Data System (ADS)

    Chen, M. C.; Wen, Q. Q.

    2017-06-01

    In this paper, from the mesoscopic point of view and under the assumption that metal corrosion damage evolution is a diffusive process, the cellular automata (CA) method was proposed to numerically simulate the uniform corrosion damage evolution of the outer steel tube of concrete-filled steel tubular columns subjected to a corrosive environment, and the effects of corrosive agent concentration, dissolution probability and elapsed etching time on the corrosion damage evolution were also investigated. It was shown that corrosion damage increases nonlinearly with increasing elapsed etching time; the longer the etching time, the more serious the corrosion damage. Different concentrations of the corrosive agent had different impacts on the degree of corrosion damage of the outer steel tube, although the differences were small; the higher the concentration, the more serious the damage. The greater the dissolution probability, the more serious the corrosion damage of the outer steel tube, but with increasing dissolution probability the differences between its impacts on the corrosion damage became smaller and smaller. To validate the present method, corrosion damage measurements were conducted for concrete-filled square steel tubular columns (CFSSTCs) sealed at both ends and fully immersed in a simulated acid rain solution, and Faraday's law was used to predict their theoretical values. Meanwhile, the proposed CA model was applied to simulate the corrosion damage evolution of the CFSSTCs. Comparison of the results from the three aforementioned methods showed good agreement, implying that the proposed method for simulating the corrosion damage evolution of concrete-filled steel tubular columns is feasible and effective. It opens a new approach to further study and evaluate the corrosion damage, load capacity and lifetime prediction of concrete-filled steel tubular structures.
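
    The dissolution-probability idea can be illustrated with a deliberately simple surface-corrosion CA in which any metal cell touching the corrosive environment dissolves with a fixed probability per step. The geometry, probability value and variable names below are invented for illustration and do not reproduce the calibrated model of the paper.

      import numpy as np

      rng = np.random.default_rng(2)

      def corrosion_step(metal, p_dissolve):
          """Toy corrosion CA: a metal cell (1) in von Neumann contact with the
          corrosive environment (0) dissolves with probability p_dissolve."""
          padded = np.pad(metal, 1, constant_values=0)       # outside = environment
          nx, ny = metal.shape
          exposed = np.zeros(metal.shape, dtype=bool)
          for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
              exposed |= padded[1 + di:1 + di + nx, 1 + dj:1 + dj + ny] == 0
          dissolve = (metal == 1) & exposed & (rng.random(metal.shape) < p_dissolve)
          return np.where(dissolve, 0, metal)

      # usage: a tube-wall cross-section attacked from both faces
      metal = np.ones((20, 200), dtype=int)
      damage = []
      for step in range(200):
          metal = corrosion_step(metal, p_dissolve=0.05)
          damage.append(1.0 - metal.mean())   # corrosion damage = lost fraction
      # inspect how damage evolves with elapsed etching time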

  18. Design Pattern Mining Using Distributed Learning Automata and DNA Sequence Alignment

    PubMed Central

    Esmaeilpour, Mansour; Naderifar, Vahideh; Shukur, Zarina

    2014-01-01

    Context: Over the last decade, design patterns have been used extensively to generate reusable solutions to frequently encountered problems in software engineering and object-oriented programming. A design pattern is a repeatable software design solution that provides a template for solving various instances of a general problem. Objective: This paper describes a new method for pattern mining that isolates design patterns and the relationships between them, and a related tool, DLA-DNA, covering all implemented patterns and all projects used for evaluation. DLA-DNA is based on distributed learning automata (DLA) and deoxyribonucleic acid (DNA) sequence alignment, and achieves acceptable precision and recall compared with the other evaluated tools. Method: The proposed method mines structural design patterns in object-oriented source code and extracts the strong and weak relationships between them, enabling analyzers and programmers to determine the dependency rate of each object, component, and other sections of the code for parameter passing and modular programming. The proposed model can detect design patterns, and the strengths of their relationships, better than the other available tools, namely Pinot, PTIDEJ and DPJF. Results: The results demonstrate that, whether the source code is built in a standard or non-standard way with respect to design patterns, the proposed method performs close to DPJF and better than Pinot and PTIDEJ. The proposed model was tested on several source codes and compared with other related models and available tools; on average, its precision and recall are 20% and 9.6% higher than those of Pinot, 27% and 31% higher than those of PTIDEJ, and 3.3% and 2% higher than those of DPJF, respectively. Conclusion: The proposed method is organized in two steps: in the first step, elemental design patterns are identified; in the second step, these are composed to recognize actual design patterns. PMID:25243670

  19. Design pattern mining using distributed learning automata and DNA sequence alignment.

    PubMed

    Esmaeilpour, Mansour; Naderifar, Vahideh; Shukur, Zarina

    2014-01-01

    Over the last decade, design patterns have been used extensively to generate reusable solutions to frequently encountered problems in software engineering and object-oriented programming. A design pattern is a repeatable software design solution that provides a template for solving various instances of a general problem. This paper describes a new method for pattern mining that isolates design patterns and the relationships between them, and a related tool, DLA-DNA, covering all implemented patterns and all projects used for evaluation. DLA-DNA is based on distributed learning automata (DLA) and deoxyribonucleic acid (DNA) sequence alignment, and achieves acceptable precision and recall compared with the other evaluated tools. The proposed method mines structural design patterns in object-oriented source code and extracts the strong and weak relationships between them, enabling analyzers and programmers to determine the dependency rate of each object, component, and other sections of the code for parameter passing and modular programming. The proposed model can detect design patterns, and the strengths of their relationships, better than the other available tools, namely Pinot, PTIDEJ and DPJF. The results demonstrate that, whether the source code is built in a standard or non-standard way with respect to design patterns, the proposed method performs close to DPJF and better than Pinot and PTIDEJ. The proposed model was tested on several source codes and compared with other related models and available tools; on average, its precision and recall are 20% and 9.6% higher than those of Pinot, 27% and 31% higher than those of PTIDEJ, and 3.3% and 2% higher than those of DPJF, respectively. The proposed method is organized in two steps: in the first step, elemental design patterns are identified; in the second step, these are composed to recognize actual design patterns.

  20. Automated visualization of rule-based models

    PubMed Central

    Tapia, Jose-Juan; Faeder, James R.

    2017-01-01

    Frameworks such as BioNetGen, Kappa and Simmune use “reaction rules” to specify biochemical interactions compactly, where each rule specifies a mechanism such as binding or phosphorylation and its structural requirements. Current rule-based models of signaling pathways have tens to hundreds of rules, and these numbers are expected to increase as more molecule types and pathways are added. Visual representations are critical for conveying rule-based models, but current approaches to show rules and interactions between rules scale poorly with model size. Also, inferring design motifs that emerge from biochemical interactions is an open problem, so current approaches to visualize model architecture rely on manual interpretation of the model. Here, we present three new visualization tools that constitute an automated visualization framework for rule-based models: (i) a compact rule visualization that efficiently displays each rule, (ii) the atom-rule graph that conveys regulatory interactions in the model as a bipartite network, and (iii) a tunable compression pipeline that incorporates expert knowledge and produces compact diagrams of model architecture when applied to the atom-rule graph. The compressed graphs convey network motifs and architectural features useful for understanding both small and large rule-based models, as we show by application to specific examples. Our tools also produce more readable diagrams than current approaches, as we show by comparing visualizations of 27 published models using standard graph metrics. We provide an implementation in the open source and freely available BioNetGen framework, but the underlying methods are general and can be applied to rule-based models from the Kappa and Simmune frameworks also. We expect that these tools will promote communication and analysis of rule-based models and their eventual integration into comprehensive whole-cell models. PMID:29131816
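
    The atom-rule graph described above is a bipartite network, which can be assembled with standard graph tooling. In the sketch below the rule and atom names are invented placeholders, since the real graphs are generated automatically from the model definition; only the bipartite structure itself is taken from the record.

      import networkx as nx

      # one node set for reaction rules, one for the structural features (atoms)
      rules = ["bind_RL", "phos_R", "dephos_R"]
      atoms = ["R.l_bond", "R.Y~P", "L.r_bond"]
      edges = [("bind_RL", "R.l_bond"), ("bind_RL", "L.r_bond"),
               ("phos_R", "R.Y~P"), ("phos_R", "R.l_bond"),
               ("dephos_R", "R.Y~P")]

      G = nx.Graph()
      G.add_nodes_from(rules, bipartite="rule")
      G.add_nodes_from(atoms, bipartite="atom")
      G.add_edges_from(edges)

      pos = nx.bipartite_layout(G, rules)   # rules on one side, atoms on the other
      print(dict(G.degree()))               # which atoms couple the most rules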