Sample records for turing machine paradigm

  1. A Simple Universal Turing Machine for the Game of Life Turing Machine

    NASA Astrophysics Data System (ADS)

    Rendell, Paul

    In this chapter we present a simple universal Turing machine which is small enough to fit within the design limits of the Turing machine built in Conway's Game of Life by the author. That limit is 8 symbols and 16 states. By way of comparison we also describe one of the smallest known universal Turing machines, due to Rogozhin, which has 6 symbols and 4 states.

  2. The Need for Alternative Paradigms in Science and Engineering Education

    ERIC Educational Resources Information Center

    Baggi, Dennis L.

    2007-01-01

    There are two main claims in this article. First, that the classic pillars of engineering education, namely, traditional mathematics and differential equations, are merely a particular, if not old-fashioned, representation of a broader mathematical vision, which spans from Turing machine programming and symbolic production sets to sub-symbolic…

  3. Passing the Turing Test Does Not Mean the End of Humanity.

    PubMed

    Warwick, Kevin; Shah, Huma

    In this paper we look at the phenomenon that is the Turing test. We consider how Turing originally introduced his imitation game and discuss what this means in a practical scenario. Due to its popular appeal, we also look into different representations of the test as indicated by numerous reviewers. The main emphasis here, however, is to consider what it actually means for a machine to pass the Turing test and what importance this has, if any. In particular, does it mean that, as Turing put it, a machine can "think"? Specifically, we consider claims that passing the Turing test means that machines will have achieved human-like intelligence and, as a consequence, that the singularity will be upon us in the blink of an eye.

  4. Finite machines, mental procedures, and modern physics.

    PubMed

    Lupacchini, Rossella

    2007-01-01

    A Turing machine provides a mathematical definition of the natural process of calculating. It rests on trust that a procedure of reason can be reproduced mechanically. Turing's analysis of the concept of mechanical procedure in terms of a finite machine convinced Gödel of the validity of the Church thesis. And yet, Gödel's later concern was that, insofar as Turing's work shows that "mental procedure cannot go beyond mechanical procedures", it would imply the same kind of limitation on the human mind. He therefore deemed Turing's argument inconclusive. The question then arises as to what extent a computing machine operating by finite means could provide an adequate model of human intelligence. It is argued that a rigorous answer to this question can be given by developing Turing's considerations on the nature of mental processes. For Turing such processes are the consequence of physical processes, and he seems to be led to the conclusion that quantum mechanics could help to find a more comprehensive explanation of them.

  5. Taking the fifth amendment in Turing's imitation game

    NASA Astrophysics Data System (ADS)

    Warwick, Kevin; Shah, Huma

    2017-03-01

    In this paper, we look at a specific issue with practical Turing tests, namely the right of the machine to remain silent during interrogation. In particular, we consider the possibility of a machine passing the Turing test simply by not saying anything. We include a number of transcripts from practical Turing tests in which silence has actually occurred on the part of a hidden entity. Each of the transcripts considered here resulted in a judge being unable to make the 'right identification', i.e., they could not say for certain which hidden entity was the machine.

  6. Can machines think? A report on Turing test experiments at the Royal Society

    NASA Astrophysics Data System (ADS)

    Warwick, Kevin; Shah, Huma

    2016-11-01

    In this article we consider transcripts that originated from a practical series of Turing's Imitation Game that was held on 6 and 7 June 2014 at the Royal Society London. In all cases the tests involved a three-participant simultaneous comparison by an interrogator of two hidden entities, one being a human and the other a machine. Each of the transcripts considered here resulted in a human interrogator being fooled such that they could not make the 'right identification', that is, they could not say for certain which was the machine and which was the human. The transcripts presented all involve one machine only, namely 'Eugene Goostman', the result being that the machine became the first to pass the Turing test, as set out by Alan Turing, on unrestricted conversation. This is the first time that results from the Royal Society tests have been disclosed and discussed in a paper.

  7. Cosmic logic: a computational model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vanchurin, Vitaly, E-mail: vvanchur@d.umn.edu

    We initiate a formal study of logical inferences in context of the measure problem in cosmology or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO's Turing number as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal that halts in finite time and immortal that runs forever. In context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps.

  8. Cosmic logic: a computational model

    NASA Astrophysics Data System (ADS)

    Vanchurin, Vitaly

    2016-02-01

    We initiate a formal study of logical inferences in context of the measure problem in cosmology or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO's Turing number as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal that halts in finite time and immortal that runs forever. In context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps.
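
The cut-off prescription in the final sentence is exactly the computable fragment of this picture: no Turing machine can separate mortal from immortal CO machines in general, but "halts within a predetermined number of steps" can be decided by direct simulation. A minimal sketch, with machines modeled as Python step-functions rather than actual Turing machines (an illustrative stand-in, not the paper's formalism):

```python
# "Halts within n steps" is decidable by running the machine for n steps.
# A machine is a function step(state) -> next_state, or None when it halts.

def halts_within(step, state, n):
    """Simulate at most n steps; True iff the machine halts within the cut-off."""
    for _ in range(n):
        state = step(state)
        if state is None:
            return True   # mortal within the cut-off
    return False          # no verdict: may halt later or run forever

countdown = lambda k: k - 1 if k > 0 else None   # halts after finitely many steps
loop = lambda k: k                               # immortal: runs forever

halts_within(countdown, 3, 10)   # True: mortal, caught by the cut-off
halts_within(loop, 0, 10)        # False: the cut-off gives no verdict
```

The asymmetry in the return values mirrors the undecidability result: a `True` is conclusive, while a `False` never distinguishes a slow halter from an immortal machine.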

  9. AI in Informal Science Education: Bringing Turing Back to Life to Perform the Turing Test

    ERIC Educational Resources Information Center

    Gonzalez, Avelino J.; Hollister, James R.; DeMara, Ronald F.; Leigh, Jason; Lanman, Brandan; Lee, Sang-Yoon; Parker, Shane; Walls, Christopher; Parker, Jeanne; Wong, Josiah; Barham, Clayton; Wilder, Bryan

    2017-01-01

    This paper describes an interactive museum exhibit featuring an avatar of Alan Turing that informs museum visitors about artificial intelligence and Turing's seminal Turing Test for machine intelligence. The objective of the exhibit is to engage and motivate visiting children in the hope of sparking an interest in them about computer science and…

  10. Photochromic molecular implementations of universal computation.

    PubMed

    Chaplin, Jack C; Krasnogor, Natalio; Russell, Noah A

    2014-12-01

    Unconventional computing is an area of research in which novel materials and paradigms are utilised to implement computation. Previously we have demonstrated how registers, logic gates and logic circuits can be implemented, unconventionally, with a biocompatible molecular switch, NitroBIPS, embedded in a polymer matrix. NitroBIPS and related molecules have been shown elsewhere to be capable of modifying many biological processes in a manner that is dependent on its molecular form. Thus, one possible application of this type of unconventional computing is to embed computational processes into biological systems. Here we expand on our earlier proof-of-principle work and demonstrate that universal computation can be implemented using NitroBIPS. We have previously shown that spatially localised computational elements, including registers and logic gates, can be produced. We explain how parallel registers can be implemented, then demonstrate an application of parallel registers in the form of Turing machine tapes, and demonstrate both parallel registers and logic circuits in the form of elementary cellular automata. The Turing machines and elementary cellular automata utilise the same samples and same hardware to implement their registers, logic gates and logic circuits; and both represent examples of universal computing paradigms. This shows that homogeneous photochromic computational devices can be dynamically repurposed without invasive reconfiguration. The result represents an important, necessary step towards demonstrating the general feasibility of interfacial computation embedded in biological systems or other unconventional materials and environments. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
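
For reference, the elementary-cellular-automaton paradigm the authors implement photochromically can be sketched in a few lines of software. The sketch below uses Rule 110, the elementary CA proved Turing universal by Cook; nothing in it models NitroBIPS itself.

```python
# One synchronous update of an elementary cellular automaton.
# The rule number's binary expansion is the truth table: bit k gives the
# next state of a cell whose (left, self, right) neighborhood encodes k.

def eca_step(cells, rule=110):
    """Update a row of 0/1 cells once, with fixed 0 boundaries."""
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < n - 1 else 0
        neighborhood = (left << 2) | (cells[i] << 1) | right
        out.append((rule >> neighborhood) & 1)
    return out

# A single live cell spreads leftward under Rule 110.
row = [0, 0, 0, 0, 1, 0, 0, 0, 0]
row = eca_step(row)   # [0, 0, 0, 1, 1, 0, 0, 0, 0]
```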

  11. A Turing Machine Simulator.

    ERIC Educational Resources Information Center

    Navarro, Aaron B.

    1981-01-01

    Presents a program in Level II BASIC for a TRS-80 computer that simulates a Turing machine and discusses the nature of the device. The program is run interactively and is designed to be used as an educational tool by computer science or mathematics students studying computational or automata theory. (MP)
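
The TRS-80 program itself is not reproduced in the record, but the educational device it describes is easy to sketch in a modern language. The following Python version is illustrative; the rule format is this sketch's, not the original BASIC program's.

```python
# A minimal interactive-style Turing machine simulator.
# rules maps (state, symbol) -> (new_state, write_symbol, move),
# where move is -1 (left), +1 (right), or 0.

def run_tm(rules, tape, state="q0", head=0, blank="_", max_steps=1000):
    """Run until the 'halt' state or max_steps; return the final tape."""
    tape = dict(enumerate(tape))          # sparse tape, default blank
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, tape[head], move = rules[(state, symbol)]
        head += move
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(blank)

# Example: flip every bit, halting at the first blank.
flip = {
    ("q0", "0"): ("q0", "1", +1),
    ("q0", "1"): ("q0", "0", +1),
    ("q0", "_"): ("halt", "_", 0),
}
print(run_tm(flip, "1011"))   # prints 0100
```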

  12. Quantum information, cognition, and music.

    PubMed

    Dalla Chiara, Maria L; Giuntini, Roberto; Leporini, Roberto; Negri, Eleonora; Sergioli, Giuseppe

    2015-01-01

    Parallelism represents an essential aspect of human mind/brain activities. One can recognize some common features between psychological parallelism and the characteristic parallel structures that arise in quantum theory and in quantum computation. The article is devoted to a discussion of the following questions: a comparison between classical probabilistic Turing machines and quantum Turing machines; possible applications of the quantum computational semantics to cognitive problems; parallelism in music.

  13. Quantum information, cognition, and music

    PubMed Central

    Dalla Chiara, Maria L.; Giuntini, Roberto; Leporini, Roberto; Negri, Eleonora; Sergioli, Giuseppe

    2015-01-01

    Parallelism represents an essential aspect of human mind/brain activities. One can recognize some common features between psychological parallelism and the characteristic parallel structures that arise in quantum theory and in quantum computation. The article is devoted to a discussion of the following questions: a comparison between classical probabilistic Turing machines and quantum Turing machines; possible applications of the quantum computational semantics to cognitive problems; parallelism in music. PMID:26539139

  14. Towards a molecular logic machine

    NASA Astrophysics Data System (ADS)

    Remacle, F.; Levine, R. D.

    2001-06-01

    Finite state logic machines can be realized by pump-probe spectroscopic experiments on an isolated molecule. The most elaborate setup, a Turing machine, can be programmed to carry out a specific computation. We argue that a molecule can be similarly programmed, and provide examples using two photon spectroscopies. The states of the molecule serve as the possible states of the head of the Turing machine and the physics of the problem determines the possible instructions of the program. The tape is written in an alphabet that allows the listing of the different pump and probe signals that are applied in a given experiment. Different experiments using the same set of molecular levels correspond to different tapes that can be read and processed by the same head and program. The analogy to a Turing machine is not a mechanical one and is not completely molecular because the tape is not part of the molecular machine. We therefore also discuss molecular finite state machines, such as sequential devices, for which the tape is not part of the machine. Nonmolecular tapes allow for quite long input sequences with a rich alphabet (at the level of 7 bits) and laser pulse shaping experiments provide concrete examples. Single molecule spectroscopies show that a single molecule can be repeatedly cycled through a logical operation.
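
The molecular finite state machines the abstract ends with have a compact software analogue: the machine's state plays the role of the molecular level, and the input string plays the role of the pump/probe pulse sequence. The sketch below is purely illustrative; the mapping to real spectroscopy is the paper's, not this code's.

```python
# A minimal finite state machine ("sequential device"): the tape is
# external input, only the state is part of the machine.

def run_fsm(transitions, start, inputs):
    """transitions: {(state, symbol): next_state}. Returns the final state."""
    state = start
    for sym in inputs:
        state = transitions[(state, sym)]
    return state

# Hypothetical two-level system: a 'pump' pulse toggles the level,
# a 'probe' pulse leaves it unchanged.
toggle = {
    ("ground", "pump"): "excited",
    ("ground", "probe"): "ground",
    ("excited", "pump"): "ground",
    ("excited", "probe"): "excited",
}
run_fsm(toggle, "ground", ["pump", "probe", "pump"])   # back to "ground"
```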

  15. Are human beings humean robots?

    NASA Astrophysics Data System (ADS)

    Génova, Gonzalo; Quintanilla Navarro, Ignacio

    2018-01-01

    David Hume, the Scottish philosopher, conceives reason as the slave of the passions, which implies that human reason has predetermined objectives it cannot question. An essential element of an algorithm running on a computational machine (or Logical Computing Machine, as Alan Turing calls it) is its having a predetermined purpose: an algorithm cannot question its purpose, because it would cease to be an algorithm. Therefore, if self-determination is essential to human intelligence, then human beings are neither Humean beings nor computational machines. We also examine some objections to the Turing Test as a model for understanding human intelligence.

  16. The world problem: on the computability of the topology of 4-manifolds

    NASA Technical Reports Server (NTRS)

    vanMeter, J. R.

    2005-01-01

    Topological classification of the 4-manifolds bridges computation theory and physics. A proof of the undecidability of the homeomorphy problem for 4-manifolds is outlined here in a clarifying way. It is shown that an arbitrary Turing machine with an arbitrary input can be encoded into the topology of a 4-manifold, such that the 4-manifold is homeomorphic to a certain other 4-manifold if and only if the corresponding Turing machine halts on the associated input. Physical implications are briefly discussed.

  17. Consequences of nonclassical measurement for the algorithmic description of continuous dynamical systems

    NASA Technical Reports Server (NTRS)

    Fields, Chris

    1989-01-01

    Continuous dynamical systems intuitively seem capable of more complex behavior than discrete systems. If analyzed in the framework of the traditional theory of computation, a continuous dynamical system with countably many quasistable states has at least the computational power of a universal Turing machine. Such an analysis assumes, however, the classical notion of measurement. If measurement is viewed nonclassically, a continuous dynamical system cannot, even in principle, exhibit behavior that cannot be simulated by a universal Turing machine.

  18. Consequences of nonclassical measurement for the algorithmic description of continuous dynamical systems

    NASA Technical Reports Server (NTRS)

    Fields, Chris

    1989-01-01

    Continuous dynamical systems intuitively seem capable of more complex behavior than discrete systems. If analyzed in the framework of the traditional theory of computation, a continuous dynamical system with countably many quasistable states has at least the computational power of a universal Turing machine. Such an analysis assumes, however, the classical notion of measurement. If measurement is viewed nonclassically, a continuous dynamical system cannot, even in principle, exhibit behavior that cannot be simulated by a universal Turing machine.

  19. Perspex machine: V. Compilation of C programs

    NASA Astrophysics Data System (ADS)

    Spanner, Matthew P.; Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of the Turing machine with projective geometry. The original, constructive proof used four special, perspective transformations to implement the Turing machine in projective geometry. These four transformations are now generalised and applied in a compiler, implemented in Pop11, that converts a subset of the C programming language into perspexes. This is interesting both from a geometrical and a computational point of view. Geometrically, it is interesting that program source can be converted automatically to a sequence of perspective transformations and conditional jumps, though we find that the product of homogeneous transformations with normalisation can be non-associative. Computationally, it is interesting that program source can be compiled for a Reduced Instruction Set Computer (RISC), the perspex machine, that is a Single Instruction, Zero Exception (SIZE) computer.

  20. A novel modification of the Turing test for artificial intelligence and robotics in healthcare.

    PubMed

    Ashrafian, Hutan; Darzi, Ara; Athanasiou, Thanos

    2015-03-01

    The increasing demands of delivering higher quality global healthcare has resulted in a corresponding expansion in the development of computer-based and robotic healthcare tools that rely on artificially intelligent technologies. The Turing test was designed to assess artificial intelligence (AI) in computer technology. It remains an important qualitative tool for testing the next generation of medical diagnostics and medical robotics. Development of quantifiable diagnostic accuracy meta-analytical evaluative techniques for the Turing test paradigm. Modification of the Turing test to offer quantifiable diagnostic precision and statistical effect-size robustness in the assessment of AI for computer-based and robotic healthcare technologies. Modification of the Turing test to offer robust diagnostic scores for AI can contribute to enhancing and refining the next generation of digital diagnostic technologies and healthcare robotics. Copyright © 2014 John Wiley & Sons, Ltd.
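
The kind of quantification the abstract proposes can be illustrated with standard diagnostic-accuracy measures: treat each judge's verdict as a binary diagnostic call on the hidden entity and score it for sensitivity and specificity. This is an illustrative framing under that assumption, not the authors' actual metric.

```python
# Score Turing-test verdicts as a diagnostic test for detecting the machine.

def diagnostic_accuracy(verdicts):
    """verdicts: list of (actual, judged) pairs, each 'machine' or 'human'.
    Returns (sensitivity, specificity) for detecting the machine."""
    tp = sum(1 for a, j in verdicts if a == "machine" and j == "machine")
    fn = sum(1 for a, j in verdicts if a == "machine" and j == "human")
    tn = sum(1 for a, j in verdicts if a == "human" and j == "human")
    fp = sum(1 for a, j in verdicts if a == "human" and j == "machine")
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical trial data: the machine fooled one judge of two.
trials = [("machine", "machine"), ("machine", "human"),
          ("human", "human"), ("human", "human")]
sens, spec = diagnostic_accuracy(trials)   # 0.5, 1.0
```

Effect-size and meta-analytic robustness would then follow by pooling such pairs across judges and test sessions.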

  1. Verification and validation of a Work Domain Analysis with Turing machine task analysis.

    PubMed

    Rechard, J; Bignon, A; Berruet, P; Morineau, T

    2015-03-01

    While the use of Work Domain Analysis as a methodological framework in cognitive engineering is increasing rapidly, verification and validation of work domain models produced by this method are becoming a significant issue. In this article, we propose the use of a method based on Turing machine formalism, named "Turing Machine Task Analysis", to verify and validate work domain models. The application of this method to two work domain analyses, one of car driving, an "intentional" domain, and the other of a ship water system, a "causal" domain, showed the possibility of highlighting improvements needed by these models. More precisely, the step-by-step analysis of a degraded task scenario in each work domain model pointed out unsatisfactory aspects in the first modelling, like overspecification, underspecification, omission of work domain affordances, or unsuitable inclusion of objects in the work domain model. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  2. The man behind the machine

    NASA Astrophysics Data System (ADS)

    Cerf, Vint

    2018-01-01

    As a practising computer scientist, I thought I had a fairly good grasp of Alan Turing’s many contributions to the field. But The Turing Guide by Jack Copeland, Jonathan Bowen, Mark Sprevak and Robin Wilson has opened up a universe of Turing's other pursuits I knew nothing about, inflating my admiration for him and his work.

  3. Testing the Turing Test — do Men Pass It?

    NASA Astrophysics Data System (ADS)

    Adam, Ruth; Hershberg, Uri; Schul, Yaacov; Solomon, Sorin

    We are fascinated by the idea of giving life to the inanimate. The fields of Artificial Life and Artificial Intelligence (AI) attempt to use a scientific approach to pursue this desire. The first steps on this approach hark back to Turing and his suggestion of an imitation game as an alternative answer to the question "can machines think?" [1]. To test his hypothesis, Turing formulated the Turing test [1] to detect human behavior in computers. But how do humans pass such a test? What would you say if you were to learn that they do not pass it well? What would it mean for our understanding of human behavior? What would it mean for our design of tests of the success of artificial life? We report below an experiment in which men consistently failed the Turing test.

  4. Cooperative combinatorial optimization: evolutionary computation case study.

    PubMed

    Burgin, Mark; Eberbach, Eugene

    2008-01-01

    This paper presents a formalization of the notion of cooperation and competition of multiple systems that work toward a common optimization goal of the population using evolutionary computation techniques. It is proved that evolutionary algorithms are more expressive than conventional recursive algorithms, such as Turing machines. Three classes of evolutionary computations are introduced and studied: bounded finite, unbounded finite, and infinite computations. Universal evolutionary algorithms are constructed. Such properties of evolutionary algorithms as completeness, optimality, and search decidability are examined. A natural extension of evolutionary Turing machine (ETM) model is proposed to properly reflect phenomena of cooperation and competition in the whole population.

  5. A 'Turing' Test for Landscape Evolution Models

    NASA Astrophysics Data System (ADS)

    Parsons, A. J.; Wise, S. M.; Wainwright, J.; Swift, D. A.

    2008-12-01

    Resolving the interactions among tectonics, climate and surface processes at long timescales has benefited from the development of computer models of landscape evolution. However, testing these Landscape Evolution Models (LEMs) has been piecemeal and partial. We argue that a more systematic approach is required. What is needed is a test that will establish how 'realistic' an LEM is and thus the extent to which its predictions may be trusted. We propose a test based upon the Turing Test of artificial intelligence as a way forward. In 1950 Alan Turing posed the question of whether a machine could think. Rather than attempt to address the question directly he proposed a test in which an interrogator asked questions of a person and a machine, with no means of telling which was which. If the machine's answers could not be distinguished from those of the human, the machine could be said to demonstrate artificial intelligence. By analogy, if an LEM cannot be distinguished from a real landscape it can be deemed to be realistic. The Turing test of intelligence is a test of the way in which a computer behaves. The analogy in the case of an LEM is that it should show realistic behaviour in terms of form and process, both at a given moment in time (punctual) and in the way both form and process evolve over time (dynamic). For some of these behaviours, tests already exist. For example there are numerous morphometric tests of punctual form and measurements of punctual process. The test discussed in this paper provides new ways of assessing dynamic behaviour of an LEM over realistically long timescales. However, challenges remain in developing an appropriate suite of challenging tests, in applying these tests to current LEMs and in developing LEMs that pass them.

  6. A mechanical Turing machine: blueprint for a biomolecular computer

    PubMed Central

    Shapiro, Ehud

    2012-01-01

    We describe a working mechanical device that embodies the theoretical computing machine of Alan Turing, and as such is a universal programmable computer. The device operates on three-dimensional building blocks by applying mechanical analogues of polymer elongation, cleavage and ligation, movement along a polymer, and control by molecular recognition unleashing allosteric conformational changes. Logically, the device is not more complicated than biomolecular machines of the living cell, and all its operations are part of the standard repertoire of these machines; hence, a biomolecular embodiment of the device is not infeasible. If implemented, such a biomolecular device may operate in vivo, interacting with its biochemical environment in a program-controlled manner. In particular, it may ‘compute’ synthetic biopolymers and release them into its environment in response to input from the environment, a capability that may have broad pharmaceutical and biological applications. PMID:22649583

  7. Quantum turing machine and brain model represented by Fock space

    NASA Astrophysics Data System (ADS)

    Iriyama, Satoshi; Ohya, Masanori

    2016-05-01

    Adaptive dynamics is known as a new mathematics for treating complex phenomena, for example chaos, quantum algorithms and psychological phenomena. In this paper, we briefly review the notion of adaptive dynamics, and explain the definition of the generalized Turing machine (GTM) and the recognition process represented by the Fock space. Moreover, we show that there exists a quantum channel, described by the GKSL master equation, which achieves the Chaos Amplifier used in [M. Ohya and I. V. Volovich, J. Opt. B 5(6) (2003) 639; M. Ohya and I. V. Volovich, Rep. Math. Phys. 52(1) (2003) 25].

  8. Single instruction computer architecture and its application in image processing

    NASA Astrophysics Data System (ADS)

    Laplante, Phillip A.

    1992-03-01

    A single processing computer system using only half-adder circuits is described. In addition, it is shown that only a single hard-wired instruction is needed in the control unit to obtain a complete instruction set for this general-purpose computer. Such a system has several advantages. First, it is intrinsically a RISC machine--in fact the 'ultimate RISC' machine. Second, because only a single type of logic element is employed, the entire computer system can be easily realized on a single, highly integrated chip. Finally, due to the homogeneous nature of the computer's logic elements, the computer has possible implementations as an optical or chemical machine. This in turn suggests possible paradigms for neural computing and artificial intelligence. After showing how we can implement a full-adder, min, max and other operations using the half-adder, we use an array of such full-adders to implement the dilation operation for two black-and-white images. Next we implement the erosion operation on two black-and-white images using a relative complement function and the properties of erosion and dilation. This approach was inspired by papers by van der Poel, in which a single instruction is used to furnish a complete set of general-purpose instructions, and by Böhm and Jacopini, where it is shown that any problem can be solved using a Turing machine with one entry and one exit.
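
Van der Poel's idea survives today as the one-instruction-set-computer (OISC) family, whose best-known member is SUBLEQ ("subtract and branch if less than or equal to zero"). The interpreter below is a generic OISC sketch, not the half-adder machine of the paper, but it shows how a single hard-wired instruction yields a complete instruction set.

```python
# SUBLEQ: the entire instruction set is one instruction.
# Each instruction occupies three memory cells a, b, c:
#   mem[b] -= mem[a]; if mem[b] <= 0, jump to c; else fall through.

def run_subleq(mem, pc=0, max_steps=10000):
    """Execute SUBLEQ code in mem until pc goes negative (halt)."""
    mem = list(mem)
    for _ in range(max_steps):
        if pc < 0:
            break
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
    return mem

# Clear cell 3 (initially 7) by subtracting it from itself, then halt
# via the negative jump target.
result = run_subleq([3, 3, -1, 7])   # result[3] == 0
```

Moves, adds, and conditional jumps are all built from chains of this one instruction, which is what makes the architecture Turing complete given enough memory.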

  9. Simplified and Yet Turing Universal Spiking Neural P Systems with Communication on Request.

    PubMed

    Wu, Tingfang; Bîlbîe, Florin-Daniel; Păun, Andrei; Pan, Linqiang; Neri, Ferrante

    2018-04-02

    Spiking neural P systems are a class of third generation neural networks belonging to the framework of membrane computing. Spiking neural P systems with communication on request (SNQ P systems) are a type of spiking neural P system where the spikes are requested from neighboring neurons. SNQ P systems have previously been proved to be universal (computationally equivalent to Turing machines) when two types of spikes are considered. This paper studies a simplified version of SNQ P systems, i.e. SNQ P systems with one type of spike. It is proved that one type of spike is enough to guarantee the Turing universality of SNQ P systems. Theoretical results are shown in the cases of the SNQ P system used in both generating and accepting modes. Furthermore, the influence of the number of unbounded neurons (the number of spikes in a neuron is not bounded) on the computation power of SNQ P systems with one type of spike is investigated. It is found that SNQ P systems functioning as number generating devices with one type of spike and four unbounded neurons are Turing universal.

  10. Bio-steps beyond Turing.

    PubMed

    Calude, Cristian S; Păun, Gheorghe

    2004-11-01

    Are there 'biologically computing agents' capable of computing Turing uncomputable functions? It is perhaps tempting to dismiss this question with a negative answer. Quite the opposite: for the first time in the literature on molecular computing, we contend that the answer is not theoretically negative. Our results will be formulated in the language of membrane computing (P systems). Some mathematical results presented here are interesting in themselves. In contrast with most speed-up methods, which are based on non-determinism, our results rest upon some universality results proved for deterministic P systems. These results will be used for building "accelerated P systems". In contrast with the case of Turing machines, acceleration is a part of the hardware (not a quality of the environment) and it is realised either by decreasing the size of "reactors" or by speeding-up the communication channels. Consequently, two acceleration postulates of biological inspiration are introduced; each of them poses specific questions to biology. Finally, in a more speculative part of the paper, we will deal with Turing non-computable activity of the brain and possible forms of (extraterrestrial) intelligence.

  11. On The Computational Capabilities of Physical Systems. Part 2; Relationship With Conventional Computer Science

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In the first of this pair of papers, it was proven that there cannot be a physical computer to which one can properly pose any and all computational tasks concerning the physical universe. It was then further proven that no physical computer C can correctly carry out all computational tasks that can be posed to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly "processing information faster than the universe does". These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - "physical computation" - is needed to address the issues considered in these papers, which concern real physical computers. While this novel definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. This second paper of the pair presents a preliminary exploration of some of this mathematical structure. Analogues of Chomskian results concerning universal Turing Machines and the Halting theorem are derived, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analogue of algorithmic information complexity, "prediction complexity", is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task, a bound similar to the "encoding" bound governing how much the algorithmic information complexity of a Turing machine calculation can differ for two reference universal Turing machines. Finally, it is proven that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike algorithmic information complexity), in that there is one and only one version of it that can be applicable throughout our universe.

  12. The brain as a system of nested but partially overlapping networks. Heuristic relevance of the model for brain physiology and pathology.

    PubMed

    Agnati, L F; Guidolin, D; Fuxe, K

    2007-01-01

    A new model of brain organization is proposed. The model is based on the assumption that a global molecular network enmeshes the entire central nervous system. Thus, brain extra-cellular and intra-cellular molecular networks are proposed to communicate at the level of special plasma membrane regions (e.g., the lipid rafts) where horizontal molecular networks can represent input/output regions allowing the cell to have informational exchanges with the extracellular environment. Furthermore, some "pervasive signals" such as field potentials, pressure waves and thermal gradients that affect large parts of the brain cellular and molecular networks are discussed. Finally, at least two learning paradigms are analyzed taking into account the possible role of Volume Transmission: the so-called model of "temporal difference learning" and the "Turing B-unorganised machine". The relevance of this new view of brain organization for a deeper understanding of some neurophysiological and neuropathological aspects of its function is briefly discussed.

  13. Artificial Intelligence.

    ERIC Educational Resources Information Center

    Thornburg, David D.

    1986-01-01

    Overview of the artificial intelligence (AI) field provides a definition; discusses past research and areas of future research; describes the design, functions, and capabilities of expert systems and the "Turing Test" for machine intelligence; and lists additional sources for information on artificial intelligence. Languages of AI are…

  14. The Society of Brains: How Alan Turing and Marvin Minsky Were Both Right

    NASA Astrophysics Data System (ADS)

    Struzik, Zbigniew R.

    2015-04-01

    In his well-known prediction, Alan Turing stated that computer intelligence would surpass human intelligence by the year 2000. Although the Turing Test, as it became known, was devised to be played by one human against one computer, this is not a fair setup. Every human is a part of a social network, and a fairer comparison would be a contest between one human at the console and a network of computers behind the console. Around the year 2000, the number of web pages on the WWW overtook the number of neurons in the human brain. But these websites would be of little use without the ability to search for knowledge. By the year 2000 Google Inc. had become the search engine of choice, and the WWW became an intelligent entity. This was not without good reason. The basis for the search engine was the analysis of the ‘network of knowledge’. The PageRank algorithm, linking information on the web according to the hierarchy of ‘link popularity’, continues to provide the basis for all of Google's web search tools. While PageRank was developed by Larry Page and Sergey Brin in 1996 as part of a research project about a new kind of search engine, PageRank is in its essence the key to representing and using static knowledge in an emergent intelligent system. Here I argue that Alan Turing was right, as hybrid human-computer internet machines have already surpassed our individual intelligence - this was done around the year 2000 by the Internet - the socially-minded, human-computer hybrid Homo computabilis-socialis. Ironically, the Internet's intelligence also emerged to a large extent from ‘exploiting’ humans - the key to the emergence of machine intelligence has been discussed by Marvin Minsky in his work on the foundations of intelligence through interacting agents’ knowledge. 
    As a consequence, a decade and a half into the 21st century, we appear to be much better equipped to tackle the problem of the social origins of humanity - in particular thanks to the power of the intelligent partner-in-the-quest machine; however, we should not wait too long...
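
    The ranking mechanism described in this record can be sketched as a short power iteration. This is an illustrative sketch only: the damping factor of 0.85 and the three-page toy graph are assumptions, not details from the record.

    ```python
    # Minimal PageRank power iteration over a toy link graph.
    # The damping factor (0.85) and the graph itself are illustrative assumptions.

    def pagerank(links, damping=0.85, iterations=50):
        """links: dict mapping each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outgoing in links.items():
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
            rank = new_rank
        return rank

    links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
    ranks = pagerank(links)
    # Ranks sum to 1; page C accumulates the most "link popularity"
    # because it is linked from both A and B.
    ```

    Each iteration redistributes rank along outgoing links, so the total mass stays 1 and the ranks converge to the stationary distribution of the damped random surfer.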

  15. Abstract quantum computing machines and quantum computational logics

    NASA Astrophysics Data System (ADS)

    Chiara, Maria Luisa Dalla; Giuntini, Roberto; Sergioli, Giuseppe; Leporini, Roberto

    2016-06-01

    Classical and quantum parallelism are deeply different, although it is sometimes claimed that quantum Turing machines are nothing but special examples of classical probabilistic machines. We introduce the concepts of deterministic state machine, classical probabilistic state machine and quantum state machine. On this basis, we discuss the question: To what extent can quantum state machines be simulated by classical probabilistic state machines? Each state machine is devoted to a single task determined by its program. Real computers, however, behave differently, being able to solve different kinds of problems. This capacity can be modeled, in the quantum case, by the mathematical notion of abstract quantum computing machine, whose different programs determine different quantum state machines. The computations of abstract quantum computing machines can be linguistically described by the formulas of a particular form of quantum logic, termed quantum computational logic.
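
    The distinction between deterministic and classical probabilistic state machines drawn in this record can be sketched as follows; the two-state stochastic matrix is an illustrative assumption, not taken from the paper.

    ```python
    # Sketch of a classical probabilistic state machine: each step applies a
    # stochastic matrix to a probability distribution over states. A deterministic
    # state machine is the special case where every row is a 0/1 unit vector.
    # The two-state matrix below is an illustrative assumption.

    def step(distribution, stochastic_matrix):
        n = len(distribution)
        return [
            sum(distribution[i] * stochastic_matrix[i][j] for i in range(n))
            for j in range(n)
        ]

    # A "lazy" random walk on two states: stay with probability 0.7, switch with 0.3.
    matrix = [[0.7, 0.3],
              [0.3, 0.7]]
    dist = [1.0, 0.0]          # start deterministically in state 0
    for _ in range(20):
        dist = step(dist, matrix)
    # The distribution converges toward the uniform fixed point [0.5, 0.5].
    ```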

  16. Computational complexity of symbolic dynamics at the onset of chaos

    NASA Astrophysics Data System (ADS)

    Lakdawala, Porus

    1996-05-01

    In a variety of studies of dynamical systems, the edge of order and chaos has been singled out as a region of complexity. It was suggested by Wolfram, on the basis of qualitative behavior of cellular automata, that the computational basis for modeling this region is the universal Turing machine. In this paper, following a suggestion of Crutchfield, we try to show that the Turing machine model may often be too powerful as a computational model to describe the boundary of order and chaos. In particular we study the region of the first accumulation of period doubling in unimodal and bimodal maps of the interval, from the point of view of language theory. We show that in relation to the ``extended'' Chomsky hierarchy, the relevant computational model in the unimodal case is the nested stack automaton or the related indexed languages, while the bimodal case is modeled by the linear bounded automaton or the related context-sensitive languages.

  17. Probability Simulations by Non-Lipschitz Chaos

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices. Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  18. Autopoiesis + extended cognition + nature = can buildings think?

    PubMed Central

    Dollens, Dennis

    2015-01-01

    To incorporate metabolic, bioremedial functions into the performance of buildings and to balance generative architecture's dominant focus on computational programming and digital fabrication, this text first discusses hybridizing Maturana and Varela's biological theory of autopoiesis with Andy Clark's hypothesis of extended cognition. Doing so establishes a procedural protocol to research biological domains from which design could source data/insight from biosemiotics, sensory plants, and biocomputation. I trace computation and botanic simulations back to Alan Turing's little-known 1950s Morphogenetic drawings, reaction-diffusion algorithms, and pioneering artificial intelligence (AI) in order to establish bioarchitecture's generative point of origin. I ask provocatively, Can buildings think? as a question echoing Turing's own, "Can machines think?" PMID:26478784

  19. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  20. A truly human interface: interacting face-to-face with someone whose words are determined by a computer program

    PubMed Central

    Corti, Kevin; Gillespie, Alex

    2015-01-01

    We use speech shadowing to create situations wherein people converse in person with a human whose words are determined by a conversational agent computer program. Speech shadowing involves a person (the shadower) repeating vocal stimuli originating from a separate communication source in real-time. Humans shadowing for conversational agent sources (e.g., chat bots) become hybrid agents (“echoborgs”) capable of face-to-face interlocution. We report three studies that investigated people’s experiences interacting with echoborgs and the extent to which echoborgs pass as autonomous humans. First, participants in a Turing Test spoke with a chat bot via either a text interface or an echoborg. Human shadowing did not improve the chat bot’s chance of passing but did increase interrogators’ ratings of how human-like the chat bot seemed. In our second study, participants had to decide whether their interlocutor produced words generated by a chat bot or simply pretended to be one. Compared to those who engaged a text interface, participants who engaged an echoborg were more likely to perceive their interlocutor as pretending to be a chat bot. In our third study, participants were naïve to the fact that their interlocutor produced words generated by a chat bot. Unlike those who engaged a text interface, the vast majority of participants who engaged an echoborg did not sense a robotic interaction. These findings have implications for android science, the Turing Test paradigm, and human–computer interaction. The human body, as the delivery mechanism of communication, fundamentally alters the social psychological dynamics of interactions with machine intelligence. PMID:26042066

  1. Cook-Levin Theorem Algorithmic-Reducibility/Completeness = Wilson Renormalization-(Semi)-Group Fixed-Points; ``Noise''-Induced Phase-Transitions (NITs) to Accelerate Algorithmics (``NIT-Picking'') REPLACING CRUTCHES!!!: Models: Turing-machine, finite-state-models, finite-automata

    NASA Astrophysics Data System (ADS)

    Young, Frederic; Siegel, Edward

    Cook-Levin theorem algorithmic computational-complexity (C-C) algorithmic-equivalence reducibility/completeness equivalence to renormalization-(semi)-group phase-transitions critical-phenomena statistical-physics universality-classes fixed-points, is exploited via Siegel FUZZYICS = CATEGORYICS = ANALOGYICS = PRAGMATYICS/CATEGORY-SEMANTICS ONTOLOGY COGNITION ANALYTICS-Aristotle ``square-of-opposition'' tabular list-format truth-table matrix analytics predicts and implements ``noise''-induced phase-transitions (NITs) to accelerate versus to decelerate Harel [Algorithmics (1987)]-Sipser [Intro. Thy. Computation ('97)] algorithmic C-C: ``NIT-picking''(!!!), to optimize optimization-problems optimally (OOPO). Versus iso-``noise'' power-spectrum quantitative-only amplitude/magnitude-only variation stochastic-resonance, ``NIT-picking'' is ``noise'' power-spectrum QUALitative-type variation via quantitative critical-exponents variation. Computer-``science''/SEANCE algorithmic C-C models: Turing-machine, finite-state-models, finite-automata, ..., discrete-maths graph-theory equivalence to physics Feynman-diagrams are identified as early-days once-workable valid but limiting IMPEDING CRUTCHES(!!!), which ONLY IMPEDE latter-days new-insights!!!

  2. Efficient classical simulation of the Deutsch-Jozsa and Simon's algorithms

    NASA Astrophysics Data System (ADS)

    Johansson, Niklas; Larsson, Jan-Åke

    2017-09-01

    A long-standing aim of quantum information research is to understand what gives quantum computers their advantage. This requires separating problems that need genuinely quantum resources from those for which classical resources are enough. Two examples of quantum speed-up are the Deutsch-Jozsa and Simon's problem, both efficiently solvable on a quantum Turing machine, and both believed to lack efficient classical solutions. Here we present a framework that can simulate both quantum algorithms efficiently, solving the Deutsch-Jozsa problem with probability 1 using only one oracle query, and Simon's problem using linearly many oracle queries, just as expected of an ideal quantum computer. The presented simulation framework is in turn efficiently simulatable in a classical probabilistic Turing machine. This shows that the Deutsch-Jozsa and Simon's problem do not require any genuinely quantum resources, and that the quantum algorithms show no speed-up when compared with their corresponding classical simulation. Finally, this gives insight into what properties are needed in the two algorithms and calls for further study of oracle separation between quantum and classical computation.
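
    For context on the query counts this record compares, the classical deterministic cost of the Deutsch-Jozsa promise problem can be sketched as follows. This shows the standard worst-case deterministic bound, not the simulation framework of the paper; the example oracles are assumptions.

    ```python
    # Classical query complexity of the Deutsch-Jozsa promise problem (sketch).
    # f maps n-bit inputs to {0, 1} and is promised to be either constant or
    # balanced. A deterministic classical algorithm needs 2**(n-1) + 1 queries
    # in the worst case; the quantum algorithm needs one oracle query.
    # The example oracles below are illustrative assumptions.

    def classify(f, n):
        """Return 'constant' or 'balanced', querying f at most 2**(n-1)+1 times."""
        first = f(0)
        for x in range(1, 2 ** (n - 1) + 1):
            if f(x) != first:
                return "balanced"   # two differing values rule out "constant"
        # 2**(n-1)+1 equal values exceed the 2**(n-1) a balanced f could supply.
        return "constant"

    n = 4
    balanced_f = lambda x: x & 1    # parity of the lowest bit: balanced
    constant_f = lambda x: 1        # always 1: constant
    ```

    The correctness argument is in the final comment: a balanced function takes each value on exactly half of the 2**n inputs, so seeing 2**(n-1)+1 identical answers is only possible for a constant function.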

  3. Niépce-Bell or Turing: how to test odour reproduction.

    PubMed

    Harel, David

    2016-12-01

    Decades before the existence of anything resembling an artificial intelligence system, Alan Turing raised the question of how to test whether machines can think, or, in modern terminology, whether a computer claimed to exhibit intelligence indeed does so. This paper raises the analogous issue for olfaction: how to test the validity of a system claimed to reproduce arbitrary odours artificially, in a way recognizable to humans. Although odour reproduction systems are still far from being viable, the question of how to test candidates thereof is claimed to be interesting and non-trivial, and a novel method is proposed. Despite the similarity between the two questions and their surfacing long before the tested systems exist, the present question cannot be answered adequately by a Turing-like method. Instead, our test is very different: it is conditional, requiring from the artificial no more than is required from the original, and it employs a novel method of immersion that takes advantage of the availability of easily recognizable reproduction methods for sight and sound, a la Nicéphore Niépce and Alexander Graham Bell. © 2016 The Authors.

  4. Niépce–Bell or Turing: how to test odour reproduction

    PubMed Central

    2016-01-01

    Decades before the existence of anything resembling an artificial intelligence system, Alan Turing raised the question of how to test whether machines can think, or, in modern terminology, whether a computer claimed to exhibit intelligence indeed does so. This paper raises the analogous issue for olfaction: how to test the validity of a system claimed to reproduce arbitrary odours artificially, in a way recognizable to humans. Although odour reproduction systems are still far from being viable, the question of how to test candidates thereof is claimed to be interesting and non-trivial, and a novel method is proposed. Despite the similarity between the two questions and their surfacing long before the tested systems exist, the present question cannot be answered adequately by a Turing-like method. Instead, our test is very different: it is conditional, requiring from the artificial no more than is required from the original, and it employs a novel method of immersion that takes advantage of the availability of easily recognizable reproduction methods for sight and sound, a la Nicéphore Niépce and Alexander Graham Bell. PMID:28003527

  5. Programming in Polygon R&D: Explorations with a Spatial Language II

    ERIC Educational Resources Information Center

    Morey, Jim

    2006-01-01

    This paper introduces the language associated with a polygon microworld called Polygon R&D, which has the mathematical crispness of Logo and has the discreteness and simplicity of a Turing machine. In this microworld, polygons serve two purposes: as agents (similar to the turtles in Logo), and as data (landmarks in the plane). Programming the…

  6. Implications of the Turing machine model of computation for processor and programming language design

    NASA Astrophysics Data System (ADS)

    Hunter, Geoffrey

    2004-01-01

    A computational process is classified according to the theoretical model that is capable of executing it; computational processes that require a non-predeterminable amount of intermediate storage for their execution are Turing-machine (TM) processes, while those whose storage is predeterminable are Finite Automaton (FA) processes. Simple processes (such as a traffic-light controller) are executable by a Finite Automaton, whereas the most general kind of computation requires a Turing Machine for its execution. This implies that a TM process must have a non-predeterminable amount of memory allocated to it at intermediate instants of its execution; i.e. dynamic memory allocation. Many processes encountered in practice are TM processes. The implication for computational practice is that the hardware (CPU) architecture and its operating system must facilitate dynamic memory allocation, and that the programming language used to specify TM processes must have statements with the semantic attribute of dynamic memory allocation, for in Alan Turing's thesis on computation (1936) the "standard description" of a process is invariant over the most general data that the process is designed to process; i.e. the program describing the process should never have to be modified to allow for differences in the data that is to be processed in different instantiations; i.e. data-invariant programming. Any non-trivial program is partitioned into sub-programs (procedures, subroutines, functions, modules, etc). Examination of the calls/returns between the subprograms reveals that they are nodes in a tree-structure; this tree-structure is independent of the programming language used to encode (define) the process. 
    Each sub-program typically needs some memory for its own use (to store values intermediate between its received data and its computed results); this locally required memory is not needed before the subprogram commences execution, and it is not needed after its execution terminates; it may be allocated as its execution commences, and deallocated as its execution terminates, and if the amount of this local memory is not known until just before execution commencement, then it is essential that it be allocated dynamically as the first action of its execution. This dynamically allocated/deallocated storage of each subprogram's intermediate values conforms with the stack discipline; i.e. last allocated = first to be deallocated, an incidental benefit of which is automatic overlaying of variables. This stack-based dynamic memory allocation was a semantic implication of the nested block structure that originated in the ALGOL-60 programming language. ALGOL-60 was a TM language, because the amount of memory allocated on subprogram (block/procedure) entry (for arrays, etc) was computable at execution time. A more general requirement of a Turing machine process is code generation at run-time; this mandates access to the source language processor (compiler/interpreter) during execution of the process. This fundamental aspect of computer science is important to the future of system design, because it has been overlooked throughout the 55 years since modern computing began in 1948. The popular computer systems of this first half-century of computing were constrained by compile-time (or even operating-system boot-time) memory allocation, and were thus limited to executing FA processes. 
    The practical effect was that the distinction between the data-invariant program and its variable data was blurred; programmers had to make trial-and-error executions, modifying the program's compile-time constants (array dimensions) to iterate towards the values required at run-time by the data being processed. This era of trial-and-error computing still persists; it pervades the culture of current (2003) computing practice.
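
    The stack discipline this record describes (last allocated = first deallocated, with each subprogram's storage size decided only at call time) can be sketched as a toy allocator. This is a minimal illustration; the class, capacity, and frame sizes are assumptions, not from the paper.

    ```python
    # Toy model of the stack discipline described above: each subprogram's local
    # storage is allocated as it starts and freed as it returns, last-in first-out.
    # Sizes are decided at call time (run time), not at compile time.

    class StackAllocator:
        def __init__(self, capacity):
            self.memory = bytearray(capacity)
            self.top = 0                      # next free offset
            self.frames = []                  # offsets of live allocations

        def allocate(self, size):
            if self.top + size > len(self.memory):
                raise MemoryError("stack overflow")
            offset = self.top
            self.frames.append(offset)
            self.top += size
            return offset

        def deallocate(self):
            # Only the most recent allocation may be freed (LIFO).
            self.top = self.frames.pop()

    alloc = StackAllocator(1024)
    outer = alloc.allocate(100)   # caller's locals
    inner = alloc.allocate(50)    # callee's locals, size known only at call time
    alloc.deallocate()            # callee returns first...
    alloc.deallocate()            # ...then the caller
    ```

    Because frees happen in reverse allocation order, the freed region is always at the top of the stack, which is what gives the "automatic overlaying of variables" mentioned in the abstract.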

  7. Learning Computer Programming: Implementing a Fractal in a Turing Machine

    ERIC Educational Resources Information Center

    Pereira, Hernane B. de B.; Zebende, Gilney F.; Moret, Marcelo A.

    2010-01-01

    It is common to start a course on computer programming logic by teaching the algorithm concept from the point of view of natural languages, but in a schematic way. In this sense we note that the students have difficulties in understanding and implementation of the problems proposed by the teacher. The main idea of this paper is to show that the…

  8. Undecidability in macroeconomics

    NASA Technical Reports Server (NTRS)

    Chandra, Siddharth; Chandra, Tushar Deepak

    1993-01-01

    In this paper we study the difficulty of solving problems in economics. For this purpose, we adopt the notion of undecidability from recursion theory. We show that certain problems in economics are undecidable, i.e., cannot be solved by a Turing Machine, a device that is at least as powerful as any computational device that can be constructed. In particular, we prove that even in finite closed economies subject to a variable initial condition, in which a social planner knows the behavior of every agent in the economy, certain important social planning problems are undecidable. Thus, it may be impossible to make effective policy decisions. Philosophically, this result formally brings into question the Rational Expectations Hypothesis which assumes that each agent is able to determine what it should do if it wishes to maximize its utility. We show that even when an optimal rational forecast exists for each agency (based on the information currently available to it), agents may lack the ability to make these forecasts. For example, Lucas describes economic models as 'mechanical, artificial world(s), populated by ... interacting robots'. Since any mechanical robot can be at most as computationally powerful as a Turing Machine, such economies are vulnerable to the phenomenon of undecidability.
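
    As context for the undecidability result above: halting within a fixed step bound is decidable by direct simulation, and undecidability only arises when no bound is given. A minimal Turing-machine runner illustrates the bounded case; the bit-flipping example machine is an assumption for illustration.

    ```python
    # The halting problem is undecidable, but "does this machine halt within k
    # steps?" is decidable by direct simulation. A minimal Turing-machine runner;
    # the example machine (flip bits until a blank is read) is an assumption.

    def run_tm(transitions, tape, max_steps):
        """transitions: (state, symbol) -> (new_state, new_symbol, move).
        Returns the tape if the machine reaches 'halt' within max_steps, else None."""
        tape = dict(enumerate(tape))
        state, head = "start", 0
        for _ in range(max_steps):
            if state == "halt":
                return [tape[i] for i in sorted(tape)]
            symbol = tape.get(head, "_")      # '_' is the blank symbol
            state, tape[head], move = transitions[(state, symbol)]
            head += 1 if move == "R" else -1
        return None                           # did not halt within the bound

    flip = {
        ("start", "0"): ("start", "1", "R"),
        ("start", "1"): ("start", "0", "R"),
        ("start", "_"): ("halt", "_", "R"),
    }
    result = run_tm(flip, "0110", 100)   # flips every bit, then halts on the blank
    ```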

  9. Decision theory with resource-bounded agents.

    PubMed

    Halpern, Joseph Y; Pass, Rafael; Seeman, Lior

    2014-04-01

    There have been two major lines of research aimed at capturing resource-bounded players in game theory. The first, initiated by Rubinstein (), charges an agent for doing costly computation; the second, initiated by Neyman (), does not charge for computation, but limits the computation that agents can do, typically by modeling agents as finite automata. We review recent work on applying both approaches in the context of decision theory. For the first approach, we take the objects of choice in a decision problem to be Turing machines, and charge players for the "complexity" of the Turing machine chosen (e.g., its running time). This approach can be used to explain well-known phenomena like first-impression-matters biases (i.e., people tend to put more weight on evidence they hear early on) and belief polarization (two people with different prior beliefs, hearing the same evidence, can end up with diametrically opposed conclusions) as the outcomes of quite rational decisions. For the second approach, we model people as finite automata, and provide a simple algorithm that, on a problem that captures a number of settings of interest, provably performs optimally as the number of states in the automaton increases. Copyright © 2014 Cognitive Science Society, Inc.

  10. Quantum Iterative Deepening with an Application to the Halting Problem

    PubMed Central

    Tarrataca, Luís; Wichert, Andreas

    2013-01-01

    Classical models of computation traditionally resort to halting schemes in order to enquire about the state of a computation. In such schemes, a computational process is responsible for signaling an end of a calculation by setting a halt bit, which needs to be systematically checked by an observer. The capacity of quantum computational models to operate on a superposition of states requires an alternative approach. From a quantum perspective, any measurement of an equivalent halt qubit would have the potential to inherently interfere with the computation by provoking a random collapse amongst the states. This issue is exacerbated by undecidable problems such as the Entscheidungsproblem which require universal computational models, e.g. the classical Turing machine, to be able to proceed indefinitely. In this work we present an alternative view of quantum computation based on production system theory in conjunction with Grover's amplitude amplification scheme that allows for (1) a detection of halt states without interfering with the final result of a computation; (2) the possibility of non-terminating computation and (3) an inherent speedup to occur during computations susceptible of parallelization. We discuss how such a strategy can be employed in order to simulate classical Turing machines. PMID:23520465

  11. Rediscovery of Good-Turing estimators via Bayesian nonparametrics.

    PubMed

    Favaro, Stefano; Nipoti, Bernardo; Teh, Yee Whye

    2016-03-01

    The problem of estimating discovery probabilities originated in the context of statistical ecology, and in recent years it has become popular due to its frequent appearance in challenging applications arising in genetics, bioinformatics, linguistics, designs of experiments, machine learning, etc. A full range of statistical approaches, parametric and nonparametric as well as frequentist and Bayesian, has been proposed for estimating discovery probabilities. In this article, we investigate the relationships between the celebrated Good-Turing approach, which is a frequentist nonparametric approach developed in the 1940s, and a Bayesian nonparametric approach recently introduced in the literature. Specifically, under the assumption of a two parameter Poisson-Dirichlet prior, we show that Bayesian nonparametric estimators of discovery probabilities are asymptotically equivalent, for a large sample size, to suitably smoothed Good-Turing estimators. As a by-product of this result, we introduce and investigate a methodology for deriving exact and asymptotic credible intervals to be associated with the Bayesian nonparametric estimators of discovery probabilities. The proposed methodology is illustrated through a comprehensive simulation study and the analysis of Expressed Sequence Tags data generated by sequencing a benchmark complementary DNA library. © 2015, The International Biometric Society.
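
    The core Good-Turing idea behind this record can be sketched in a few lines: the estimated probability that the next observation is a previously unseen species is n1/N, where n1 counts species seen exactly once and N is the sample size. The sample data below are an illustrative assumption.

    ```python
    # Good-Turing estimate of the "missing mass" (sketch): the probability that
    # the next observation is a previously unseen species is n1/N, where n1 is
    # the number of species seen exactly once and N is the sample size.
    # The sample below is an illustrative assumption.

    from collections import Counter

    def unseen_mass(sample):
        counts = Counter(sample)
        n1 = sum(1 for c in counts.values() if c == 1)   # singleton species
        return n1 / len(sample)

    sample = ["a", "a", "a", "b", "b", "c", "d", "e", "f", "g"]
    # Five species (c, d, e, f, g) occur exactly once in ten draws, so the
    # estimated probability of discovering a new species next is 5/10 = 0.5.
    ```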

  12. One Dimensional Turing-Like Handshake Test for Motor Intelligence

    PubMed Central

    Karniel, Amir; Avraham, Guy; Peles, Bat-Chen; Levy-Tzedek, Shelly; Nisky, Ilana

    2010-01-01

    In the Turing test, a computer model is deemed to "think intelligently" if it can generate answers that are not distinguishable from those of a human. However, this test is limited to the linguistic aspects of machine intelligence. A salient function of the brain is the control of movement, and the movement of the human hand is a sophisticated demonstration of this function. Therefore, we propose a Turing-like handshake test, for machine motor intelligence. We administer the test through a telerobotic system in which the interrogator is engaged in a task of holding a robotic stylus and interacting with another party (human or artificial). Instead of asking the interrogator whether the other party is a person or a computer program, we employ a two-alternative forced choice method and ask which of two systems is more human-like. We extract a quantitative grade for each model according to its resemblance to the human handshake motion and name it "Model Human-Likeness Grade" (MHLG). We present three methods to estimate the MHLG. (i) By calculating the proportion of subjects' answers that the model is more human-like than the human; (ii) By comparing two weighted sums of human and model handshakes we fit a psychometric curve and extract the point of subjective equality (PSE); (iii) By comparing a given model with a weighted sum of human and random signal, we fit a psychometric curve to the answers of the interrogator and extract the PSE for the weight of the human in the weighted sum. Altogether, we provide a protocol to test computational models of the human handshake. We believe that building a model is a necessary step in understanding any phenomenon and, in this case, in understanding the neural mechanisms responsible for the generation of the human handshake. PMID:21206462

  13. Low-Resistance Spin Injection into Silicon Using Graphene Tunnel Barriers

    DTIC Science & Technology

    2012-11-01

    compromise spin injection/transport/detection. Ferromagnetic metals readily form silicides even at room temperature, and diffusion of the ferromagnetic... metal/tunnel barrier/Si contacts using 2 nm SiO2 (triangles), 1.5 nm Al2O3 (diamond) and monolayer graphene (circles) tunnel barriers prepared from... and B. T. Jonker* Spin manipulation in a semiconductor offers a new paradigm for device operation beyond Moore's law. Ferromagnetic metals are ideal

  14. Programmable and autonomous computing machine made of biomolecules

    PubMed Central

    Benenson, Yaakov; Paz-Elizur, Tamar; Adar, Rivka; Keinan, Ehud; Livneh, Zvi; Shapiro, Ehud

    2013-01-01

    Devices that convert information from one form into another according to a definite procedure are known as automata. One such hypothetical device is the universal Turing machine, which stimulated work leading to the development of modern computers. The Turing machine and its special cases, including finite automata, operate by scanning a data tape, whose striking analogy to information-encoding biopolymers inspired several designs for molecular DNA computers. Laboratory-scale computing using DNA and human-assisted protocols has been demonstrated, but the realization of computing devices operating autonomously on the molecular scale remains rare. Here we describe a programmable finite automaton comprising DNA and DNA-manipulating enzymes that solves computational problems autonomously. The automaton's hardware consists of a restriction nuclease and ligase, the software and input are encoded by double-stranded DNA, and programming amounts to choosing appropriate software molecules. Upon mixing solutions containing these components, the automaton processes the input molecule via a cascade of restriction, hybridization and ligation cycles, producing a detectable output molecule that encodes the automaton's final state, and thus the computational result. In our implementation 10^12 automata sharing the same software run independently and in parallel on inputs (which could, in principle, be distinct) in 120 μl solution at room temperature at a combined rate of 10^9 transitions per second with a transition fidelity greater than 99.8%, consuming less than 10^-10 W. PMID:11719800
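
    In software terms, the molecular device in this record realizes an ordinary finite automaton. A sketch of such an automaton follows; the "even number of b symbols" language is a standard two-state textbook example chosen for illustration, not claimed to be one of the record's actual programs.

    ```python
    # Generic finite-automaton runner (sketch). The example language, "strings
    # over {a, b} with an even number of b's", is a standard two-state example
    # and an illustrative assumption, not taken from the record.

    def run_automaton(transitions, accepting, word, start="S0"):
        """Scan the word symbol by symbol; accept if the final state is accepting."""
        state = start
        for symbol in word:
            state = transitions[(state, symbol)]
        return state in accepting

    even_b = {
        ("S0", "a"): "S0", ("S0", "b"): "S1",
        ("S1", "a"): "S1", ("S1", "b"): "S0",
    }
    result = run_automaton(even_b, {"S0"}, "ababb")   # three b's: rejected
    ```

    The molecular implementation encodes the transition table in "software" DNA molecules instead of a Python dict, but the state-scanning logic is the same.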

  15. Annual Progress Report for July 1, 1981 through June 30, 1982,

    DTIC Science & Technology

    1982-08-01

    Online Search Service .....................93 14.5 Database Analyses .........................................93 14.6 Automatic Detection of... D. Dow, "Deformation potentials of superlattices and interfaces," Journal of Vacuum Science and Technology, vol. 19, pp. 564-566, 1981. 4.17 J. D. Oberstar, ...cience, vol. 15, no. 3, pp. 311-320, Sept. 1981. 12.11 M. C. Loi, "Simulations among multidimensional Turing machines," Theoretical Computer Science (to

  16. Expert Systems Development Methodology

    DTIC Science & Technology

    1989-07-28

    application. Chapter 9, Design and Prototyping, discusses the problems of designing the user interface and other characteristics of the ES and the prototyping...severely in question as to whether they were computable. In order to work with this problem, Turing created what he called the universal machine. These...about the theory of computers and their relationship to problem solving. It was here at Princeton that he first began to experiment directly with

  17. Cellular automata in photonic cavity arrays.

    PubMed

    Li, Jing; Liew, T C H

    2016-10-31

    We propose theoretically a photonic Turing machine based on cellular automata in arrays of nonlinear cavities coupled with artificial gauge fields. The state of the system is recorded making use of the bistability of driven cavities, in which losses are fully compensated by an external continuous drive. The sequential update of the automaton layers is achieved automatically, by the local switching of bistable states, without requiring any additional synchronization or temporal control.
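As a software analogue of the layer-by-layer automaton update described above, here is a minimal elementary cellular automaton with synchronous updates on a ring. The rule number and topology are illustrative assumptions, not the photonic cavity implementation.

```python
def step(cells, rule=110):
    """One synchronous update of an elementary cellular automaton.
    Each cell's next state depends on its left/self/right neighbourhood,
    analogous to the sequential layer updates in the cavity array."""
    n = len(cells)
    out = []
    for i in range(n):
        # Encode the 3-cell neighbourhood as an index 0..7 (ring topology).
        neigh = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        # The rule number's bits give the next state for each neighbourhood.
        out.append((rule >> neigh) & 1)
    return out

# Evolve a single seed cell for a few generations.
row = [0] * 10 + [1] + [0] * 10
for _ in range(5):
    row = step(row)
```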

  18. Darwin's "strange inversion of reasoning".

    PubMed

    Dennett, Daniel

    2009-06-16

    Darwin's theory of evolution by natural selection unifies the world of physics with the world of meaning and purpose by proposing a deeply counterintuitive "inversion of reasoning" (according to a 19th century critic): "to make a perfect and beautiful machine, it is not requisite to know how to make it" [MacKenzie RB (1868) (Nisbet & Co., London)]. Turing proposed a similar inversion: to be a perfect and beautiful computing machine, it is not requisite to know what arithmetic is. Together, these ideas help to explain how we human intelligences came to be able to discern the reasons for all of the adaptations of life, including our own.

  19. Cooperating with machines.

    PubMed

    Crandall, Jacob W; Oudah, Mayada; Tennom; Ishowo-Oloko, Fatimah; Abdallah, Sherief; Bonnefon, Jean-François; Cebrian, Manuel; Shariff, Azim; Goodrich, Michael A; Rahwan, Iyad

    2018-01-16

    Since Alan Turing envisioned artificial intelligence, technical progress has often been measured by the ability to defeat humans in zero-sum encounters (e.g., Chess, Poker, or Go). Less attention has been given to scenarios in which human-machine cooperation is beneficial but non-trivial, such as scenarios in which human and machine preferences are neither fully aligned nor fully in conflict. Cooperation does not require sheer computational power, but instead is facilitated by intuition, cultural norms, emotions, signals, and pre-evolved dispositions. Here, we develop an algorithm that combines a state-of-the-art reinforcement-learning algorithm with mechanisms for signaling. We show that this algorithm can cooperate with people and other algorithms at levels that rival human cooperation in a variety of two-player repeated stochastic games. These results indicate that general human-machine cooperation is achievable using a non-trivial, but ultimately simple, set of algorithmic mechanisms.

  20. Translations on USSR Science and Technology, Physical Sciences and Technology, Number 16

    DTIC Science & Technology

    1977-08-05

    34INVESTIGATION OF SPLITTING OF LIGHT NUCLEI WITH HIGH-ENERGY y -RAYS WITH THE METHOD OF WILSON’S CHAMBER OPERATING IN POWERFUL BEAMS OF ELECTRONIC...boast high reliability, high speed, and extremely modest power requirements. Information oh the Screen Visual display devices greatly facilitate...area of application of these units Includes navigation, control of power systems, machine tools, and manufac- turing processes. Th» ^»abilities of

  1. Tactics for mechanized reasoning: a commentary on Milner (1984) ‘The use of machines to assist in rigorous proof’

    PubMed Central

    Gordon, M. J. C.

    2015-01-01

    Robin Milner's paper, ‘The use of machines to assist in rigorous proof’, introduces methods for automating mathematical reasoning that are a milestone in the development of computer-assisted theorem proving. His ideas, particularly his theory of tactics, revolutionized the architecture of proof assistants. His methodology for automating rigorous proof soundly, particularly his theory of type polymorphism in programming, led to major contributions to the theory and design of programming languages. His citation for the 1991 ACM A.M. Turing award, the most prestigious award in computer science, credits him with, among other achievements, ‘probably the first theoretically based yet practical tool for machine assisted proof construction’. This commentary was written to celebrate the 350th anniversary of the journal Philosophical Transactions of the Royal Society. PMID:25750147

  2. Cognitive evaluation for the diagnosis of Alzheimer's disease based on Turing Test and Virtual Environments.

    PubMed

    Fernandez Montenegro, Juan Manuel; Argyriou, Vasileios

    2017-05-01

    Alzheimer's screening tests are commonly used by doctors to diagnose the patient's condition and stage as early as possible. Most of these tests are based on pen-paper interaction and do not embrace the advantages provided by new technologies. This paper proposes novel Alzheimer's screening tests based on virtual environments and game principles using new immersive technologies combined with advanced Human Computer Interaction (HCI) systems. These new tests are focused on the immersion of the patient in a virtual room, in order to mislead and deceive the patient's mind. In addition, we propose two novel variations of the Turing Test, proposed by Alan Turing, as a method to detect dementia. As a result, four tests are introduced demonstrating the wide range of screening mechanisms that could be designed using virtual environments and game concepts. The proposed tests are focused on the evaluation of memory loss related to common objects, recent conversations and events; the diagnosis of problems in expressing and understanding language; the ability to recognize abnormalities; and to differentiate between virtual worlds and reality, or humans and machines. The proposed screening tests were evaluated and tested using both patients and healthy adults in a comparative study with state-of-the-art Alzheimer's screening tests. The results show the capacity of the new tests to distinguish healthy people from Alzheimer's patients. Copyright © 2017. Published by Elsevier Inc.

  3. Quantum Computing since Democritus

    NASA Astrophysics Data System (ADS)

    Aaronson, Scott

    2013-03-01

    1. Atoms and the void; 2. Sets; 3. Gödel, Turing, and friends; 4. Minds and machines; 5. Paleocomplexity; 6. P, NP, and friends; 7. Randomness; 8. Crypto; 9. Quantum; 10. Quantum computing; 11. Penrose; 12. Decoherence and hidden variables; 13. Proofs; 14. How big are quantum states?; 15. Skepticism of quantum computing; 16. Learning; 17. Interactive proofs and more; 18. Fun with the Anthropic Principle; 19. Free will; 20. Time travel; 21. Cosmology and complexity; 22. Ask me anything.

  4. A Non-Verbal Turing Test: Differentiating Mind from Machine in Gaze-Based Social Interaction

    PubMed Central

    Pfeiffer, Ulrich J.; Timmermans, Bert; Bente, Gary; Vogeley, Kai; Schilbach, Leonhard

    2011-01-01

    In social interaction, gaze behavior provides important signals that have a significant impact on our perception of others. Previous investigations, however, have relied on paradigms in which participants are passive observers of other persons’ gazes and do not adjust their gaze behavior as is the case in real-life social encounters. We used an interactive eye-tracking paradigm that allows participants to interact with an anthropomorphic virtual character whose gaze behavior is responsive to where the participant looks on the stimulus screen in real time. The character’s gaze reactions were systematically varied along a continuum from a maximal probability of gaze aversion to a maximal probability of gaze-following during brief interactions, thereby varying contingency and congruency of the reactions. We investigated how these variations influenced whether participants believed that the character was controlled by another person (i.e., a confederate) or a computer program. In a series of experiments, the human confederate was either introduced as naïve to the task, cooperative, or competitive. Results demonstrate that the ascription of humanness increases with higher congruency of gaze reactions when participants are interacting with a naïve partner. In contrast, humanness ascription is driven by the degree of contingency irrespective of congruency when the confederate was introduced as cooperative. Conversely, during interaction with a competitive confederate, judgments were neither based on congruency nor on contingency. These results offer important insights into what renders the experience of an interaction truly social: Humans appear to have a default expectation of reciprocation that can be influenced drastically by the presumed disposition of the interactor to either cooperate or compete. PMID:22096599

  5. Optical selectionist approach to optical connectionist systems

    NASA Astrophysics Data System (ADS)

    Caulfield, H. John

    1994-03-01

    Two broad approaches to computing are known - connectionist (which includes Turing Machines but is demonstrably more powerful) and selectionist. Human computer engineers tend to prefer the connectionist approach which includes neural networks. Nature uses both but may show an overall preference for selectionism. "Looking back into the history of biology, it appears that whenever a phenomenon resembles learning, an instructive theory was first proposed to account for the underlying mechanisms. In every case, this was later replaced by a selective theory." - N. K. Jerne, Nobelist in Immunology.

  6. Tradeoffs in the design of a system for high level language interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osorio, F.C.C.; Patt, Y.N.

    The problem of designing a system for high-level language interpretation (HLLI) is considered. First, a model of the design process is presented in which several styles of design, e.g. Turing machine interpretation, CISC architecture interpretation and RISC architecture interpretation, are treated uniformly. Second, the most significant characteristics of HLLI are analysed in the context of different design styles, and some guidelines are presented on how to identify the most suitable design style for a given high-level language problem. 12 references.

  7. Paradigms for machine learning

    NASA Technical Reports Server (NTRS)

    Schlimmer, Jeffrey C.; Langley, Pat

    1991-01-01

    Five paradigms are described for machine learning: connectionist (neural network) methods, genetic algorithms and classifier systems, empirical methods for inducing rules and decision trees, analytic learning methods, and case-based approaches. Some dimensions along which these paradigms vary in their approach to learning are considered, and the basic methods used within each framework are reviewed, together with open research issues. It is argued that the similarities among the paradigms are more important than their differences, and that future work should attempt to bridge the existing boundaries. Finally, some recent developments in the field of machine learning are discussed, and their impact on both research and applications is examined.

  8. Rosen's (M,R) system as an X-machine.

    PubMed

    Palmer, Michael L; Williams, Richard A; Gatherer, Derek

    2016-11-07

    Robert Rosen's (M,R) system is an abstract biological network architecture that is allegedly both irreducible to sub-models of its component states and non-computable on a Turing machine. (M,R) stands as an obstacle to both reductionist and mechanistic presentations of systems biology, principally due to its self-referential structure. If (M,R) has the properties claimed for it, computational systems biology will not be possible, or at best will be a science of approximate simulations rather than accurate models. Several attempts have been made, at both empirical and theoretical levels, to disprove this assertion by instantiating (M,R) in software architectures. So far, these efforts have been inconclusive. In this paper, we attempt to demonstrate why - by showing how both finite state machine and stream X-machine formal architectures fail to capture the self-referential requirements of (M,R). We then show that a solution may be found in communicating X-machines, which remove self-reference using parallel computation, and then synthesise such machine architectures with object-orientation to create a formal basis for future software instantiations of (M,R) systems. Copyright © 2016 Elsevier Ltd. All rights reserved.
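For readers unfamiliar with the formalism, a stream X-machine is, roughly, a finite state machine whose transitions apply processing functions to a shared memory. The toy sketch below illustrates only that structure; the class name, arc labels, and the counter example are all hypothetical, and it does not implement Rosen's (M,R) system or the communicating X-machines proposed in the paper.

```python
# Minimal stream X-machine sketch: an FSM whose arcs carry functions
# phi(memory, input) -> (output, new_memory), acting on a shared memory.
class StreamXMachine:
    def __init__(self, start, memory):
        self.state, self.memory = start, memory
        self.arcs = {}  # (state, label) -> (next_state, phi)

    def add_arc(self, state, label, next_state, phi):
        self.arcs[(state, label)] = (next_state, phi)

    def step(self, label, inp):
        next_state, phi = self.arcs[(self.state, label)]
        out, self.memory = phi(self.memory, inp)
        self.state = next_state
        return out

# Toy example: accumulate numbers in memory until a "stop" transition.
m = StreamXMachine("run", memory=0)
m.add_arc("run", "add", "run", lambda mem, x: (mem + x, mem + x))
m.add_arc("run", "stop", "done", lambda mem, x: (mem, mem))
for x in (1, 2, 3):
    m.step("add", x)
print(m.step("stop", None), m.state)  # 6 done
```

The separation between control states and the memory-updating functions is the feature the paper exploits when moving from finite state machines to (communicating) X-machines.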

  9. Square Turing patterns in reaction-diffusion systems with coupled layers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jing; Wang, Hongli, E-mail: hlwang@pku.edu.cn, E-mail: qi@pku.edu.cn; Center for Quantitative Biology, Peking University, Beijing 100871

    Square Turing patterns are usually unstable in reaction-diffusion systems and are rarely observed in corresponding experiments and simulations. We report here an example of spontaneous formation of square Turing patterns with the Lengyel-Epstein model of two coupled layers. The squares are found to be a result of the resonance between two supercritical Turing modes with an appropriate ratio. Moreover, the spatiotemporal resonance of Turing modes resembles the mode-locking phenomenon. Analysis of the general amplitude equations for square patterns reveals that the fixed point corresponding to square Turing patterns is stationary when the parameters adopt appropriate values.

  10. Establishing consciousness in non-communicative patients: a modern-day version of the Turing test.

    PubMed

    Stins, John F

    2009-03-01

    In a recent study of a patient in a persistent vegetative state, [Owen, A. M., Coleman, M. R., Boly, M., Davis, M. H., Laureys, S., & Pickard, J. D. (2006). Detecting awareness in the vegetative state. Science, 313, 1402] claimed that they had demonstrated the presence of consciousness in this patient. This bold conclusion was based on the isomorphy between brain activity in this patient and a set of conscious control subjects, obtained in various imagery tasks. However, establishing consciousness in unresponsive patients is fraught with methodological and conceptual difficulties. The aim of this paper is to demonstrate that the current debate surrounding consciousness in VS patients has parallels in the artificial intelligence (AI) debate as to whether machines can think. Basically, Owen et al. (2006) used a method analogous to the Turing test to reveal the presence of consciousness, whereas their adversaries adopted a line of reasoning akin to Searle's Chinese room argument. Highlighting the correspondence between these two debates can help to clarify the issues surrounding consciousness in non-communicative agents.

  11. Paradigms for Realizing Machine Learning Algorithms.

    PubMed

    Agneeswaran, Vijay Srinivas; Tonpay, Pranay; Tiwary, Jayati

    2013-12-01

    The article explains the three generations of machine learning algorithms-with all three trying to operate on big data. The first generation tools are SAS, SPSS, etc., while second generation realizations include Mahout and RapidMiner (that work over Hadoop), and the third generation paradigms include Spark and GraphLab, among others. The essence of the article is that for a number of machine learning algorithms, it is important to look beyond the Hadoop's Map-Reduce paradigm in order to make them work on big data. A number of promising contenders have emerged in the third generation that can be exploited to realize deep analytics on big data.

  12. Computing exponentially faster: implementing a non-deterministic universal Turing machine using DNA

    PubMed Central

    Currin, Andrew; Korovin, Konstantin; Ababi, Maria; Roper, Katherine; Kell, Douglas B.; Day, Philip J.

    2017-01-01

    The theory of computer science is based around universal Turing machines (UTMs): abstract machines able to execute all possible algorithms. Modern digital computers are physical embodiments of classical UTMs. For the most important class of problem in computer science, non-deterministic polynomial complete problems, non-deterministic UTMs (NUTMs) are theoretically exponentially faster than both classical UTMs and quantum mechanical UTMs (QUTMs). However, no attempt has previously been made to build an NUTM, and their construction has been regarded as impossible. Here, we demonstrate the first physical design of an NUTM. This design is based on Thue string rewriting systems, and thereby avoids the limitations of most previous DNA computing schemes: all the computation is local (simple edits to strings) so there is no need for communication, and there is no need to order operations. The design exploits DNA's ability to replicate to execute an exponential number of computational paths in P time. Each Thue rewriting step is embodied in a DNA edit implemented using a novel combination of polymerase chain reactions and site-directed mutagenesis. We demonstrate that the design works using both computational modelling and in vitro molecular biology experimentation: the design is thermodynamically favourable, microprogramming can be used to encode arbitrary Thue rules, all classes of Thue rule can be implemented, and rule implementation is non-deterministic. In an NUTM, the resource limitation is space, which contrasts with classical UTMs and QUTMs where it is time. This fundamental difference enables an NUTM to trade space for time, which is significant for both theoretical computer science and physics. It is also of practical importance, for to quote Richard Feynman ‘there's plenty of room at the bottom’. This means that a desktop DNA NUTM could potentially utilize more processors than all the electronic computers in the world combined, and thereby outperform the world's current fastest supercomputer, while consuming a tiny fraction of its energy. PMID:28250099
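The parallel exploration of rewriting paths can be sketched sequentially as a breadth-first search over a Thue rule set. The rules below are a hypothetical toy system for illustration, not the microprogrammed DNA edits of the paper, and the depth bound is an assumption.

```python
def reachable(start, target, rules, max_steps=10):
    """Breadth-first exploration of all Thue rewriting paths up to a
    depth bound. The DNA NUTM explores these paths physically in
    parallel via replication; here we enumerate them sequentially."""
    frontier = {start}
    for _ in range(max_steps):
        if target in frontier:
            return True
        nxt = set()
        for s in frontier:
            for lhs, rhs in rules:
                # Apply the rule at every occurrence of its left-hand side.
                i = s.find(lhs)
                while i != -1:
                    nxt.add(s[:i] + rhs + s[i + len(lhs):])
                    i = s.find(lhs, i + 1)
        frontier = nxt
    return target in frontier

rules = [("ab", "ba"), ("ba", "ab")]  # hypothetical toy rule set
print(reachable("aab", "baa", rules))  # True
```

The sequential search visits exponentially many strings in the worst case, which is exactly the cost the NUTM design trades for physical space.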

  13. From Turing machines to computer viruses.

    PubMed

    Marion, Jean-Yves

    2012-07-28

    Self-replication is one of the fundamental aspects of computing where a program or a system may duplicate, evolve and mutate. Our point of view is that Kleene's (second) recursion theorem is essential to understand self-replication mechanisms. An interesting example of self-replication codes is given by computer viruses. This was initially explained in the seminal works of Cohen and of Adleman in the 1980s. In fact, the different variants of recursion theorems provide and explain constructions of self-replicating codes and, as a result, of various classes of malware. None of the results are new from the point of view of computability theory. We now propose a self-modifying register machine as a model of computation in which we can effectively deal with the self-reproduction and in which new offsprings can be activated as independent organisms.
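The self-reference at the heart of Kleene's recursion theorem is most easily seen in a quine: a program whose output is its own source code. This two-line Python example is a standard illustration of such a fixed point, not the self-modifying register machine proposed in the paper.

```python
# A quine: the %r conversion inserts the string's own repr, so the
# printed text is exactly the two-line program itself.
source = "source = %r\nprint(source %% source)"
print(source % source)
```

Viruses exploit the same construction: a program that can reproduce its own description can also attach that description to other code.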

  14. Turing's Man, Turing's Woman, or Turing's Person?: Gender, Language, and Computers. Working Paper No. 166.

    ERIC Educational Resources Information Center

    Rothschild, Joan

    This essay compares two recent books on computer technology in terms of their usage of gendered or gender-free language. The two books examined are "Turing's Man: Western Culture in the Computer Age" by J. David Bolter and "The Second Self: Computers and the Human Spirit" by Sherry Turkle. It is argued that the two authors' gender differences in…

  15. Small Universal Bacteria and Plasmid Computing Systems.

    PubMed

    Wang, Xun; Zheng, Pan; Ma, Tongmao; Song, Tao

    2018-05-29

    Bacterial computing is a known candidate in natural computing, the aim being to construct "bacterial computers" for solving complex problems. In this paper, a new kind of bacterial computing system, named the bacteria and plasmid computing system (BP system), is proposed. We investigate the computational power of BP systems with finite numbers of bacteria and plasmids. Specifically, it is obtained in a constructive way that a BP system with 2 bacteria and 34 plasmids is Turing universal. The results provide a theoretical cornerstone for constructing powerful bacterial computers and demonstrate that such devices can be built with a "reasonable" number of bacteria and plasmids.

  16. [Styles of programming 1952-1972].

    PubMed

    van den Bogaard, Adrienne

    2008-01-01

    In the field of history of computing, the construction of the early computers has received much scholarly attention. However, these machines have not only been important because of their logical design and their engineering, but also because of the programming practices that emerged around these first machines. This article compares two styles of programming that developed around Dutch 'first computers'. The first style is represented by Edsger Wybe Dijkstra (1930-2002), who would receive the Turing Award for his work in 1972. Dijkstra developed a mathematical style of programming--a program was something you should be able to design mathematically and prove it logically. The second style is represented by Willem Louis van der Poel (born 1926). For him, programming is 'trickology'. A program is primarily a technical artefact that should work: a program is something you play with, comparable to the way one solves a puzzle.

  17. The sensitivity of Turing self-organization to biological feedback delays: 2D models of fish pigmentation.

    PubMed

    Gaffney, E A; Lee, S Seirin

    2015-03-01

    Turing morphogen models have been extensively explored in the context of large-scale self-organization in multicellular biological systems. However, reconciling the detailed biology of morphogen dynamics, while accounting for time delays associated with gene expression, reveals aberrant behaviours that are not consistent with early developmental self-organization, especially the requirement for exquisite temporal control. Attempts to reconcile the interpretation of Turing's ideas with an increasing understanding of the mechanisms driving zebrafish pigmentation suggest that one should reconsider Turing's model in terms of pigment cells rather than morphogens (Nakamasu et al., 2009, PNAS 106:8429-8434; Yamaguchi et al., 2007, PNAS 104:4790-4793). Here the dynamics of pigment cells is subject to response delays implicit in the cell cycle and apoptosis. Hence we explore simulations of fish skin patterning, focussing on the dynamical influence of gene expression delays in morphogen-based Turing models and response delays for cell-based Turing models. We find that reconciling the mechanisms driving the behaviour of Turing systems with observations of fish skin patterning remains a fundamental challenge. © The Authors 2013. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.

  18. Globally Stable Microresonator Turing Pattern Formation for Coherent High-Power THz Radiation On-Chip

    NASA Astrophysics Data System (ADS)

    Huang, Shu-Wei; Yang, Jinghui; Yang, Shang-Hua; Yu, Mingbin; Kwong, Dim-Lee; Zelevinsky, T.; Jarrahi, Mona; Wong, Chee Wei

    2017-10-01

    In nonlinear microresonators driven by continuous-wave (cw) lasers, Turing patterns have been studied in the formalism of the Lugiato-Lefever equation with emphasis on their high coherence and exceptional robustness against perturbations. Destabilization of Turing patterns and the transition to spatiotemporal chaos, however, limit the available energy carried in the Turing rolls and prevent further harvest of their high coherence and robustness to noise. Here, we report a novel scheme to circumvent such destabilization, by incorporating the effect of local mode hybridizations, and we attain globally stable Turing pattern formation in chip-scale nonlinear oscillators with significantly enlarged parameter space, achieving a record-high power-conversion efficiency of 45% and an elevated peak-to-valley contrast of 100. The stationary Turing pattern is discretely tunable across 430 GHz on a THz carrier, with a fractional frequency sideband nonuniformity measured at 7.3 × 10^-14. We demonstrate the simultaneous microwave and optical coherence of the Turing rolls at different evolution stages through ultrafast optical correlation techniques. The free-running Turing-roll coherence, 9 kHz in 200 ms and 160 kHz in 20 minutes, is transferred onto a plasmonic photomixer for one of the highest-power THz coherent generations at room temperature, with 1.1% optical-to-THz power conversion. Its long-term stability can be further improved by more than 2 orders of magnitude, reaching an Allan deviation of 6 × 10^-10 at 100 s, with a simple computer-aided slow feedback control. The demonstrated on-chip coherent high-power Turing-THz system is promising to find applications in astrophysics, medical imaging, and wireless communications.

  19. Time for paradigmatic substitution in psychology. What are the alternatives?

    PubMed

    Kolstad, Arnulf

    2010-03-01

    This article focuses on the "machine paradigm" in psychology and its consequences for (mis)understanding of human beings. It discusses causes of the mainstream epistemology in Western societies, referring to philosophical traditions, the prestige of some natural sciences and mathematical statistics. It emphasizes how the higher psychological functions develop dialectically from a biological basis and how the brain due to its plasticity changes with mental and physical activity. This makes a causal machine paradigm unfit to describe and explain human psychology and human development. Some concepts for an alternative paradigm are suggested.

  20. Text-based CAPTCHAs over the years

    NASA Astrophysics Data System (ADS)

    Chow, Y. W.; Susilo, W.

    2017-11-01

    The notion of CAPTCHAs has been around for more than two decades. Since its introduction, CAPTCHAs have now become a ubiquitous part of the Internet. Over the years, research on various aspects of CAPTCHAs has evolved and different design principles have emerged. This article discusses text-based CAPTCHAs in terms of their fundamental requirements, namely, security and usability. Practicality necessitates that humans must be able to correctly solve CAPTCHA challenges, while at the same time automated computer programs should have difficulty solving the challenges. This article also presents alternative paradigms to text-based CAPTCHA design that have been examined in previous work. With the advances in techniques to defeat CAPTCHAs, the future of automated Turing tests is an open question.

  1. Turing instability in reaction-diffusion systems with nonlinear diffusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zemskov, E. P., E-mail: zemskov@ccas.ru

    2013-10-15

    The Turing instability is studied in two-component reaction-diffusion systems with nonlinear diffusion terms, and the regions in parametric space where Turing patterns can form are determined. The boundaries between super- and subcritical bifurcations are found. Calculations are performed for one-dimensional Brusselator and Oregonator models.
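For the constant-diffusion case, the standard linear analysis behind these instability regions can be checked numerically: the uniform state must be stable without diffusion (tr J < 0, det J > 0) yet destabilized at some finite wavenumber k. The sketch below uses this textbook criterion with illustrative parameter values; it does not reproduce the paper's nonlinear-diffusion analysis.

```python
def turing_unstable(a, b, c, d, Du, Dv, kmax=10.0, n=2000):
    """Standard linear Turing analysis for a two-component system with
    reaction Jacobian J = [[a, b], [c, d]] and constant diffusion
    coefficients Du, Dv."""
    tr, det = a + d, a * d - b * c
    if not (tr < 0 and det > 0):
        return False  # uniform state not stable to homogeneous perturbations
    for i in range(1, n + 1):
        k2 = (kmax * i / n) ** 2
        # h(k^2) = det(J - k^2 D); a sign change signals Turing instability.
        h = Du * Dv * k2 * k2 - (Dv * a + Du * d) * k2 + det
        if h < 0:
            return True
    return False

# Activator-inhibitor example with fast inhibitor diffusion
# (hypothetical parameter values):
print(turing_unstable(a=1.0, b=-1.0, c=2.0, d=-1.5, Du=0.05, Dv=1.0))  # True
```

With equal diffusion coefficients the same kinetics give no instability, which is the classic requirement that the inhibitor diffuse faster than the activator.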

  2. Investigation occurrences of turing pattern in Schnakenberg and Gierer-Meinhardt equation

    NASA Astrophysics Data System (ADS)

    Nurahmi, Annisa Fitri; Putra, Prama Setia; Nuraini, Nuning

    2018-03-01

    Several types of animals display unusual, varied patterns on their skin, governed by the animal's skin pigmentation system. In 1952, Alan Turing formulated a mathematical theory of morphogenesis in which a model can give rise to spatial patterns, the so-called Turing patterns. This research discusses the identification of Turing models that can produce animal skin patterns. Investigations were conducted on two types of equations: Schnakenberg (1979) and Gierer-Meinhardt (1972). Parameters were explored to produce Turing patterns in both equations. The numerical simulations were performed using homogeneous Neumann and homogeneous Dirichlet boundary conditions. The investigation of the Schnakenberg equation yielded poison dart frog (Andinobates dorisswansonae) and ladybird (Coccinellidae septempunctata) patterns, while fish skin patterns were produced by the Gierer-Meinhardt equation.
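A minimal 1D simulation of the Schnakenberg model (u_t = Du u_xx + a - u + u²v, v_t = Dv v_xx + b - u²v) shows how a Turing pattern emerges from a perturbed uniform state. The parameter values, grid, time step, and zero-flux boundaries below are assumptions for illustration, not the settings used in the paper.

```python
import numpy as np

# Minimal 1D Schnakenberg simulation: explicit Euler in time,
# zero-flux (Neumann) boundaries, illustrative parameter values.
a, b, Du, Dv = 0.1, 0.9, 1.0, 40.0
n, dx, dt = 100, 1.0, 0.005

rng = np.random.default_rng(0)
u = (a + b) + 0.01 * rng.standard_normal(n)          # perturbed steady state u* = a+b
v = b / (a + b) ** 2 + 0.01 * rng.standard_normal(n)  # v* = b/(a+b)^2

def lap(w):
    """Second difference with zero-flux boundaries (edge replication)."""
    wp = np.pad(w, 1, mode="edge")
    return (wp[2:] - 2 * w + wp[:-2]) / dx**2

for _ in range(20000):
    f = a - u + u * u * v
    g = b - u * u * v
    u, v = u + dt * (Du * lap(u) + f), v + dt * (Dv * lap(v) + g)

# After the transient, u holds a stationary spatial (Turing) pattern;
# its peak-to-trough amplitude is well above the initial noise level.
print(round(float(u.max() - u.min()), 2))
```

The fast-diffusing inhibitor (Dv much larger than Du) is what allows the perturbation to grow into stripes rather than decay.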

  3. The paradigm compiler: Mapping a functional language for the connection machine

    NASA Technical Reports Server (NTRS)

    Dennis, Jack B.

    1989-01-01

    The Paradigm Compiler implements a new approach to compiling programs written in high level languages for execution on highly parallel computers. The general approach is to identify the principal data structures constructed by the program and to map these structures onto the processing elements of the target machine. The mapping is chosen to maximize performance as determined through compile time global analysis of the source program. The source language is Sisal, a functional language designed for scientific computations, and the target language is Paris, the published low level interface to the Connection Machine. The data structures considered are multidimensional arrays whose dimensions are known at compile time. Computations that build such arrays usually offer opportunities for highly parallel execution; they are data parallel. The Connection Machine is an attractive target for these computations, and the parallel for construct of the Sisal language is a convenient high level notation for data parallel algorithms. The principles and organization of the Paradigm Compiler are discussed.

  4. Turing Trade: A Hybrid of a Turing Test and a Prediction Market

    NASA Astrophysics Data System (ADS)

    Farfel, Joseph; Conitzer, Vincent

    We present Turing Trade, a web-based game that is a hybrid of a Turing test and a prediction market. In this game, there is a mystery conversation partner, the “target,” who is trying to appear human, but may in reality be either a human or a bot. There are multiple judges (or “bettors”), who interrogate the target in order to assess whether it is a human or a bot. Throughout the interrogation, each bettor bets on the nature of the target by buying or selling human (or bot) securities, which pay out if the target is a human (bot). The resulting market price represents the bettors’ aggregate belief that the target is a human. This game offers multiple advantages over standard variants of the Turing test. Most significantly, our game gathers much more fine-grained data, since we obtain not only the judges’ final assessment of the target’s humanity, but rather the entire progression of their aggregate belief over time. This gives us the precise moments in conversations where the target’s response caused a significant shift in the aggregate belief, indicating that the response was decidedly human or unhuman. An additional benefit is that (we believe) the game is more enjoyable to participants than a standard Turing test. This is important because otherwise, we will fail to collect significant amounts of data. In this paper, we describe in detail how Turing Trade works, exhibit some example logs, and analyze how well Turing Trade functions as a prediction market by studying the calibration and sharpness of its forecasts (from real user data).
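A market price that aggregates bettors' beliefs can be maintained by an automated market maker. The sketch below uses Hanson's logarithmic market scoring rule (LMSR) for a binary human-vs-bot market; this is a standard mechanism chosen for illustration, and whether Turing Trade uses exactly this rule is not stated above.

```python
import math

# LMSR market maker for two mutually exclusive outcomes: [human, bot].
# The instantaneous price of an outcome is the aggregate belief in it.
class LMSRMarket:
    def __init__(self, b=10.0):
        self.b = b          # liquidity parameter
        self.q = [0.0, 0.0]  # outstanding shares: [human, bot]

    def _cost(self):
        return self.b * math.log(sum(math.exp(qi / self.b) for qi in self.q))

    def price(self, outcome=0):
        e = [math.exp(qi / self.b) for qi in self.q]
        return e[outcome] / sum(e)

    def buy(self, outcome, shares):
        """Buy securities in an outcome; returns what the bettor pays."""
        before = self._cost()
        self.q[outcome] += shares
        return self._cost() - before

m = LMSRMarket()
print(round(m.price(0), 2))  # 0.5 before any bets
m.buy(0, 5.0)                # a bettor buys "human" securities
print(round(m.price(0), 2))  # aggregate belief in "human" rises
```

Tracking `price(0)` over the course of an interrogation yields exactly the kind of fine-grained belief trajectory the paper analyzes.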

  5. Synthetic biology and its alternatives. Descartes, Kant and the idea of engineering biological machines.

    PubMed

    Kogge, Werner; Richter, Michael

    2013-06-01

    The engineering-based approach of synthetic biology is characterized by an assumption that 'engineering by design' enables the construction of 'living machines'. These 'machines', as biological machines, are expected to display certain properties of life, such as adapting to changing environments and acting in a situated way. This paper proposes that a tension exists between the expectations placed on biological artefacts and the notion of producing such systems by means of engineering; this tension makes it seem implausible that biological systems, especially those with properties characteristic of living beings, can in fact be produced using the specific methods of engineering. We do not claim that engineering techniques have nothing to contribute to the biotechnological construction of biological artefacts. However, drawing on Descartes's and Kant's thinking on the relationship between the organism and the machine, we show that it is considerably more plausible to assume that distinctively biological artefacts emerge within a paradigm different from the paradigm of the Cartesian machine that underlies the engineering approach. We close by calling for increased attention to be paid to approaches within molecular biology and chemistry that rest on conceptions different from those of synthetic biology's engineering paradigm. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Turing patterns in parabolic systems of conservation laws and numerically observed stability of periodic waves

    NASA Astrophysics Data System (ADS)

    Barker, Blake; Jung, Soyeun; Zumbrun, Kevin

    2018-03-01

    Turing patterns on unbounded domains have been widely studied in systems of reaction-diffusion equations. However, up to now, they have not been studied for systems of conservation laws. Here, we (i) derive conditions for Turing instability in conservation laws and (ii) use these conditions to find families of periodic solutions bifurcating from uniform states, numerically continuing these families into the large-amplitude regime. For the examples studied, numerical stability analysis suggests that stable periodic waves can emerge either from supercritical Turing bifurcations or, via secondary bifurcation as amplitude is increased, from subcritical Turing bifurcations. This answers in the affirmative a question posed by Oh and Zumbrun as to whether stable periodic solutions of conservation laws can occur. Determination of a full small-amplitude stability diagram - specifically, determination of rigorous Eckhaus-type stability conditions - remains an interesting open problem.

  7. Polyamide membranes with nanoscale Turing structures for water purification

    NASA Astrophysics Data System (ADS)

    Tan, Zhe; Chen, Shengfu; Peng, Xinsheng; Zhang, Lin; Gao, Congjie

    2018-05-01

    The emergence of Turing structures is of fundamental importance, and designing these structures and developing their applications have practical effects in chemistry and biology. We use a facile route based on interfacial polymerization to generate Turing-type polyamide membranes for water purification. Manipulation of shapes by control of reaction conditions enabled the creation of membranes with bubble or tube structures. These membranes exhibit excellent water-salt separation performance that surpasses the upper-bound line of traditional desalination membranes. Furthermore, we show the existence of high water permeability sites in the Turing structures, where water transport through the membranes is enhanced.

  8. Mind as Space

    NASA Astrophysics Data System (ADS)

    McKinstry, Chris

    The present article describes a possible method for the automatic discovery of a universal human semantic-affective hyperspatial approximation of the human subcognitive substrate - the associative network which French (1990) asserts is the ultimate foundation of the human ability to pass the Turing Test - that does not require a machine to have direct human experience or a physical human body. This method involves automatic programming - such as Koza's genetic programming (1992) - guided in the discovery of the proposed universal hypergeometry by feedback from a Minimum Intelligent Signal Test or MIST (McKinstry, 1997) constructed from a very large number of human validated probabilistic propositions collected from a large population of Internet users. It will be argued that though a lifetime of human experience is required to pass a rigorous Turing Test, a probabilistic propositional approximation of this experience can be constructed via public participation on the Internet, and then used as a fitness function to direct the artificial evolution of a universal hypergeometry capable of classifying arbitrary propositions. A model of this hypergeometry will be presented; it predicts Miller's "Magical Number Seven" (1956) as the size of human short-term memory from fundamental hypergeometric properties. A system that can lead to the generation of novel propositions or "artificial thoughts" will also be described.

  9. Feasibility of Turing-Style Tests for Autonomous Aerial Vehicle "Intelligence"

    NASA Technical Reports Server (NTRS)

    Young, Larry A.

    2007-01-01

    A new approach is suggested to define and evaluate key metrics as to autonomous aerial vehicle performance. This approach entails the conceptual definition of a "Turing Test" for UAVs. Such a "UAV Turing test" would be conducted by means of mission simulations and/or tailored flight demonstrations of vehicles under the guidance of their autonomous system software. These autonomous vehicle mission simulations and flight demonstrations would also have to be benchmarked against missions "flown" with pilots/human-operators in the loop. In turn, scoring criteria for such testing could be based both upon quantitative mission success metrics (unique to each mission) and upon analog "handling quality" metrics similar to the well-known Cooper-Harper pilot ratings used for manned aircraft. Autonomous aerial vehicles would be considered to have successfully passed this "UAV Turing Test" if the aggregate mission success metrics and handling qualities for the autonomous aerial vehicle matched or exceeded the equivalent metrics for missions conducted with pilots/human-operators in the loop. Alternatively, an independent, knowledgeable observer could provide the "UAV Turing Test" ratings of whether a vehicle is autonomous or "piloted." This observer ideally would, in the more sophisticated mission simulations, also have the enhanced capability of being able to override the scripted mission scenario and instigate failure modes and change of flight profile/plans. If a majority of mission tasks are rated as "piloted" by the observer, when in reality the vehicle/simulation is fully- or semi-autonomously controlled, then the vehicle/simulation "passes" the "UAV Turing Test." In this regard, this second "UAV Turing Test" approach is more consistent with Turing's original "imitation game" proposal.
The overall feasibility, and important considerations and limitations, of such an approach for judging/evaluating autonomous aerial vehicle "intelligence" will be discussed from a theoretical perspective.

  10. Cooperativity to increase Turing pattern space for synthetic biology.

    PubMed

    Diambra, Luis; Senthivel, Vivek Raj; Menendez, Diego Barcena; Isalan, Mark

    2015-02-20

    It is hard to bridge the gap between mathematical formulations and biological implementations of Turing patterns, yet this is necessary for both understanding and engineering these networks with synthetic biology approaches. Here, we model a reaction-diffusion system with two morphogens in a monostable regime, inspired by components that we recently described in a synthetic biology study in mammalian cells. The model employs a single promoter to express both the activator and inhibitor genes and produces Turing patterns over large regions of parameter space, using biologically interpretable Hill function reactions. We applied a stability analysis and identified rules for choosing biologically tunable parameter relationships to increase the likelihood of successful patterning. We show how to control Turing pattern sizes and time evolution by manipulating the values for production and degradation relationships. More importantly, our analysis predicts that steep dose-response functions arising from cooperativity are mandatory for Turing patterns. Greater steepness increases parameter space and even reduces the requirement for differential diffusion between activator and inhibitor. These results demonstrate some of the limitations of linear scenarios for reaction-diffusion systems and will help to guide projects to engineer synthetic Turing patterns.
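
The role of cooperativity can be illustrated numerically: the dose-response of a Hill function steepens as the Hill coefficient n grows, with midpoint slope n/(4K). The parameter values below are illustrative, not taken from the study.

```python
def hill(x, K=1.0, n=1):
    """Activating Hill dose-response; n is the cooperativity (Hill) coefficient."""
    return x ** n / (K ** n + x ** n)

def midpoint_slope(n, K=1.0, h=1e-6):
    # numerical slope at the half-maximal point x = K (analytically n / (4K))
    return (hill(K + h, K, n) - hill(K - h, K, n)) / (2 * h)

s1 = midpoint_slope(1)   # non-cooperative: slope 0.25
s4 = midpoint_slope(4)   # cooperative: slope 1.00, a four-fold steeper response
```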

  11. Control of Turing patterns and their usage as sensors, memory arrays, and logic gates

    NASA Astrophysics Data System (ADS)

    Muzika, František; Schreiber, Igor

    2013-10-01

    We study a model system of three diffusively coupled reaction cells arranged in a linear array that display Turing patterns, with special focus on the case of equal coupling strength for all components. As a suitable model reaction we consider a two-variable core model of glycolysis. Using numerical continuation and bifurcation techniques we analyze the dependence of the system's steady states on the varying rate coefficient of the recycling step while the coupling coefficients of the inhibitor and activator are fixed and set at the ratios 100:1, 1:1, and 4:5. We show that stable Turing patterns occur at all three ratios but, as expected, spontaneous transition from the spatially uniform steady state to the spatially nonuniform Turing patterns occurs only in the first case. The other two cases possess multiple Turing patterns, which are stabilized by secondary bifurcations and coexist with stable uniform periodic oscillations. For the 1:1 ratio we examine modular spatiotemporal perturbations, which allow for controllable switching between the uniform oscillations and various Turing patterns. Such modular perturbations are then used to construct chemical computing devices utilizing the multiple Turing patterns. By classifying various responses we propose: (a) a single-input resettable sensor capable of reading certain value of concentration, (b) two-input and three-input memory arrays capable of storing logic information, (c) three-input, three-output logic gates performing combinations of logical functions OR, XOR, AND, and NAND.

  12. Applications of Machine Learning and Rule Induction,

    DTIC Science & Technology

    1995-02-15

    An important area of application for machine learning is in automating the acquisition of knowledge bases required for expert systems. In this paper...we review the major paradigms for machine learning, including neural networks, instance-based methods, genetic learning, rule induction, and analytic

  13. Bird's-eye view on noise-based logic.

    PubMed

    Kish, Laszlo B; Granqvist, Claes G; Horvath, Tamas; Klappenecker, Andreas; Wen, He; Bezrukov, Sergey M

    2014-01-01

    Noise-based logic is a practically deterministic logic scheme inspired by the randomness of neural spikes and uses a system of uncorrelated stochastic processes and their superposition to represent the logic state. We briefly discuss various questions such as (i) What does practical determinism mean? (ii) Is noise-based logic a Turing machine? (iii) Is there hope to beat (the dreams of) quantum computation by a classical physical noise-based processor, and what are the minimum hardware requirements for that? Finally, (iv) we address the problem of random number generators and show that the common belief that quantum number generators are superior to classical (thermal) noise-based generators is nothing but a myth.
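
A minimal sketch of the central idea, that a logic state can be carried by uncorrelated stochastic processes: two independent noise records serve as reference signals for the two logic values, and a receiver decodes by cross-correlation. This is a simplified illustration, not the authors' exact construction.

```python
import random

random.seed(1)
N = 20000
# Two independent noise processes are the reference signals for logic
# values 0 and 1 (a simplified sketch, not the authors' exact scheme).
ref0 = [random.gauss(0, 1) for _ in range(N)]
ref1 = [random.gauss(0, 1) for _ in range(N)]

def correlate(a, b):
    return sum(x * y for x, y in zip(a, b)) / len(a)

def transmit(bit):
    return ref1 if bit else ref0      # the logic state *is* the noise record

def receive(signal):
    # decode by cross-correlating the signal against both references
    return 1 if correlate(signal, ref1) > correlate(signal, ref0) else 0

bits = [receive(transmit(b)) for b in (0, 1, 1, 0)]
```

Because the references are uncorrelated, the wrong correlator averages to roughly zero while the right one averages to the noise variance, which is what makes the scheme "practically deterministic".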

  14. Bird's-eye view on noise-based logic

    NASA Astrophysics Data System (ADS)

    Kish, Laszlo B.; Granqvist, Claes G.; Horvath, Tamas; Klappenecker, Andreas; Wen, He; Bezrukov, Sergey M.

    2014-09-01

    Noise-based logic is a practically deterministic logic scheme inspired by the randomness of neural spikes and uses a system of uncorrelated stochastic processes and their superposition to represent the logic state. We briefly discuss various questions such as (i) What does practical determinism mean? (ii) Is noise-based logic a Turing machine? (iii) Is there hope to beat (the dreams of) quantum computation by a classical physical noise-based processor, and what are the minimum hardware requirements for that? Finally, (iv) we address the problem of random number generators and show that the common belief that quantum number generators are superior to classical (thermal) noise-based generators is nothing but a myth.

  15. Wavelength selection beyond Turing

    NASA Astrophysics Data System (ADS)

    Zelnik, Yuval R.; Tzuk, Omer

    2017-06-01

    Spatial patterns arising spontaneously due to internal processes are ubiquitous in nature, varying from periodic patterns of dryland vegetation to complex structures of bacterial colonies. Many of these patterns can be explained in the context of a Turing instability, where patterns emerge due to two locally interacting components that diffuse with different speeds in the medium. Turing patterns are multistable, meaning that many different patterns with different wavelengths are possible for the same set of parameters. Nevertheless, in a given region typically only one such wavelength is dominant. In the Turing instability region, random initial conditions will mostly lead to a wavelength that is similar to that of the leading eigenvector that arises from the linear stability analysis, but when venturing beyond, little is known about the pattern that will emerge. Taking dryland vegetation as a case study, we use different models of dryland ecosystems to study the wavelength pattern that is selected in various scenarios beyond the Turing instability region, focusing on the phenomena of localized states and repeated local disturbances.

  16. Introduction to the Natural Anticipator and the Artificial Anticipator

    NASA Astrophysics Data System (ADS)

    Dubois, Daniel M.

    2010-11-01

    This short communication deals with the introduction of the concept of anticipator, which is one who anticipates, in the framework of computing anticipatory systems. The definition of anticipation deals with the concept of program. Indeed, the word program comes from "pro-gram", meaning "to write before" by anticipation, and means a plan for the programming of a mechanism, or a sequence of coded instructions that can be inserted into a mechanism, or a sequence of coded instructions, such as genes or behavioural responses, that is part of an organism. Any natural or artificial programs are thus related to anticipatory rewriting systems, as shown in this paper. All the cells in the body, and the neurons in the brain, are programmed by the anticipatory genetic code, DNA, in a low-level language with four signs. The programs in computers are also computing anticipatory systems. It will be shown, on the one hand, that the genetic code DNA is a natural anticipator. As demonstrated by Nobel laureate McClintock [8], genomes are programmed. The fundamental program deals with the DNA genetic code. The properties of DNA consist in self-replication and self-modification. The self-replicating process leads to reproduction of the species, while the self-modifying process leads to new species or evolution and adaptation in existing ones. The genetic code DNA keeps its instructions in memory in the DNA coding molecule. The genetic code DNA is a rewriting system, from the DNA coding to the DNA template molecule. The DNA template molecule is a rewriting system to the messenger RNA molecule. The information is not destroyed during the execution of the rewriting program. On the other hand, it will be demonstrated that the Turing machine is an artificial anticipator. The Turing machine is a rewriting system. The head reads and writes, modifying the content of the tape. The information is destroyed during the execution of the program. This is an irreversible process. The input data are lost.

  17. Basic difference between brain and computer: integration of asynchronous processes implemented as hardware model of the retina.

    PubMed

    Przybyszewski, Andrzej W; Linsay, Paul S; Gaudiano, Paolo; Wilson, Christopher M

    2007-01-01

    There exists a common view that the brain acts like a Turing machine: The machine reads information from an infinite tape (sensory data) and, on the basis of the machine's state and information from the tape, an action (decision) is made. The main problem with this model lies in how to synchronize a large number of tapes in an adaptive way so that the machine is able to accomplish tasks such as object classification. We propose that such mechanisms exist already in the eye. A popular view is that the retina, typically associated with high gain and adaptation for light processing, is actually performing local preprocessing by means of its center-surround receptive field. We would like to show another property of the retina: The ability to integrate many independent processes. We believe that this integration is implemented by synchronization of neuronal oscillations. In this paper, we present a model of the retina consisting of a series of coupled oscillators which can synchronize on several scales. Synchronization is an analog process which is converted into a digital spike train in the output of the retina. We have developed a hardware implementation of this model, which enables us to carry out rapid simulation of multineuron oscillatory dynamics. We show that the properties of the spike trains in our model are similar to those found in vivo in the cat retina.
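
The synchronization of coupled oscillators on which the model relies can be sketched with a Kuramoto model: phases pulled together by sinusoidal coupling, with synchrony measured by the order parameter. The parameters below are illustrative and unrelated to the authors' hardware retina.

```python
import math, random

random.seed(0)
n, K, dt, steps = 50, 2.0, 0.05, 400       # illustrative values
omega = [random.gauss(0, 0.2) for _ in range(n)]            # natural frequencies
theta = [random.uniform(0, 2 * math.pi) for _ in range(n)]  # random initial phases

def order_parameter(phases):
    # |r| -> 1 means full phase synchrony; ~1/sqrt(n) means incoherence
    re = sum(math.cos(p) for p in phases) / len(phases)
    im = sum(math.sin(p) for p in phases) / len(phases)
    return math.hypot(re, im)

r_start = order_parameter(theta)
for _ in range(steps):                     # explicit Euler on the Kuramoto model
    theta = [theta[i] + dt * (omega[i] + K / n *
             sum(math.sin(theta[j] - theta[i]) for j in range(n)))
             for i in range(n)]
r_end = order_parameter(theta)             # coupling has pulled the phases together
```

With coupling well above the critical strength, the initially incoherent phases lock, which is the analog precursor of the synchronized oscillations the model converts into spike trains.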

  18. Inter-dependent tissue growth and Turing patterning in a model for long bone development

    NASA Astrophysics Data System (ADS)

    Tanaka, Simon; Iber, Dagmar

    2013-10-01

    The development of long bones requires a sophisticated spatial organization of cellular signalling, proliferation, and differentiation programs. How such spatial organization emerges on the growing long bone domain is still unresolved. Based on the reported biochemical interactions we developed a regulatory model for the core signalling factors IHH, PTCH1, and PTHrP and included two cell types, proliferating/resting chondrocytes and (pre-)hypertrophic chondrocytes. We show that the reported IHH-PTCH1 interaction gives rise to a Schnakenberg-type Turing kinetics, and that inclusion of PTHrP is important to achieve robust patterning when coupling patterning and tissue dynamics. The model reproduces relevant spatiotemporal gene expression patterns, as well as a number of relevant mutant phenotypes. In summary, we propose that a ligand-receptor based Turing mechanism may control the emergence of patterns during long bone development, with PTHrP as an important mediator to confer patterning robustness when the sensitive Turing system is coupled to the dynamics of a growing and differentiating tissue. We have previously shown that ligand-receptor based Turing mechanisms can also result from BMP-receptor, SHH-receptor, and GDNF-receptor interactions, and that these reproduce the wildtype and mutant patterns during digit formation in limbs and branching morphogenesis in lung and kidneys. Receptor-ligand interactions may thus constitute a general mechanism to generate Turing patterns in nature.
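
The Schnakenberg-type Turing kinetics mentioned above admit a compact linear stability check: a steady state that is stable without diffusion becomes unstable to a band of wavenumbers once the inhibitor diffuses sufficiently faster than the activator. A minimal sketch with textbook Schnakenberg parameters (not the values of the IHH-PTCH1 model):

```python
def turing_unstable(fu, fv, gu, gv, Du, Dv, kmax=5.0, nk=2000):
    """True if the state is stable without diffusion but some spatial
    wavenumber k grows once diffusion is switched on (Turing instability)."""
    tr, det = fu + gv, fu * gv - fv * gu
    if tr >= 0 or det <= 0:
        return False                      # not stable as a well-mixed system
    for i in range(nk):
        k2 = (kmax * i / (nk - 1)) ** 2
        # h(k^2) < 0 for some k  <=>  a band of wavenumbers is unstable
        if Du * Dv * k2 ** 2 - (Dv * fu + Du * gv) * k2 + det < 0:
            return True
    return False

# Schnakenberg kinetics u_t = a - u + u^2 v, v_t = b - u^2 v, linearized
# about the steady state (u*, v*) = (a + b, b / (a + b)^2).
a, b = 0.2, 1.3                           # textbook values, not from the paper
u = a + b
fu, fv = -1 + 2 * b / u, u ** 2
gu, gv = -2 * b / u, -u ** 2

equal_diffusion = turing_unstable(fu, fv, gu, gv, 1.0, 1.0)    # no patterns
fast_inhibitor = turing_unstable(fu, fv, gu, gv, 1.0, 50.0)    # patterns
```

The check reflects the classic requirement of differential diffusion: with equal diffusivities no wavenumber can grow, while a fast inhibitor opens an unstable band.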

  19. The difficult legacy of Turing's wager.

    PubMed

    Thwaites, Andrew; Soltan, Andrew; Wieser, Eric; Nimmo-Smith, Ian

    2017-08-01

    Describing the human brain in mathematical terms is an important ambition of neuroscience research, yet the challenges remain considerable. It was Alan Turing, writing in 1950, who first sought to demonstrate how time-consuming such an undertaking would be. Through analogy to the computer program, Turing argued that arriving at a complete mathematical description of the mind would take well over a thousand years. In this opinion piece, we argue that - despite seventy years of progress in the field - his arguments remain both prescient and persuasive.

  20. Universal Computation and Construction in GoL Cellular Automata

    NASA Astrophysics Data System (ADS)

    Goucher, Adam P.

    This chapter is concerned with the development of universal computation and construction within Conway's Game of Life (GoL). I will begin by describing the history of the concepts and mechanisms for universal computation and construction in GoL, before explaining how a Universal Computer-Constructor (UCC) would operate in this automaton. Moreover, I shall present the design of a working UCC in the rule. It is both capable of performing any computation (i.e. it is Turing-complete) and of constructing most, if not all, of the constructible configurations within the rule. It cannot construct patterns which have no predecessor; neither can any machine in the rule (for obvious reasons). As such, it is more accurately a general constructor, rather than a universal constructor.
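
For readers unfamiliar with the underlying automaton, the B3/S23 rule on which all of these constructions run can be stated in a few lines. This is a plain implementation of the rule itself, unrelated to the UCC's engineered machinery:

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life (rule B3/S23) on an
    unbounded grid; `live` is a set of (x, y) coordinates of live cells."""
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # birth on exactly 3 neighbours, survival on 2 or 3
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

blinker = {(0, 0), (1, 0), (2, 0)}   # the simplest oscillator, period 2
after1 = life_step(blinker)          # vertical triple
after2 = life_step(after1)           # back to the horizontal triple
```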

  1. Stabilization of a spatially uniform steady state in two systems exhibiting Turing patterns

    NASA Astrophysics Data System (ADS)

    Konishi, Keiji; Hara, Naoyuki

    2018-05-01

    This paper deals with the stabilization of a spatially uniform steady state in two coupled one-dimensional reaction-diffusion systems with Turing instability. This stabilization corresponds to amplitude death that occurs in a coupled system with Turing instability. Stability analysis of the steady state shows that stabilization does not occur if the two reaction-diffusion systems are identical. We derive a sufficient condition for the steady state to be stable for any length of system and any boundary conditions. Our analytical results are supported with numerical examples.

  2. The AGINAO Self-Programming Engine

    NASA Astrophysics Data System (ADS)

    Skaba, Wojciech

    2013-01-01

    The AGINAO is a project to create a human-level artificial general intelligence system (HL AGI) embodied in the Aldebaran Robotics' NAO humanoid robot. The dynamical and open-ended cognitive engine of the robot is represented by an embedded and multi-threaded control program, that is self-crafted rather than hand-crafted, and is executed on a simulated Universal Turing Machine (UTM). The actual structure of the cognitive engine emerges as a result of placing the robot in a natural preschool-like environment and running a core start-up system that executes self-programming of the cognitive layer on top of the core layer. The data from the robot's sensory devices supplies the training samples for the machine learning methods, while the commands sent to actuators enable testing hypotheses and getting feedback. The individual self-created subroutines are supposed to reflect the patterns and concepts of the real world, while the overall program structure reflects the spatial and temporal hierarchy of the world dependencies. This paper focuses on the details of the self-programming approach, limiting the discussion of the applied cognitive architecture to a necessary minimum.
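
A minimal sketch of the kind of simulated Turing machine interpreter such an engine might run. The AGINAO project's actual UTM encoding is not described in the abstract; the rule format and the example program below are illustrative assumptions.

```python
def run_tm(rules, tape, state="q0", head=0, blank="_", halt="halt",
           max_steps=10_000):
    """Deterministic Turing machine interpreter: `rules` maps
    (state, symbol) -> (new_state, written_symbol, move in {'L', 'R'})."""
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = cells.get(head, blank)
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += -1 if move == "L" else 1
    return state, "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example program: binary increment, starting on the least significant bit
# and flipping 1 -> 0 leftwards until a 0 (or blank) can become 1.
rules = {
    ("q0", "1"): ("q0", "0", "L"),
    ("q0", "0"): ("halt", "1", "R"),
    ("q0", "_"): ("halt", "1", "R"),
}
state, out = run_tm(rules, "1011", head=3)   # 1011 + 1 = 1100
```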

  3. Chromatin Computation

    PubMed Central

    Bryant, Barbara

    2012-01-01

    In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this “chromatin computer” to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal – and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines. PMID:22567109
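
The tape analogy can be made concrete with a toy rewriting system: marks on a string of nucleosomes are the symbols, and a modifying complex is a rule that reads and rewrites a window of adjacent nucleosomes. The spreading rule below is an invented illustration, not one of Bryant's actual rules.

```python
def apply_first_rule(fiber, rules):
    """Scan left-to-right and apply the first matching two-nucleosome
    read-write rule once; return whether anything changed."""
    for i in range(len(fiber) - 1):
        window = (fiber[i], fiber[i + 1])
        if window in rules:
            fiber[i], fiber[i + 1] = rules[window]
            return True
    return False

# Invented rule: a methylated nucleosome ('M') recruits a writer complex
# that methylates its unmodified ('U') right-hand neighbour, so the mark
# spreads along the fiber, loosely like heterochromatin spreading.
rules = {("M", "U"): ("M", "M")}
fiber = ["M", "U", "U", "U"]
while apply_first_rule(fiber, rules):
    pass                                 # iterate until no rule applies
```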

  4. Modeling digits. Digit patterning is controlled by a Bmp-Sox9-Wnt Turing network modulated by morphogen gradients.

    PubMed

    Raspopovic, J; Marcon, L; Russo, L; Sharpe, J

    2014-08-01

    During limb development, digits emerge from the undifferentiated mesenchymal tissue that constitutes the limb bud. It has been proposed that this process is controlled by a self-organizing Turing mechanism, whereby diffusible molecules interact to produce a periodic pattern of digital and interdigital fates. However, the identities of the molecules remain unknown. By combining experiments and modeling, we reveal evidence that a Turing network implemented by Bmp, Sox9, and Wnt drives digit specification. We develop a realistic two-dimensional simulation of digit patterning and show that this network, when modulated by morphogen gradients, recapitulates the expression patterns of Sox9 in the wild type and in perturbation experiments. Our systems biology approach reveals how a combination of growth, morphogen gradients, and a self-organizing Turing network can achieve robust and reproducible pattern formation. Copyright © 2014, American Association for the Advancement of Science.

  5. Patterns induced by super cross-diffusion in a predator-prey system with Michaelis-Menten type harvesting.

    PubMed

    Liu, Biao; Wu, Ranchao; Chen, Liping

    2018-04-01

    Turing instability and pattern formation in a super cross-diffusion predator-prey system with Michaelis-Menten type predator harvesting are investigated. Stability of equilibrium points is first explored with or without super cross-diffusion. It is found that cross-diffusion could induce instability of equilibria. To further derive the conditions of Turing instability, a linear stability analysis is carried out. The theoretical analysis shows that cross-diffusion is the key mechanism for the formation of spatial patterns. By taking the cross-diffusion rate as the bifurcation parameter, we derive amplitude equations near the Turing bifurcation point for the excited modes by means of weakly nonlinear theory. Dynamical analysis of the amplitude equations interprets the structural transitions and stability of various forms of Turing patterns. Furthermore, the theoretical results are illustrated via numerical simulations. Copyright © 2018. Published by Elsevier Inc.

  6. Is pigment patterning in fish skin determined by the Turing mechanism?

    PubMed

    Watanabe, Masakatsu; Kondo, Shigeru

    2015-02-01

    More than half a century ago, Alan Turing postulated that pigment patterns may arise from a mechanism that could be mathematically modeled based on the diffusion of two substances that interact with each other. Over the past 15 years, the molecular and genetic tools to verify this prediction have become available. Here, we review experimental studies aimed at identifying the mechanism underlying pigment pattern formation in zebrafish. Extensive molecular genetic studies in this model organism have revealed the interactions between the pigment cells that are responsible for the patterns. The mechanism discovered is substantially different from that predicted by the mathematical model, but it retains the property of 'local activation and long-range inhibition', a necessary condition for Turing pattern formation. Although some of the molecular details of pattern formation remain to be elucidated, current evidence confirms that the underlying mechanism is mathematically equivalent to the Turing mechanism. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Turing mechanism underlying a branching model for lung morphogenesis.

    PubMed

    Xu, Hui; Sun, Mingzhu; Zhao, Xin

    2017-01-01

    The mammalian lung develops through branching morphogenesis. Two primary forms of branching, which occur in order, in the lung have been identified: tip bifurcation and side branching. However, the mechanisms of lung branching morphogenesis remain to be explored. In our previous study, a biological mechanism was presented for lung branching pattern formation through a branching model. Here, we provide a mathematical mechanism underlying the branching patterns. By decoupling the branching model, we demonstrated the existence of Turing instability. We performed Turing instability analysis to reveal the mathematical mechanism of the branching patterns. Our simulation results show that the Turing patterns underlying the branching patterns are spot patterns that exhibit high local morphogen concentration. The high local morphogen concentration induces the growth of branching. Furthermore, we found that the sparse spot patterns underlie the tip bifurcation patterns, while the dense spot patterns underlie the side branching patterns. The dispersion relation analysis shows that the Turing wavelength affects the branching structure. As the wavelength decreases, the spot patterns change from sparse to dense, the rate of tip bifurcation decreases and side branching eventually occurs instead. In the process of transformation, there may exist hybrid branching that mixes tip bifurcation and side branching. Since experimental studies have reported that branching mode switching from side branching to tip bifurcation in the lung is under genetic control, our simulation results suggest that genes control the switch of the branching mode by regulating the Turing wavelength. Our results provide a novel insight into and understanding of the formation of branching patterns in the lung and other biological systems.

  8. Language Recognition via Sparse Coding

    DTIC Science & Technology

    2016-09-08

    a posteriori (MAP) adaptation scheme that further optimizes the discriminative quality of sparse-coded speech features. We empirically validate the...significantly improve the discriminative quality of sparse-coded speech features. In Section 4, we evaluate the proposed approaches against an i-vector

  9. A Computational Behaviorist Takes Turing's Test

    NASA Astrophysics Data System (ADS)

    Whalen, Thomas E.

    Behaviorism is a school of thought in experimental psychology that has given rise to powerful techniques for managing behavior. Because the Turing Test is a test of linguistic behavior rather than mental processes, approaching the test from a behavioristic perspective is worth examining. A behavioral approach begins by observing the kinds of questions that judges ask, then links the invariant features of those questions to pre-written answers. Because this approach is simple and powerful, it has been more successful in Turing competitions than the more ambitious linguistic approaches. Computational behaviorism may prove successful in other areas of Artificial Intelligence.
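
The behavioral approach described above reduces to pattern matching: detect an invariant feature of the judge's question and return a pre-written answer. The patterns and answers below are invented illustrations, not taken from Whalen's actual system.

```python
import re

# Pre-written answers keyed on invariant features (keywords) of judges'
# questions; these rules are illustrative, not from Whalen's system.
RULES = [
    (re.compile(r"\byour name\b", re.I), "People call me Tom."),
    (re.compile(r"\bweather\b", re.I), "A bit grey today, isn't it?"),
    (re.compile(r"\bhow are you\b", re.I), "Can't complain. Yourself?"),
]
DEFAULT = "Hmm, I'd rather not get into that."

def reply(question):
    for pattern, answer in RULES:
        if pattern.search(question):
            return answer
    return DEFAULT       # deflect rather than attempt understanding

first = reply("What is your name?")
```

The power of the approach comes not from the trivial matching machinery but from observing real judges and curating the rule base around the questions they actually ask.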

  10. Spontaneous Symmetry Breaking Turing-Type Pattern Formation in a Confined Dictyostelium Cell Mass

    NASA Astrophysics Data System (ADS)

    Sawai, Satoshi; Maeda, Yasuo; Sawada, Yasuji

    2000-09-01

    We have discovered a new type of patterning which occurs in a two-dimensionally confined cell mass of the cellular slime mold Dictyostelium discoideum. Besides the longitudinal structure reported earlier, we observed a spontaneous symmetry breaking spot pattern whose wavelength shows similar strain dependency to that of the longitudinal pattern. We propose that these structures are due to a reaction-diffusion Turing instability similar to the one exemplified by the CIMA (chlorite-iodide-malonic acid) reaction. The present finding may represent the first biochemical Turing structure in a developmental system with a controllable boundary condition.

  11. Modeling and analyzing stripe patterns in fish skin

    NASA Astrophysics Data System (ADS)

    Zheng, Yibo; Zhang, Lei; Wang, Yuan; Liang, Ping; Kang, Junjian

    2009-11-01

    The formation mechanism of stripe patterns in the skin of tropical fishes has been investigated by a coupled two-variable reaction-diffusion model. Two types of spatial inhomogeneities have been introduced into a homogeneous system. Several Turing modes pumped by the Turing instability give rise to a simple stripe pattern. It is found that the Turing mechanism can only determine the wavelength of the stripe pattern. The orientation of the stripe pattern is determined by the spatial inhomogeneity. Our numerical results suggest that it may be the most plausible mechanism for the formation of fish skin patterns.

  12. Testing of Flame Screens and Flame Arresters as Devices Designed to Prevent the Passage of Flame (DPPF) into Tanks Containing Flammable Atmospheres According to an IMO Standard

    DTIC Science & Technology

    1989-10-01

    flashback tests FM does not specify the type of enclosure to contain the explosive fuel/air mixture. 3.4 INTERNATIONAL CONVENTION FOR THE SAFETY OF...(2) Continuous burn tests: ... "Same mixture and concentration as for explosion tests; flow rate of the gasoline vapor-air mixture is specified as a...gas temperature of the flammable hexane/air mixture on the tank side was used as the representative endurance burn test temperature for the following

  13. Forging patterns and making waves from biology to geology: a commentary on Turing (1952) 'The chemical basis of morphogenesis'.

    PubMed

    Ball, Philip

    2015-04-19

    Alan Turing was neither a biologist nor a chemist, and yet the paper he published in 1952, 'The chemical basis of morphogenesis', on the spontaneous formation of patterns in systems undergoing reaction and diffusion of their ingredients has had a substantial impact on both fields, as well as in other areas as disparate as geomorphology and criminology. Motivated by the question of how a spherical embryo becomes a decidedly non-spherical organism such as a human being, Turing devised a mathematical model that explained how random fluctuations can drive the emergence of pattern and structure from initial uniformity. The spontaneous appearance of pattern and form in a system far away from its equilibrium state occurs in many types of natural process, and in some artificial ones too. It is often driven by very general mechanisms, of which Turing's model supplies one of the most versatile. For that reason, these patterns show striking similarities in systems that seem superficially to share nothing in common, such as the stripes of sand ripples and of pigmentation on a zebra skin. New examples of 'Turing patterns' in biology and beyond are still being discovered today. This commentary was written to celebrate the 350th anniversary of the journal Philosophical Transactions of the Royal Society.

  14. Some foundational aspects of quantum computers and quantum robots.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benioff, P.; Physics

    1998-01-01

    This paper addresses foundational issues related to quantum computing. The need for a universally valid theory such as quantum mechanics to describe to some extent its own validation is noted. This includes quantum mechanical descriptions of systems that do theoretical calculations (i.e. quantum computers) and systems that perform experiments. Quantum robots interacting with an environment are a small first step in this direction. Quantum robots are described here as mobile quantum systems with on-board quantum computers that interact with environments. Included are discussions on the carrying out of tasks and the division of tasks into computation and action phases. Specific models based on quantum Turing machines are described. Differences and similarities between quantum robots plus environments and quantum computers are discussed.

  15. Virtual Manufacturing Techniques Designed and Applied to Manufacturing Activities in the Manufacturing Integration and Technology Branch

    NASA Technical Reports Server (NTRS)

    Shearrow, Charles A.

    1999-01-01

    One of the identified goals of EM3 is to implement virtual manufacturing by the time the year 2000 has ended. To realize this goal of a true virtual manufacturing enterprise, the initial development of a machinability database and the infrastructure must be completed. This will consist of the containment of the existing EM-NET problems and developing machine, tooling, and common materials databases. To integrate the virtual manufacturing enterprise with normal day-to-day operations, a parallel virtual manufacturing machinability database, virtual manufacturing database, virtual manufacturing paradigm, implementation/integration procedure, and testable verification models must be constructed. Common and virtual machinability databases will include the four distinct areas of machine tools, available tooling, common machine tool loads, and a materials database. The machine tools database will include the machine envelope, special machine attachments, tooling capacity, location within NASA-JSC or with a contractor, and availability/scheduling. The tooling database will include available standard tooling, custom in-house tooling, tool properties, and availability. The common materials database will include materials thickness ranges, strengths, types, and their availability. The virtual manufacturing databases will consist of virtual machines and virtual tooling directly related to the common and machinability databases. The items to be completed are the design and construction of the machinability databases, a virtual manufacturing paradigm for NASA-JSC, an implementation timeline, a VNC model of one bridge mill, and troubleshooting of existing software and hardware problems with EN4NET. The final step of this virtual manufacturing project will be to integrate other production sites into the databases, bringing JSC's EM3 into a position of becoming a clearing house for NASA's digital manufacturing needs and creating a true virtual manufacturing enterprise.

  16. Automated discovery systems and the inductivist controversy

    NASA Astrophysics Data System (ADS)

    Giza, Piotr

    2017-09-01

    The paper explores possible influences that developments in two branches of AI, automated discovery and machine learning systems, might have upon some aspects of the old debate between Francis Bacon's inductivism and Karl Popper's falsificationism. Donald Gillies facetiously calls this controversy 'the duel of two English knights', and claims, after some analysis of historical cases of discovery, that Baconian induction had been used in science very rarely, or not at all, although he argues that the situation has changed with the advent of machine learning systems. (Some clarification of the terms machine learning and automated discovery is required here. The key idea of machine learning is that, given data with associated outcomes, software can be trained to make those associations in future cases, which typically amounts to inducing some rules from individual cases classified by the experts. Automated discovery (also called machine discovery) deals with uncovering new knowledge that is valuable for human beings, and its key idea is that discovery is like other intellectual tasks and that the general idea of heuristic search in problem spaces applies also to discovery tasks. However, since machine learning systems discover (very low-level) regularities in data, throughout this paper I use the generic term automated discovery for both kinds of systems. I will elaborate on this later on.) Gillies's line of argument can be generalised: thanks to automated discovery systems, philosophers of science have at their disposal a new tool for empirically testing their philosophical hypotheses. 
Accordingly, in the paper, I will address the question, which of the two philosophical conceptions of scientific method is better vindicated in view of the successes and failures of systems developed within three major research programmes in the field: machine learning systems in the Turing tradition, normative theory of scientific discovery formulated by Herbert Simon's group and the programme called HHNT, proposed by J. Holland, K. Holyoak, R. Nisbett and P. Thagard.

  17. Diverse set of Turing nanopatterns coat corneae across insect lineages

    PubMed Central

    Blagodatski, Artem; Sergeev, Anton; Kryuchkov, Mikhail; Lopatina, Yuliya; Katanaev, Vladimir L.

    2015-01-01

    Nipple-like nanostructures covering the corneal surfaces of moths, butterflies, and Drosophila have been studied by electron and atomic force microscopy, and their antireflective properties have been described. In contrast, corneal nanostructures of the majority of other insect orders have either been unexamined or examined by methods that did not allow precise morphological characterization. Here we provide a comprehensive analysis of corneal surfaces in 23 insect orders, revealing a rich diversity of insect corneal nanocoatings. These nanocoatings are categorized into four major morphological patterns and various transitions between them, many, to our knowledge, never described before. Remarkably, this unexpectedly diverse range of the corneal nanostructures replicates the complete set of Turing patterns, thus likely being a result of processes similar to those modeled by Alan Turing in his famous reaction−diffusion system. These findings reveal a beautiful diversity of insect corneal nanostructures and shed light on their molecular origin and evolutionary diversification. They may also be the first-ever biological example of Turing nanopatterns. PMID:26307762

  18. On the Universality and Non-Universality of Spiking Neural P Systems With Rules on Synapses.

    PubMed

    Song, Tao; Xu, Jinbang; Pan, Linqiang

    2015-12-01

    Spiking neural P systems with rules on synapses are a new variant of spiking neural P systems. In these systems, the neurons contain only spikes, while the spiking/forgetting rules are moved onto the synapses. It was previously shown that such a system with 30 neurons (using extended spiking rules) or with 39 neurons (using standard spiking rules) is Turing universal. In this work, this number is improved to 6. Specifically, we construct a Turing universal spiking neural P system with rules on synapses having 6 neurons, which can generate any set of Turing computable natural numbers. It is also shown that spiking neural P systems with rules on synapses having at most two neurons are not Turing universal: i) such systems having one neuron can characterize the family of finite sets of natural numbers; ii) the family of sets of numbers generated by the systems having two neurons is included in the family of semi-linear sets of natural numbers.

  19. The fin-to-limb transition as the re-organization of a Turing pattern

    PubMed Central

    Onimaru, Koh; Marcon, Luciano; Musy, Marco; Tanaka, Mikiko; Sharpe, James

    2016-01-01

    A Turing mechanism implemented by BMP, SOX9 and WNT has been proposed to control mouse digit patterning. However, its generality and contribution to the morphological diversity of fins and limbs has not been explored. Here we provide evidence that the skeletal patterning of the catshark Scyliorhinus canicula pectoral fin is likely driven by a deeply conserved Bmp–Sox9–Wnt Turing network. In catshark fins, the distal nodular elements arise from a periodic spot pattern of Sox9 expression, in contrast to the stripe pattern in mouse digit patterning. However, our computer model shows that the Bmp–Sox9–Wnt network with altered spatial modulation can explain the Sox9 expression in catshark fins. Finally, experimental perturbation of Bmp or Wnt signalling in catshark embryos produces skeletal alterations which match in silico predictions. Together, our results suggest that the broad morphological diversity of the distal fin and limb elements arose from the spatial re-organization of a deeply conserved Turing mechanism. PMID:27211489

  20. On the Nature of Intelligence

    NASA Astrophysics Data System (ADS)

    Churchland, Paul M.

    Alan Turing is the consensus patron saint of the classical research program in Artificial Intelligence (AI), and his behavioral test for the possession of conscious intelligence has become his principal legacy in the mind of the academic public. Both takes are mistakes. That test is a dialectical throwaway line even for Turing himself, a tertiary gesture aimed at softening the intellectual resistance to a research program which, in his hands, possessed real substance, both mathematical and theoretical. The wrangling over his celebrated test has deflected attention away from those more substantial achievements, and away from the enduring obligation to construct a substantive theory of what conscious intelligence really is, as opposed to an epistemological account of how to tell when you are confronting an instance of it. This essay explores Turing's substantive research program on the nature of intelligence, and argues that the classical AI program is not its best expression, nor even the expression intended by Turing. It then attempts to put the famous Test into its proper, and much reduced, perspective.

  1. Spatiotemporal chaos involving wave instability.

    PubMed

    Berenstein, Igal; Carballido-Landeira, Jorge

    2017-01-01

    In this paper, we investigate pattern formation in a model of a reaction confined in a microemulsion, in a regime where both Turing and wave instability occur. In one-dimensional systems, the pattern corresponds to spatiotemporal intermittency where the behavior of the systems alternates in both time and space between stationary Turing patterns and traveling waves. In two-dimensional systems, the behavior initially may correspond to Turing patterns, which then turn into wave patterns. The resulting pattern also corresponds to a chaotic state, where the system alternates in both space and time between standing wave patterns and traveling waves, and the local dynamics may show vanishing amplitude of the variables.

  2. Spatiotemporal chaos involving wave instability

    NASA Astrophysics Data System (ADS)

    Berenstein, Igal; Carballido-Landeira, Jorge

    2017-01-01

    In this paper, we investigate pattern formation in a model of a reaction confined in a microemulsion, in a regime where both Turing and wave instability occur. In one-dimensional systems, the pattern corresponds to spatiotemporal intermittency where the behavior of the systems alternates in both time and space between stationary Turing patterns and traveling waves. In two-dimensional systems, the behavior initially may correspond to Turing patterns, which then turn into wave patterns. The resulting pattern also corresponds to a chaotic state, where the system alternates in both space and time between standing wave patterns and traveling waves, and the local dynamics may show vanishing amplitude of the variables.

  3. Competing Turing and Faraday Instabilities in Longitudinally Modulated Passive Resonators.

    PubMed

    Copie, François; Conforti, Matteo; Kudlinski, Alexandre; Mussot, Arnaud; Trillo, Stefano

    2016-04-08

    We experimentally investigate the interplay of Turing (modulational) and Faraday (parametric) instabilities in a bistable passive nonlinear resonator. The Faraday branch is induced via parametric resonance owing to a periodic modulation of the resonator dispersion. We show that the bistable switching dynamics is dramatically affected by the competition between the two instability mechanisms, which dictates two completely novel scenarios. At low detunings from resonance, switching occurs between the stable stationary lower branch and the Faraday-unstable upper branch, whereas at high detunings we observe the crossover between the Turing and Faraday periodic structures. The results are well explained in terms of the universal Lugiato-Lefever model.

  4. Extending Landauer's bound from bit erasure to arbitrary computation

    NASA Astrophysics Data System (ADS)

    Wolpert, David

    The minimal thermodynamic work required to erase a bit, known as Landauer's bound, has been extensively investigated both theoretically and experimentally. However, when viewed as a computation that maps inputs to outputs, bit erasure has a very special property: the output does not depend on the input. Existing analyses of thermodynamics of bit erasure implicitly exploit this property, and thus cannot be directly extended to analyze the computation of arbitrary input-output maps. Here we show how to extend these earlier analyses of bit erasure to analyze the thermodynamics of arbitrary computations. Doing this establishes a formal connection between the thermodynamics of computers and much of theoretical computer science. We use this extension to analyze the thermodynamics of the canonical ``general purpose computer'' considered in computer science theory: a universal Turing machine (UTM). We consider a UTM which maps input programs to output strings, where inputs are drawn from an ensemble of random binary sequences, and prove: i) The minimal work needed by a UTM to run some particular input program X and produce output Y is the Kolmogorov complexity of Y minus the log of the ``algorithmic probability'' of Y. This minimal amount of thermodynamic work has a finite upper bound, which is independent of the output Y, depending only on the details of the UTM. ii) The expected work needed by a UTM to compute some given output Y is infinite. As a corollary, the overall expected work to run a UTM is infinite. iii) The expected work needed by an arbitrary Turing machine T (not necessarily universal) to compute some given output Y can either be infinite or finite, depending on Y and the details of T. To derive these results we must combine ideas from nonequilibrium statistical physics with fundamental results from computer science, such as Levin's coding theorem and other theorems about universal computation. I would like to acknowledge the Santa Fe Institute, Grant No. 
TWCF0079/AB47 from the Templeton World Charity Foundation, Grant No. FQXi-RHl3-1349 from the FQXi foundation, and Grant No. CHE-1648973 from the U.S. National Science Foundation.
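
    Result (i) above can be stated compactly. The notation and sign convention below are my reading of the abstract (which gives no formulas): P_U is the algorithmic probability induced by a prefix-free UTM U, K_U is prefix Kolmogorov complexity, and since log2 P_U(Y) is negative, "complexity minus the magnitude of the log-probability" is the sum shown. The Y-independent upper bound the abstract mentions is then exactly Levin's coding theorem:

```latex
% Algorithmic probability of output Y under a prefix-free UTM U:
P_U(Y) \;=\; \sum_{x \,:\, U(x) = Y} 2^{-|x|}

% Minimal work to produce Y (result (i)), in units of k_B T \ln 2:
W_{\min}(Y) \;\propto\; K_U(Y) + \log_2 P_U(Y)

% Levin's coding theorem:
-\log_2 P_U(Y) \;\le\; K_U(Y) \;\le\; -\log_2 P_U(Y) + c_U
% hence
0 \;\le\; K_U(Y) + \log_2 P_U(Y) \;\le\; c_U ,
% a finite bound depending only on the UTM U, not on Y.
```

The lower bound holds because the shortest program for Y alone contributes 2^{-K_U(Y)} to P_U(Y); the upper bound (with machine-dependent constant c_U) is the nontrivial content of the coding theorem.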

  5. Computation and brain processes, with special reference to neuroendocrine systems.

    PubMed

    Toni, Roberto; Spaletta, Giulia; Casa, Claudia Della; Ravera, Simone; Sandri, Giorgio

    2007-01-01

    The development of neural networks and brain automata has made neuroscientists aware that the performance limits of these brain-like devices lie, at least in part, in their computational power. The computational basis of a standard cybernetic design, in fact, refers to that of a discrete and finite state machine or Turing Machine (TM). In contrast, it has been suggested that a number of human cerebral activities, from feedback controls up to mental processes, rely on a mixing of both finitary, digital-like and infinitary, continuous-like procedures. Therefore, the central nervous system (CNS) of man would exploit a form of computation going beyond that of a TM. This "non-conventional" computation has been called hybrid computation. Some basic structures for hybrid brain computation are believed to be the brain computational maps, in which both Turing-like (digital) computation and continuous (analog) forms of calculus might occur. The cerebral cortex and brain stem appear to be primary candidates for this processing. However, neuroendocrine structures like the hypothalamus are also believed to exhibit hybrid computational processes, and might give rise to computational maps. Current theories on neural activity, including wiring and volume transmission, neuronal group selection and dynamic evolving models of brain automata, lend support to the existence of natural hybrid computation, stressing a cooperation between discrete and continuous forms of communication in the CNS. In addition, the recent advent of neuromorphic chips, like those to restore activity in damaged retina and visual cortex, suggests that the assumption of a discrete-continuum polarity in designing biocompatible neural circuitries is crucial for their ensuing performance. In these bionic structures, in fact, a correspondence exists between the original anatomical architecture and the synthetic wiring of the chip, resulting in a correspondence between natural and cybernetic neural activity. 
Thus, chip "form" provides a continuum essential to chip "function". We conclude that it is reasonable to predict the existence of hybrid computational processes in the course of many human, brain integrating activities, urging development of cybernetic approaches based on this modelling for adequate reproduction of a variety of cerebral performances.

  6. Forging patterns and making waves from biology to geology: a commentary on Turing (1952) ‘The chemical basis of morphogenesis’

    PubMed Central

    Ball, Philip

    2015-01-01

    Alan Turing was neither a biologist nor a chemist, and yet the paper he published in 1952, ‘The chemical basis of morphogenesis’, on the spontaneous formation of patterns in systems undergoing reaction and diffusion of their ingredients has had a substantial impact on both fields, as well as in other areas as disparate as geomorphology and criminology. Motivated by the question of how a spherical embryo becomes a decidedly non-spherical organism such as a human being, Turing devised a mathematical model that explained how random fluctuations can drive the emergence of pattern and structure from initial uniformity. The spontaneous appearance of pattern and form in a system far away from its equilibrium state occurs in many types of natural process, and in some artificial ones too. It is often driven by very general mechanisms, of which Turing's model supplies one of the most versatile. For that reason, these patterns show striking similarities in systems that seem superficially to share nothing in common, such as the stripes of sand ripples and of pigmentation on a zebra skin. New examples of ‘Turing patterns' in biology and beyond are still being discovered today. This commentary was written to celebrate the 350th anniversary of the journal Philosophical Transactions of the Royal Society. PMID:25750229

  7. The super-Turing computational power of plastic recurrent neural networks.

    PubMed

    Cabessa, Jérémie; Siegelmann, Hava T

    2014-12-01

    We study the computational capabilities of a biologically inspired neural model where the synaptic weights, the connectivity pattern, and the number of neurons can evolve over time rather than stay static. Our study focuses on the mere concept of plasticity of the model, so the nature of the updates is assumed to be unconstrained. In this context, we show that the so-called plastic recurrent neural networks (RNNs) are capable of precisely the same super-Turing computational power as static analog neural networks, irrespective of whether their synaptic weights are modeled by rational or real numbers, and irrespective of whether their patterns of plasticity are restricted to bi-valued updates or expressed by any other more general form of updating. Consequently, the incorporation of only bi-valued plastic capabilities in a basic model of RNNs suffices to break the Turing barrier and achieve the super-Turing level of computation. The consideration of more general mechanisms of architectural plasticity or of real synaptic weights does not further increase the capabilities of the networks. These results support the claim that the general mechanism of plasticity is crucially involved in the computational and dynamical capabilities of biological neural networks. They further show that the super-Turing level of computation reflects in a suitable way the capabilities of brain-like models of computation.

  8. Effects of intrinsic stochasticity on delayed reaction-diffusion patterning systems.

    PubMed

    Woolley, Thomas E; Baker, Ruth E; Gaffney, Eamonn A; Maini, Philip K; Seirin-Lee, Sungrim

    2012-05-01

    Cellular gene expression is a complex process involving many steps, including the transcription of DNA and translation of mRNA; hence the synthesis of proteins requires a considerable amount of time, from ten minutes to several hours. Since diffusion-driven instability has been observed to be sensitive to perturbations in kinetic delays, the application of Turing patterning mechanisms to the problem of producing spatially heterogeneous differential gene expression has been questioned. In deterministic systems a small delay in the reactions can cause a large increase in the time it takes a system to pattern. Recently, it has been observed that in undelayed systems intrinsic stochasticity can cause pattern initiation to occur earlier than in the analogous deterministic simulations. Here we are interested in adding both stochasticity and delays to Turing systems in order to assess whether stochasticity can reduce the patterning time scale in delayed Turing systems. As analytical insights to this problem are difficult to attain and often limited in their use, we focus on stochastically simulating delayed systems. We consider four different Turing systems and two different forms of delay. Our results are mixed and lead to the conclusion that, although the sensitivity to delays in the Turing mechanism is not completely removed by the addition of intrinsic noise, the effects of the delays are clearly ameliorated in certain specific cases.

  9. Social Studies and Emerging Paradigms: Artificial Intelligence and Consciousness Education.

    ERIC Educational Resources Information Center

    Braun, Joseph A., Jr.

    1987-01-01

    Asks three questions: (1) Are machines capable of thinking as people do? (2) How is the thinking of computers similar and different from human thinking? and (3) What exactly is thinking? Examines research in artificial intelligence. Describes the theory and research of consciousness education and discusses an emerging paradigm for human thinking…

  10. Faith in the algorithm, part 1: beyond the turing test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez, Marko A; Pepe, Alberto

    2009-01-01

    Since the Turing test was first proposed by Alan Turing in 1950, the goal of artificial intelligence has been predicated on the ability for computers to imitate human intelligence. However, the majority of uses for the computer can be said to fall outside the domain of human abilities and it is exactly outside of this domain where computers have demonstrated their greatest contribution. Another definition for artificial intelligence is one that is not predicated on human mimicry, but instead, on human amplification, where the algorithms that are best at accomplishing this are deemed the most intelligent. This article surveys various systems that augment human and social intelligence.

  11. Object-Oriented Programming in High Schools the Turing Way.

    ERIC Educational Resources Information Center

    Holt, Richard C.

    This paper proposes an approach to introducing object-oriented concepts to high school computer science students using the Object-Oriented Turing (OOT) language. Students can learn about basic object-oriented (OO) principles such as classes and inheritance by using and expanding a collection of classes that draw pictures like circles and happy…

  12. Cultivating Critique: A (Humanoid) Response to the Online Teaching of Critical Thinking

    ERIC Educational Resources Information Center

    Waggoner, Matt

    2013-01-01

    The Turing era, defined by British mathematician and computer science pioneer Alan Turing's question about whether or not computers can think, is not over. Philosophers and scientists will continue to haggle over whether thought necessitates intentionality, and whether computation can rise to that level. Meanwhile, another frontier is emerging in…

  13. High-throughput mathematical analysis identifies Turing networks for patterning with equally diffusing signals.

    PubMed

    Marcon, Luciano; Diego, Xavier; Sharpe, James; Müller, Patrick

    2016-04-08

    The Turing reaction-diffusion model explains how identical cells can self-organize to form spatial patterns. It has been suggested that extracellular signaling molecules with different diffusion coefficients underlie this model, but the contribution of cell-autonomous signaling components is largely unknown. We developed an automated mathematical analysis to derive a catalog of realistic Turing networks. This analysis reveals that in the presence of cell-autonomous factors, networks can form a pattern with equally diffusing signals and even for any combination of diffusion coefficients. We provide software (available at http://www.RDNets.com) to explore these networks and to constrain topologies with qualitative and quantitative experimental data. We use the software to examine the self-organizing networks that control embryonic axis specification and digit patterning. Finally, we demonstrate how existing synthetic circuits can be extended with additional feedbacks to form Turing reaction-diffusion systems. Our study offers a new theoretical framework to understand multicellular pattern formation and enables the widespread use of mathematical biology to engineer synthetic patterning systems.
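
    The core of this kind of network screening is a linear stability test: a network is Turing unstable if its reaction Jacobian J is stable at wavenumber zero but J - q*D (q = wavenumber squared, D = diagonal diffusion matrix) acquires an eigenvalue with positive real part at some q > 0. The sketch below illustrates the principle with hypothetical Jacobians of my own choosing (not taken from the paper or from RDNets): a classic two-species pair, which needs unequal diffusion, and a three-node network with a non-diffusing (cell-autonomous) factor, which is Turing unstable even when the two mobile species diffuse equally:

```python
import numpy as np

def max_growth_rate(J, D, qs):
    """Largest real part of eig(J - q*D) over wavenumbers-squared q.

    J: reaction Jacobian at the homogeneous steady state.
    D: diagonal matrix of diffusion coefficients.
    A positive return value means diffusion-driven (Turing) instability.
    """
    return max(np.linalg.eigvals(J - q * D).real.max() for q in qs)

qs = np.linspace(0.0, 10.0, 201)

# Two-species activator-inhibitor Jacobian (Schnakenberg-like values).
J2 = np.array([[0.8, 1.0], [-1.8, -1.0]])
print(max_growth_rate(J2, np.diag([1.0, 40.0]), qs) > 0)  # True: unequal diffusion
print(max_growth_rate(J2, np.diag([1.0, 1.0]), qs) > 0)   # False: equal diffusion

# Three-node network with a cell-autonomous (non-diffusing) third factor.
# Hypothetical Jacobian, chosen so the q = 0 state is stable; the
# self-activating immobile node (J3[2, 2] > 0) makes it Turing unstable
# even with equal diffusivities for the two mobile species. A known
# caveat: with a non-diffusing node the unstable band extends to
# arbitrarily large q, so wavelength selection needs extra ingredients.
J3 = np.array([[-2.0, -1.0, 2.0],
               [ 1.0, -2.0, 0.0],
               [-2.0,  0.0, 0.5]])
print(max_growth_rate(J3, np.diag([1.0, 1.0, 0.0]), qs) > 0)  # True
```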

  14. High-throughput mathematical analysis identifies Turing networks for patterning with equally diffusing signals

    PubMed Central

    Marcon, Luciano; Diego, Xavier; Sharpe, James; Müller, Patrick

    2016-01-01

    The Turing reaction-diffusion model explains how identical cells can self-organize to form spatial patterns. It has been suggested that extracellular signaling molecules with different diffusion coefficients underlie this model, but the contribution of cell-autonomous signaling components is largely unknown. We developed an automated mathematical analysis to derive a catalog of realistic Turing networks. This analysis reveals that in the presence of cell-autonomous factors, networks can form a pattern with equally diffusing signals and even for any combination of diffusion coefficients. We provide software (available at http://www.RDNets.com) to explore these networks and to constrain topologies with qualitative and quantitative experimental data. We use the software to examine the self-organizing networks that control embryonic axis specification and digit patterning. Finally, we demonstrate how existing synthetic circuits can be extended with additional feedbacks to form Turing reaction-diffusion systems. Our study offers a new theoretical framework to understand multicellular pattern formation and enables the widespread use of mathematical biology to engineer synthetic patterning systems. DOI: http://dx.doi.org/10.7554/eLife.14022.001 PMID:27058171

  15. Supervised Machine Learning for Population Genetics: A New Paradigm

    PubMed Central

    Schrider, Daniel R.; Kern, Andrew D.

    2018-01-01

    As population genomic datasets grow in size, researchers are faced with the daunting task of making sense of a flood of information. To keep pace with this explosion of data, computational methodologies for population genetic inference are rapidly being developed to best utilize genomic sequence data. In this review we discuss a new paradigm that has emerged in computational population genomics: that of supervised machine learning (ML). We review the fundamentals of ML, discuss recent applications of supervised ML to population genetics that outperform competing methods, and describe promising future directions in this area. Ultimately, we argue that supervised ML is an important and underutilized tool that has considerable potential for the world of evolutionary genomics. PMID:29331490
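
    The workflow this review describes — simulate labeled data under competing evolutionary scenarios, train a classifier, evaluate on held-out simulations — can be sketched in a few lines. Everything below is a deliberate toy stand-in of my own: Gaussian vectors play the role of population-genetic summary statistics, and a nearest-centroid rule plays the role of the learner; real pipelines use coalescent simulators and richer models:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(label, n):
    """Toy stand-in for a population-genetic simulator: each row is a
    vector of five summary statistics whose distribution differs
    between two hypothetical demographic scenarios (labels 0 and 1)."""
    shift = 0.8 if label == 1 else 0.0
    return rng.normal(loc=shift, scale=1.0, size=(n, 5))

# Labeled training data: 500 simulations per scenario.
X_train = np.vstack([simulate(0, 500), simulate(1, 500)])
y_train = np.array([0] * 500 + [1] * 500)

# Nearest-centroid classifier: about the simplest supervised learner.
centroids = np.stack([X_train[y_train == k].mean(axis=0) for k in (0, 1)])

def predict(X):
    # Squared distance from each sample to each class centroid.
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

# Held-out simulations measure how well the scenarios can be told apart.
X_test = np.vstack([simulate(0, 200), simulate(1, 200)])
y_test = np.array([0] * 200 + [1] * 200)
acc = (predict(X_test) == y_test).mean()
print(acc > 0.7)  # True: well above the 0.5 chance level
```

The same train-on-simulations, test-on-data structure carries over unchanged when the simulator and classifier are replaced by serious ones.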

  16. Hardware Development and Locomotion Control Strategy for an Over-Ground Gait Trainer: NaTUre-Gaits.

    PubMed

    Luu, Trieu Phat; Low, Kin Huat; Qu, Xingda; Lim, Hup Boon; Hoon, Kay Hiang

    2014-01-01

    Therapist-assisted body weight supported (TABWS) gait rehabilitation was introduced two decades ago. The benefit of TABWS in functional recovery of walking in spinal cord injury and stroke patients has been demonstrated and reported. However, a shortage of therapists, labor-intensiveness, and the short duration of training are some limitations of this approach. To overcome these deficiencies, robotic-assisted gait rehabilitation systems have been suggested. These systems have gained attention from researchers and clinical practitioners in recent years. To achieve the same objective, an over-ground gait rehabilitation system, NaTUre-gaits, was developed at the Nanyang Technological University. The design was based on a clinical approach to provide four main features: pelvic motion, body weight support, over-ground walking experience, and lower limb assistance. These features can be achieved by the three main modules of NaTUre-gaits: 1) a pelvic assistance mechanism, 2) a mobile platform, and 3) a robotic orthosis. Predefined gait patterns are required for a robotic-assisted system to follow. In this paper, the gait pattern planning for NaTUre-gaits was accomplished by an individual-specific gait pattern prediction model. The model generates gait patterns that resemble the natural gait patterns of the targeted subjects. The features of NaTUre-gaits have been demonstrated by walking trials with several subjects. The trials have been evaluated by therapists and doctors. The results show that a 10-m walking trial can be conducted with a reduction in manpower. The task-specific repetitive training approach and natural walking gait patterns were also successfully achieved.

  17. Nonlinear Chemical Dynamics and Synchronization

    NASA Astrophysics Data System (ADS)

    Li, Ning

    Alan Turing's work on morphogenesis, more than half a century ago, continues to motivate and inspire theoretical and experimental biologists even today. That said, there are very few experimental systems for which Turing's theory is applicable. In this thesis we present an experimental reaction-diffusion system ideally suited for testing Turing's ideas in synthetic "cells" consisting of microfluidically produced surfactant-stabilized emulsions in which droplets containing the Belousov-Zhabotinsky (BZ) oscillatory chemical reactants are dispersed in oil. The BZ reaction has become the prototype of nonlinear dynamics in chemistry and a preferred system for exploring the behavior of coupled nonlinear oscillators. Our system consists of a surfactant stabilized monodisperse emulsion of drops of aqueous BZ solution dispersed in a continuous phase of oil. In contrast to biology, here the chemistry is understood, rate constants are measured and interdrop coupling is purely diffusive. We explore a large set of parameters through control of rate constants, drop size, spacing, and spatial arrangement of the drops in lines and rings in one-dimension (1D) and hexagonal arrays in two-dimensions (2D). The Turing model is regarded as a metaphor for morphogenesis in biology but not for prediction. Here, we develop a quantitative and falsifiable reaction-diffusion model that we experimentally test with synthetic cells. We quantitatively establish the extent to which the Turing model in 1D describes both stationary pattern formation and temporal synchronization of chemical oscillators via reaction-diffusion and in 2D demonstrate that chemical morphogenesis drives physical differentiation in synthetic cells.

  18. Hardware Development and Locomotion Control Strategy for an Over-Ground Gait Trainer: NaTUre-Gaits

    PubMed Central

    Low, Kin Huat; Qu, Xingda; Lim, Hup Boon; Hoon, Kay Hiang

    2014-01-01

Therapist-assisted body weight supported (TABWS) gait rehabilitation was introduced two decades ago. The benefit of TABWS in functional recovery of walking in spinal cord injury and stroke patients has been demonstrated and reported. However, a shortage of therapists, labor-intensiveness, and the short duration of training are some limitations of this approach. To overcome these deficiencies, robotic-assisted gait rehabilitation systems have been suggested, and these systems have gained attention from researchers and clinical practitioners in recent years. To achieve the same objective, an over-ground gait rehabilitation system, NaTUre-gaits, was developed at the Nanyang Technological University. The design was based on a clinical approach to provide four main features: pelvic motion, body weight support, over-ground walking experience, and lower limb assistance. These features are achieved by three main modules of NaTUre-gaits: 1) a pelvic assistance mechanism, 2) a mobile platform, and 3) a robotic orthosis. A robotic-assisted system requires predefined gait patterns to follow. In this paper, the gait pattern planning for NaTUre-gaits was accomplished by an individual-specific gait pattern prediction model, which generates gait patterns that resemble the natural gait patterns of the targeted subjects. The features of NaTUre-gaits have been demonstrated in walking trials with several subjects, evaluated by therapists and doctors. The results show that the 10-m walking trial was completed with a reduction in manpower, and that the task-specific repetitive training approach and natural walking gait patterns were successfully achieved. PMID:27170876

  19. A new necessary condition for Turing instabilities.

    PubMed

    Elragig, Aiman; Townley, Stuart

    2012-09-01

Reactivity (a.k.a. initial growth) is necessary for diffusion-driven instability (Turing instability). Using the notion of a common Lyapunov function, we show that this necessary condition is a special case of a more powerful (i.e., tighter) necessary condition. Specifically, we show that if the linearised reaction matrix and the diffusion matrix share a common Lyapunov function, then Turing instability is not possible. The existence of common Lyapunov functions is readily checked using semi-definite programming. We apply this result to the Gierer-Meinhardt system modelling regenerative properties of Hydra, to the Oregonator, to a host-parasite-hyperparasite system with diffusion, and to a reaction-diffusion-chemotaxis model for a multi-species host-parasitoid community. Copyright © 2012 Elsevier Inc. All rights reserved.
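    The diffusion-driven instability the paper tightens conditions for can be checked directly from the linearisation: a steady state that is stable without diffusion (every eigenvalue of the reaction Jacobian J has negative real part) becomes Turing unstable when J − k²D acquires an eigenvalue with positive real part for some wavenumber k. A minimal sketch in Python; the 2×2 matrices below are illustrative, not taken from the paper:

    ```python
    import cmath

    def eig2(m):
        """Eigenvalues of a 2x2 matrix [[a, b], [c, d]] via the characteristic polynomial."""
        (a, b), (c, d) = m
        tr, det = a + d, a * d - b * c
        disc = cmath.sqrt(tr * tr - 4 * det)
        return (tr + disc) / 2, (tr - disc) / 2

    def growth_rate(J, D, k2):
        """Largest real part of the eigenvalues of J - k^2 * D (the dispersion relation)."""
        M = [[J[i][j] - k2 * D[i][j] for j in range(2)] for i in range(2)]
        return max(lam.real for lam in eig2(M))

    # Illustrative activator-inhibitor linearisation:
    J = [[1.0, -1.0], [3.0, -2.0]]   # trace < 0, det > 0: stable at k = 0
    D = [[1.0, 0.0], [0.0, 10.0]]    # inhibitor diffuses 10x faster than activator

    rates = [growth_rate(J, D, k2 / 100) for k2 in range(0, 201)]
    print(growth_rate(J, D, 0.0))    # -0.5: homogeneous steady state is stable
    print(max(rates) > 0)            # True: a band of wavenumbers grows -> Turing instability
    ```

    The paper's point is that such instability is ruled out whenever J and D admit a common Lyapunov function, a condition this brute-force scan does not test but a semi-definite program can.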

  20. Doing Justice to the Imitation Game

    NASA Astrophysics Data System (ADS)

    Lassègue, Jean

My claim in this article is that the 1950 paper in which Turing describes the world-famous set-up of the Imitation Game is much richer and more intriguing than the formalist ersatz coined in the early 1970s under the name "Turing Test". Therefore, doing justice to the Imitation Game implies showing, first, that the formalist interpretation misses some crucial points in Turing's line of thought and, second, that the 1950 paper should not be understood as the Magna Charta of strong Artificial Intelligence (AI) but as a work in progress focused on the notion of Form. This has unexpected consequences for the status of Mind and, from a more general point of view, for the way we interpret the notions of Science and Language.

  1. Ischemia and reperfusion injury in renal transplantation: hemodynamic and immunological paradigms

    PubMed Central

    Requião-Moura, Lúcio Roberto; Durão, Marcelino de Souza; de Matos, Ana Cristina Carvalho; Pacheco-Silva, Alvaro

    2015-01-01

Ischemia and reperfusion injury is an inevitable event in renal transplantation. The most important consequences are delayed graft function, longer length of stay, higher hospital costs, a higher risk of acute rejection, and a negative impact on long-term follow-up. Many factors are involved in its pathophysiology and, for educational purposes, can be classified into two different paradigms: hemodynamic and immune. The hemodynamic paradigm is described as the reduction of oxygen delivery due to blood flow interruption, involving many hormone systems and the oxygen-free radicals produced after reperfusion. The immune paradigm has been described more recently and involves immune system cells, especially T cells, which play a central role in this injury. Following these concepts, new strategies to prevent ischemia and reperfusion injury have been studied, particularly more physiological forms of storing the kidney, such as the pump machine, and the use of antilymphocyte antibody therapy before reperfusion. Pump machine perfusion reduces the prevalence of delayed graft function and the length of hospital stay, and increases long-term graft survival. The use of antilymphocyte antibody therapy before reperfusion, such as Thymoglobulin™, can reduce the prevalence of delayed graft function and chronic graft dysfunction. PMID:25993079

  2. Paradigms and nursing management, analysis of the current organizational structure in a large hospital.

    PubMed

    Wilson, D

    1992-01-01

Hospitals developed over the period when positivism became the predominant world view. Positivism was founded on four Western trends: the preponderance of hierarchy and autocracy, the popularization of bureaucracy, the extensive application of a machine orientation to work, and the predominance of "scientific" inquiry. Organizational theory developed largely from quantitative research findings arising from a positivistic world view. A case study analyzing the current nursing organizational structure at one large hospital is presented. Nursing management was found to be based upon the positivistic paradigm; the predominance of a machine orientation and an autocratic, bureaucratic structure are evidence of this. A change to shared governance had been attempted, indicating a shift to a more modern organizational structure based on a different paradigm. The article concludes by emphasizing that managers are largely responsible for facilitating change: change that will meet internal human resource needs and address the cost-effectiveness crises of hospitals today through more effective use of human resources.

  3. Use of Intraoperative Temporary Invasive Distraction to Reduce a Chronic Talar Neck Fracture-Dislocation

    DTIC Science & Technology

    2011-04-01

tures. J Orthop Trauma. 2004;18(5):265-270. 2. Metzger M, Levin J, Clancy J. Talar neck fractures and rates of avascular necrosis. J Foot Ankle Surg...of the talus.4 Given the risk for osteonecrosis with talar neck fractures, early operative intervention is considered the standard of care.5

  4. A Graph-Based Impact Metric for Mitigating Lateral Movement Cyber Attacks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Purvine, Emilie AH; Johnson, John R.; Lo, Chaomei

Most cyber network attacks begin with an adversary gaining a foothold within the network and proceed with lateral movement until a desired goal is achieved. The mechanism by which lateral movement occurs varies, but the basic signature of hopping between hosts by exploiting vulnerabilities is the same. Because of the nature of the vulnerabilities typically exploited, lateral movement is very difficult to detect and defend against. In this paper we define a dynamic reachability graph model of the network to discover possible paths that an adversary could take using different vulnerabilities, and how those paths evolve over time. We use this reachability graph to develop dynamic machine-level and network-level impact scores. Lateral movement mitigation strategies which make use of our impact scores are also discussed, and we detail an example using a freely available data set.

  5. The challenge of computer mathematics.

    PubMed

    Barendregt, Henk; Wiedijk, Freek

    2005-10-15

Progress in the foundations of mathematics has made it possible to formulate all thinkable mathematical concepts, algorithms and proofs in one language and in an impeccable way. This is not in spite of, but partially based on, the famous results of Gödel and Turing. In this way statements are about mathematical objects and algorithms, proofs show the correctness of statements and computations, and computations deal with objects and proofs. Interactive computer systems for a full integration of defining, computing and proving are based on this. The human defines concepts, constructs algorithms and provides proofs, while the machine checks that the definitions are well formed and that the proofs and computations are correct. Results formalized so far demonstrate the feasibility of this 'computer mathematics'. There are also very good applications. The challenge is to make the systems more mathematician-friendly, by building libraries and tools. The eventual goal is to help humans to learn, develop, communicate, referee and apply mathematics.
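    The division of labour described here, where the human supplies definitions and proofs while the machine checks them, is what a modern proof assistant mechanises. As a toy illustration (in Lean 4, chosen only as one such system), a theorem statement together with a proof term that the kernel verifies:

    ```lean
    -- The human states the theorem and supplies the proof term;
    -- the Lean kernel checks that both are well formed and correct.
    theorem my_add_comm (m n : Nat) : m + n = n + m :=
      Nat.add_comm m n
    ```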

  6. Visualization and characterization of individual type III protein secretion machines in live bacteria

    PubMed Central

    Lara-Tejero, María; Bewersdorf, Jörg; Galán, Jorge E.

    2017-01-01

    Type III protein secretion machines have evolved to deliver bacterially encoded effector proteins into eukaryotic cells. Although electron microscopy has provided a detailed view of these machines in isolation or fixed samples, little is known about their organization in live bacteria. Here we report the visualization and characterization of the Salmonella type III secretion machine in live bacteria by 2D and 3D single-molecule switching superresolution microscopy. This approach provided access to transient components of this machine, which previously could not be analyzed. We determined the subcellular distribution of individual machines, the stoichiometry of the different components of this machine in situ, and the spatial distribution of the substrates of this machine before secretion. Furthermore, by visualizing this machine in Salmonella mutants we obtained major insights into the machine’s assembly. This study bridges a major resolution gap in the visualization of this nanomachine and may serve as a paradigm for the examination of other bacterially encoded molecular machines. PMID:28533372

  7. A new learning paradigm: learning using privileged information.

    PubMed

    Vapnik, Vladimir; Vashist, Akshay

    2009-01-01

In the Afterword to the second edition of the book "Estimation of Dependences Based on Empirical Data" by V. Vapnik, an advanced learning paradigm called Learning Using Hidden Information (LUHI) was introduced. This Afterword also suggested an extension of the SVM method (the so-called SVM(gamma)+ method) to implement algorithms which address the LUHI paradigm (Vapnik, 1982-2006, Sections 2.4.2 and 2.5.3 of the Afterword); see also (Vapnik, Vashist, & Pavlovitch, 2008, 2009) for further development of the algorithms. In contrast to the existing machine learning paradigm, where a teacher does not play an important role, the advanced learning paradigm considers some elements of human teaching. In the new paradigm, along with examples, a teacher can provide students with hidden information that exists in explanations, comments, comparisons, and so on. This paper discusses details of the new paradigm and the corresponding algorithms, introduces some new algorithms, considers several specific forms of privileged information, demonstrates the superiority of the new learning paradigm over the classical learning paradigm when solving practical problems, and discusses general questions related to the new ideas.

  8. Emergent structures in reaction-advection-diffusion systems on a sphere.

    PubMed

    Krause, Andrew L; Burton, Abigail M; Fadai, Nabil T; Van Gorder, Robert A

    2018-04-01

    We demonstrate unusual effects due to the addition of advection into a two-species reaction-diffusion system on the sphere. We find that advection introduces emergent behavior due to an interplay of the traditional Turing patterning mechanisms with the compact geometry of the sphere. Unidirectional advection within the Turing space of the reaction-diffusion system causes patterns to be generated at one point of the sphere, and transported to the antipodal point where they are destroyed. We illustrate these effects numerically and deduce conditions for Turing instabilities on local projections to understand the mechanisms behind these behaviors. We compare this behavior to planar advection which is shown to only transport patterns across the domain. Analogous transport results seem to hold for the sphere under azimuthal transport or away from the antipodal points in unidirectional flow regimes.

  9. Emergent structures in reaction-advection-diffusion systems on a sphere

    NASA Astrophysics Data System (ADS)

    Krause, Andrew L.; Burton, Abigail M.; Fadai, Nabil T.; Van Gorder, Robert A.

    2018-04-01

    We demonstrate unusual effects due to the addition of advection into a two-species reaction-diffusion system on the sphere. We find that advection introduces emergent behavior due to an interplay of the traditional Turing patterning mechanisms with the compact geometry of the sphere. Unidirectional advection within the Turing space of the reaction-diffusion system causes patterns to be generated at one point of the sphere, and transported to the antipodal point where they are destroyed. We illustrate these effects numerically and deduce conditions for Turing instabilities on local projections to understand the mechanisms behind these behaviors. We compare this behavior to planar advection which is shown to only transport patterns across the domain. Analogous transport results seem to hold for the sphere under azimuthal transport or away from the antipodal points in unidirectional flow regimes.

  10. Spiking Neural P Systems With Rules on Synapses Working in Maximum Spiking Strategy.

    PubMed

    Tao Song; Linqiang Pan

    2015-06-01

Spiking neural P systems (SN P systems, for short) are a class of parallel and distributed neural-like computation models inspired by the way neurons process information and communicate with each other by means of impulses or spikes. In this work, we introduce a new variant of SN P systems, called SN P systems with rules on synapses working in maximum spiking strategy, and investigate the computational power of these systems as both number and vector generators. Specifically, we prove that: i) if no limit is imposed on the number of spikes in any neuron during any computation, such systems can generate the sets of Turing computable natural numbers and the sets of vectors of positive integers computed by k-output register machines; ii) if an upper bound is imposed on the number of spikes in each neuron during any computation, such systems characterize the semi-linear sets of natural numbers as number generating devices; as vector generating devices, such systems can only characterize the family of sets of vectors computed by sequential monotonic counter machines, which is strictly included in the family of semi-linear sets of vectors. This gives a positive answer to the problem formulated in Song et al., Theor. Comput. Sci., vol. 529, pp. 82-95, 2014.

  11. The "extended mind" approach for a new paradigm of psychology.

    PubMed

    Kono, Tetsuya

    2010-12-01

In this paper, I would like to propose the idea of the "extended mind" as a new paradigm for psychology. Kohler (Integrative Psychology & Behavioral Science 44:39-57, 2010) correctly pointed out the serious problems of the machine paradigm and proposed the "organic" view as a new paradigm. But the term "organic", signifying the processes inside the body, is inadequate to express the characteristics of the human mind. Recent philosophy of mind suggests that the mind is realized neither only in the brain nor only in the body, but in the whole system of brain-body-environment, namely, in the "extended mind". The characteristic of the human mind resides in its interaction with mediating tools, artifacts, and the humanized environment. We should propose an "extended mind approach", or an "ecological approach to the humanized environment", as a new paradigm for psychology.

  12. MapReduce SVM Game

    DOE PAGES

    Vineyard, Craig M.; Verzi, Stephen J.; James, Conrad D.; ...

    2015-08-10

Despite technological advances making computing devices faster, smaller, and more prevalent in today's age, data generation and collection has outpaced data processing capabilities. Simply having more compute platforms does not provide a means of addressing challenging problems in the big data era. Rather, alternative processing approaches are needed, and the application of machine learning to big data is hugely important. The MapReduce programming paradigm is an alternative to conventional supercomputing approaches that requires problem decompositions with less stringent data-passing constraints. MapReduce relies upon defining a means of partitioning the desired problem so that subsets may be computed independently and recombined to yield the net desired result. However, not all machine learning algorithms are amenable to such an approach. Game-theoretic algorithms are often innately distributed, consisting of local interactions between players without requiring a central authority, and are iterative by nature rather than requiring extensive retraining. Effectively, a game-theoretic approach to machine learning is well suited for the MapReduce paradigm and provides a novel, alternative perspective on addressing the big data problem. In this paper we present a variant of our Support Vector Machine (SVM) Game classifier which may be used in a distributed manner, and show an illustrative example of applying this algorithm.
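    The partition-compute-recombine decomposition described above can be sketched in a few lines of plain Python (a toy illustration of the paradigm, not the authors' SVM Game code): each "map" task summarises its shard independently, and a single "reduce" merges the partial results.

    ```python
    from functools import reduce

    def map_task(shard):
        """Each mapper summarises its shard independently (here: partial sum and count)."""
        return (sum(shard), len(shard))

    def reduce_task(a, b):
        """Recombine partial results; the order of combination does not matter."""
        return (a[0] + b[0], a[1] + b[1])

    data = [4.0, 8.0, 15.0, 16.0, 23.0, 42.0]
    shards = [data[0:2], data[2:4], data[4:6]]       # the problem is partitioned ...
    partials = [map_task(s) for s in shards]         # ... subsets computed independently ...
    total, count = reduce(reduce_task, partials)     # ... and recombined

    print(total / count)   # 18.0, the same mean as computing on all the data at once
    ```

    An SVM Game classifier fits this shape because its player interactions are local and iterative; the hard part, which the paper addresses, is choosing a partition whose recombined result equals the undistributed one.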

  13. MapReduce SVM Game

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vineyard, Craig M.; Verzi, Stephen J.; James, Conrad D.

Despite technological advances making computing devices faster, smaller, and more prevalent in today's age, data generation and collection has outpaced data processing capabilities. Simply having more compute platforms does not provide a means of addressing challenging problems in the big data era. Rather, alternative processing approaches are needed, and the application of machine learning to big data is hugely important. The MapReduce programming paradigm is an alternative to conventional supercomputing approaches that requires problem decompositions with less stringent data-passing constraints. MapReduce relies upon defining a means of partitioning the desired problem so that subsets may be computed independently and recombined to yield the net desired result. However, not all machine learning algorithms are amenable to such an approach. Game-theoretic algorithms are often innately distributed, consisting of local interactions between players without requiring a central authority, and are iterative by nature rather than requiring extensive retraining. Effectively, a game-theoretic approach to machine learning is well suited for the MapReduce paradigm and provides a novel, alternative perspective on addressing the big data problem. In this paper we present a variant of our Support Vector Machine (SVM) Game classifier which may be used in a distributed manner, and show an illustrative example of applying this algorithm.

  14. European Science Notes Information Bulletin Reports on Current European/Middle Eastern Science,

    DTIC Science & Technology

    1989-07-01

behavior at high rates of strain, and composite materials at high rates of strain. ESNIB 89-07 International Conference on Interaction of Steels with... drug mole...armacology... ture will be the sterility, energy and mass transfer, shear...cults possess N-alkyl functions, usually in saturated structures...therapeutic agents. This is usually...cell densities and high metabolically active cells, the...achieved by N-dealkylating the parent drug molecule to

  15. Robust stochastic Turing patterns in the development of a one-dimensional cyanobacterial organism.

    PubMed

    Di Patti, Francesca; Lavacchi, Laura; Arbel-Goren, Rinat; Schein-Lubomirsky, Leora; Fanelli, Duccio; Stavans, Joel

    2018-05-01

    Under nitrogen deprivation, the one-dimensional cyanobacterial organism Anabaena sp. PCC 7120 develops patterns of single, nitrogen-fixing cells separated by nearly regular intervals of photosynthetic vegetative cells. We study a minimal, stochastic model of developmental patterns in Anabaena that includes a nondiffusing activator, two diffusing inhibitor morphogens, demographic fluctuations in the number of morphogen molecules, and filament growth. By tracking developing filaments, we provide experimental evidence for different spatiotemporal roles of the two inhibitors during pattern maintenance and for small molecular copy numbers, justifying a stochastic approach. In the deterministic limit, the model yields Turing patterns within a region of parameter space that shrinks markedly as the inhibitor diffusivities become equal. Transient, noise-driven, stochastic Turing patterns are produced outside this region, which can then be fixed by downstream genetic commitment pathways, dramatically enhancing the robustness of pattern formation, also in the biologically relevant situation in which the inhibitors' diffusivities may be comparable.

  16. Turing pattern dynamics and adaptive discretization for a super-diffusive Lotka-Volterra model.

    PubMed

    Bendahmane, Mostafa; Ruiz-Baier, Ricardo; Tian, Canrong

    2016-05-01

    In this paper we analyze the effects of introducing the fractional-in-space operator into a Lotka-Volterra competitive model describing population super-diffusion. First, we study how cross super-diffusion influences the formation of spatial patterns: a linear stability analysis is carried out, showing that cross super-diffusion triggers Turing instabilities, whereas classical (self) super-diffusion does not. In addition we perform a weakly nonlinear analysis yielding a system of amplitude equations, whose study shows the stability of Turing steady states. A second goal of this contribution is to propose a fully adaptive multiresolution finite volume method that employs shifted Grünwald gradient approximations, and which is tailored for a larger class of systems involving fractional diffusion operators. The scheme is aimed at efficient dynamic mesh adaptation and substantial savings in computational burden. A numerical simulation of the model was performed near the instability boundaries, confirming the behavior predicted by our analysis.
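    The shifted Grünwald approximation named in the abstract discretises a fractional derivative of order α as a weighted sum of shifted samples, with weights g_k = (−1)^k · C(α, k). A short sketch of the standard weight recurrence (illustrative, not the authors' full finite volume scheme):

    ```python
    def grunwald_weights(alpha, n):
        """Weights g_k = (-1)^k * binom(alpha, k), generated by the recurrence
        g_0 = 1,  g_k = g_{k-1} * (1 - (alpha + 1) / k)."""
        w = [1.0]
        for k in range(1, n):
            w.append(w[-1] * (1.0 - (alpha + 1.0) / k))
        return w

    # For integer alpha the weights collapse to the classical difference stencil:
    print(grunwald_weights(2.0, 4))   # [1.0, -2.0, 1.0, 0.0] -- the second-difference stencil
    # For fractional alpha the stencil has infinitely many nonzero, slowly decaying weights:
    print(grunwald_weights(0.5, 4))   # [1.0, -0.5, -0.125, -0.0625]
    ```

    The "shifted" variant used in such schemes applies these weights at grid points offset by one node, which is what restores stability for 1 < α ≤ 2.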

  17. Periodic waves of the Lugiato-Lefever equation at the onset of Turing instability.

    PubMed

Delcey, Lucie; Haragus, Mariana

    2018-04-13

We study the existence and the stability of periodic steady waves for a nonlinear model, the Lugiato-Lefever equation, arising in optics. Starting from a detailed description of the stability properties of constant solutions, we then focus on the periodic steady waves which bifurcate at the onset of Turing instability. Using a centre manifold reduction, we analyse these Turing bifurcations, and prove the existence of periodic steady waves. This approach also allows us to conclude on the nonlinear orbital stability of these waves for co-periodic perturbations, i.e. for periodic perturbations which have the same period as the wave. This stability result is completed by a spectral stability result for general bounded perturbations. In particular, this spectral analysis shows that instabilities are always due to co-periodic perturbations. This article is part of the theme issue 'Stability of nonlinear waves and patterns and related topics'. © 2018 The Author(s).

  18. Automated annotation of functional imaging experiments via multi-label classification

    PubMed Central

    Turner, Matthew D.; Chakrabarti, Chayan; Jones, Thomas B.; Xu, Jiawei F.; Fox, Peter T.; Luger, George F.; Laird, Angela R.; Turner, Jessica A.

    2013-01-01

    Identifying the experimental methods in human neuroimaging papers is important for grouping meaningfully similar experiments for meta-analyses. Currently, this can only be done by human readers. We present the performance of common machine learning (text mining) methods applied to the problem of automatically classifying or labeling this literature. Labeling terms are from the Cognitive Paradigm Ontology (CogPO), the text corpora are abstracts of published functional neuroimaging papers, and the methods use the performance of a human expert as training data. We aim to replicate the expert's annotation of multiple labels per abstract identifying the experimental stimuli, cognitive paradigms, response types, and other relevant dimensions of the experiments. We use several standard machine learning methods: naive Bayes (NB), k-nearest neighbor, and support vector machines (specifically SMO or sequential minimal optimization). Exact match performance ranged from only 15% in the worst cases to 78% in the best cases. NB methods combined with binary relevance transformations performed strongly and were robust to overfitting. This collection of results demonstrates what can be achieved with off-the-shelf software components and little to no pre-processing of raw text. PMID:24409112
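    The binary relevance transformation that performed well here is simple to state: an m-label problem becomes m independent binary problems, one per label, each asking "does this abstract carry label L?". A minimal sketch in Python (the CogPO-style labels and documents below are invented for illustration; any single-label classifier, such as naive Bayes, could then be trained per label):

    ```python
    def binary_relevance(docs, all_labels):
        """Turn one multi-label dataset into one binary dataset per label."""
        return {
            label: [(text, label in labels) for text, labels in docs]
            for label in all_labels
        }

    # Hypothetical annotated abstracts: (text, set of labels)
    docs = [
        ("n-back task with visual stimuli", {"visual", "n-back"}),
        ("auditory oddball paradigm",       {"auditory", "oddball"}),
    ]
    problems = binary_relevance(docs, {"visual", "auditory", "n-back", "oddball"})

    print(problems["visual"])
    # [('n-back task with visual stimuli', True), ('auditory oddball paradigm', False)]
    ```

    The cost of this transformation is that correlations between labels are ignored, which is one reason exact-match accuracy (all labels correct at once) is a much harsher metric than per-label accuracy.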

  19. A Prototype SSVEP Based Real Time BCI Gaming System

    PubMed Central

    Martišius, Ignas

    2016-01-01

    Although brain-computer interface technology is mainly designed with disabled people in mind, it can also be beneficial to healthy subjects, for example, in gaming or virtual reality systems. In this paper we discuss the typical architecture, paradigms, requirements, and limitations of electroencephalogram-based gaming systems. We have developed a prototype three-class brain-computer interface system, based on the steady state visually evoked potentials paradigm and the Emotiv EPOC headset. An online target shooting game, implemented in the OpenViBE environment, has been used for user feedback. The system utilizes wave atom transform for feature extraction, achieving an average accuracy of 78.2% using linear discriminant analysis classifier, 79.3% using support vector machine classifier with a linear kernel, and 80.5% using a support vector machine classifier with a radial basis function kernel. PMID:27051414

  20. A Prototype SSVEP Based Real Time BCI Gaming System.

    PubMed

    Martišius, Ignas; Damaševičius, Robertas

    2016-01-01

    Although brain-computer interface technology is mainly designed with disabled people in mind, it can also be beneficial to healthy subjects, for example, in gaming or virtual reality systems. In this paper we discuss the typical architecture, paradigms, requirements, and limitations of electroencephalogram-based gaming systems. We have developed a prototype three-class brain-computer interface system, based on the steady state visually evoked potentials paradigm and the Emotiv EPOC headset. An online target shooting game, implemented in the OpenViBE environment, has been used for user feedback. The system utilizes wave atom transform for feature extraction, achieving an average accuracy of 78.2% using linear discriminant analysis classifier, 79.3% using support vector machine classifier with a linear kernel, and 80.5% using a support vector machine classifier with a radial basis function kernel.

  1. A fully programmable 100-spin coherent Ising machine with all-to-all connections

    NASA Astrophysics Data System (ADS)

    McMahon, Peter; Marandi, Alireza; Haribara, Yoshitaka; Hamerly, Ryan; Langrock, Carsten; Tamate, Shuhei; Inagaki, Takahiro; Takesue, Hiroki; Utsunomiya, Shoko; Aihara, Kazuyuki; Byer, Robert; Fejer, Martin; Mabuchi, Hideo; Yamamoto, Yoshihisa

    We present a scalable optical processor with electronic feedback, based on networks of optical parametric oscillators. The design of our machine is inspired by adiabatic quantum computers, although it is not an AQC itself. Our prototype machine is able to find exact solutions of, or sample good approximate solutions to, a variety of hard instances of Ising problems with up to 100 spins and 10,000 spin-spin connections. This research was funded by the Impulsing Paradigm Change through Disruptive Technologies (ImPACT) Program of the Council of Science, Technology and Innovation (Cabinet Office, Government of Japan).
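    The Ising problems such a machine samples have a simple form: choose spins s_i ∈ {−1, +1} to minimise E(s) = −Σ J_ij s_i s_j. For a handful of spins the ground state can be found by exhaustive search, a useful sanity check against any heuristic solver (a toy sketch; the couplings below are invented, not from the paper):

    ```python
    from itertools import product

    def ising_energy(spins, couplings):
        """E(s) = -sum over coupled pairs (i, j) of J_ij * s_i * s_j."""
        return -sum(J * spins[i] * spins[j] for (i, j), J in couplings.items())

    def ground_state(n, couplings):
        """Exhaustive search over all 2^n spin configurations (fine for small n)."""
        return min(product((-1, 1), repeat=n),
                   key=lambda s: ising_energy(s, couplings))

    # A 4-spin ring with one antiferromagnetic bond, hence frustrated:
    couplings = {(0, 1): 1.0, (1, 2): 1.0, (2, 3): 1.0, (3, 0): -1.0}
    s = ground_state(4, couplings)

    # At least one of the four bonds must be unsatisfied, so the minimum is -2, not -4:
    print(ising_energy(s, couplings))   # -2.0
    ```

    The 2^n search is exactly what makes large instances hard and a 100-spin, 10,000-connection machine interesting.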

  2. Bi Sparsity Pursuit: A Paradigm for Robust Subspace Recovery

    DTIC Science & Technology

    2016-09-27

The success of sparse models in computer vision and machine learning is due to the fact that high-dimensional data is distributed in a union of low-dimensional subspaces in many real-world... Keywords: signal recovery, sparse learning, subspace modeling.

  3. Proprioceptive Actuation Design for Dynamic Legged locomotion

    NASA Astrophysics Data System (ADS)

    Kim, Sangbae; Wensing, Patrick; Biomimetic Robotics Lab Team

Designing an actuator system for the highly-dynamic legged locomotion exhibited by animals has been one of the grand challenges in robotics research. Conventional actuators designed for manufacturing applications have difficulty satisfying the challenging requirements of high-speed locomotion, such as the need for high torque density and the ability to manage dynamic physical interactions. It is critical to introduce a new actuator design paradigm and provide guidelines for its incorporation in future mobile robots for research and industry. To this end, we suggest a paradigm called proprioceptive actuation, which enables highly-dynamic operation in legged machines. Proprioceptive actuation uses collocated force control at the joints to effectively control contact interactions at the feet under dynamic conditions. In the realm of legged machines, this paradigm provides a unique combination of high torque density, high-bandwidth force control, and the ability to mitigate impacts through backdrivability. Results show that the proposed design provides an impact mitigation factor comparable to other quadruped designs that use series springs to handle impact. The paradigm is shown to enable the MIT Cheetah to manage the application of contact forces during dynamic bounding, with results given down to contact times of 85 ms and peak forces over 450 N. As a result, the MIT Cheetah achieves high-speed 3D running up to 13 mph and jumping over an 18-inch-high obstacle. The project is sponsored by the DARPA M3 program.

  4. A computer architecture for intelligent machines

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Saridis, G. N.

    1992-01-01

    The theory of intelligent machines proposes a hierarchical organization for the functions of an autonomous robot based on the principle of increasing precision with decreasing intelligence. An analytic formulation of this theory using information-theoretic measures of uncertainty for each level of the intelligent machine has been developed. The authors present a computer architecture that implements the lower two levels of the intelligent machine. The architecture supports an event-driven programming paradigm that is independent of the underlying computer architecture and operating system. Execution-level controllers for motion and vision systems are briefly addressed, as well as the Petri net transducer software used to implement coordination-level functions. A case study illustrates how this computer architecture integrates real-time and higher-level control of manipulator and vision systems.
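The event-driven programming paradigm mentioned in the abstract can be illustrated with a minimal dispatcher. This is a generic sketch of the idea, not the authors' software; all event names and the API are hypothetical:

```python
from collections import deque

# Minimal event-driven control loop in the spirit of the architecture
# described above. Event names ("vision.target", "arm.move") and the API
# are invented for illustration.

class EventLoop:
    def __init__(self):
        self.handlers = {}   # event name -> list of callbacks
        self.queue = deque()

    def on(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)

    def post(self, event, payload=None):
        self.queue.append((event, payload))

    def run(self):
        # Process events until the queue drains; handlers may post more.
        while self.queue:
            event, payload = self.queue.popleft()
            for handler in self.handlers.get(event, []):
                handler(payload)

loop = EventLoop()
log = []
loop.on("vision.target", lambda p: loop.post("arm.move", p))
loop.on("arm.move", lambda p: log.append(("moving to", p)))
loop.post("vision.target", (0.3, 0.1))
loop.run()
print(log)  # [('moving to', (0.3, 0.1))]
```

The key property, as in the abstract, is that controllers react to posted events rather than polling, which decouples the coordination logic from the underlying operating system.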

  5. Machine learning for Big Data analytics in plants.

    PubMed

    Ma, Chuang; Zhang, Hao Helen; Wang, Xiangfeng

    2014-12-01

    Rapid advances in high-throughput genomic technology have enabled biology to enter the era of 'Big Data' (large datasets). The plant science community not only needs to build its own Big-Data-compatible parallel computing and data management infrastructures, but also to seek novel analytical paradigms to extract information from the overwhelming amounts of data. Machine learning offers promising computational and analytical solutions for the integrative analysis of large, heterogeneous and unstructured datasets on the Big-Data scale, and is gradually gaining popularity in biology. This review introduces the basic concepts and procedures of machine-learning applications and envisages how machine learning could interface with Big Data technology to facilitate basic research and biotechnology in the plant sciences. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Machine Learning Approaches for Clinical Psychology and Psychiatry.

    PubMed

    Dwyer, Dominic B; Falkai, Peter; Koutsouleris, Nikolaos

    2018-05-07

    Machine learning approaches for clinical psychology and psychiatry explicitly focus on learning statistical functions from multidimensional data sets to make generalizable predictions about individuals. The goal of this review is to provide an accessible understanding of why this approach is important for future practice given its potential to augment decisions associated with the diagnosis, prognosis, and treatment of people suffering from mental illness using clinical and biological data. To this end, the limitations of current statistical paradigms in mental health research are critiqued, and an introduction is provided to critical machine learning methods used in clinical studies. A selective literature review is then presented aiming to reinforce the usefulness of machine learning methods and provide evidence of their potential. In the context of promising initial results, the current limitations of machine learning approaches are addressed, and considerations for future clinical translation are outlined.

  7. Towards human behavior recognition based on spatio temporal features and support vector machines

    NASA Astrophysics Data System (ADS)

    Ghabri, Sawsen; Ouarda, Wael; Alimi, Adel M.

    2017-03-01

    Security and surveillance are vital issues in today's world. The recent acts of terrorism have highlighted the urgent need for efficient surveillance. There is indeed a need for an automated video surveillance system which can detect the identity and activity of a person. In this article, we propose a new paradigm to recognize an aggressive human behavior such as a boxing action. Our proposed system for human activity detection uses a fusion between Spatio-Temporal Interest Point (STIP) and Histogram of Oriented Gradient (HoG) features; the resulting novel feature is called Spatio-Temporal Histogram of Oriented Gradient (STHOG). To evaluate the robustness of our proposed paradigm, with a local application of the HoG technique at STIP points, we conducted experiments on the KTH human action dataset using multi-class Support Vector Machine classification. The proposed scheme outperforms basic descriptors such as HoG and STIP, achieving a classification accuracy of 82.26%.
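The fusion idea, computing a HoG-style descriptor locally around each interest point and concatenating the histograms, can be sketched as follows. The patch radius, bin count, and toy image are illustrative assumptions, not the STHOG authors' parameters:

```python
import math

# Sketch of local HoG at interest points: an orientation histogram of
# gradient magnitudes in a patch around each point, concatenated into one
# descriptor. Patch size, bin count and the toy frame are assumptions.

def local_hog(img, cx, cy, radius=2, bins=8):
    hist = [0.0] * bins
    for y in range(cy - radius, cy + radius + 1):
        for x in range(cx - radius, cx + radius + 1):
            gx = img[y][x + 1] - img[y][x - 1]   # central differences
            gy = img[y + 1][x] - img[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.atan2(gy, gx) % (2 * math.pi)
            hist[int(ang / (2 * math.pi) * bins) % bins] += mag
    total = sum(hist) or 1.0
    return [h / total for h in hist]             # L1-normalised histogram

def sthog_descriptor(img, interest_points, bins=8):
    desc = []
    for cx, cy in interest_points:
        desc.extend(local_hog(img, cx, cy, bins=bins))
    return desc

# Toy 8x8 frame with a vertical edge; two interest points sit on it.
frame = [[0] * 8 for _ in range(8)]
for row in frame:
    for x in range(4, 8):
        row[x] = 255

desc = sthog_descriptor(frame, [(4, 3), (4, 4)])
print(len(desc))  # 16 = 2 points x 8 bins
```

In the actual STHOG pipeline the interest points would come from a spatio-temporal detector on video, and the concatenated descriptors would feed the multi-class SVM.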

  8. Elastic Multi-scale Mechanisms: Computation and Biological Evolution.

    PubMed

    Diaz Ochoa, Juan G

    2018-01-01

    Explanations based on low-level interacting elements are valuable and powerful since they contribute to identifying the key mechanisms of biological functions. However, many dynamic systems based on low-level interacting elements with unambiguous, finite, and complete information of initial states generate future states that cannot be predicted, implying an increase of complexity and open-ended evolution. Such systems are like Turing machines that overlap with dynamical systems that cannot halt. We argue that organisms find halting conditions by distorting these mechanisms, creating conditions for a constant creativity that drives evolution. We introduce a modulus of elasticity to measure the changes in these mechanisms in response to changes in the computed environment. We test this concept in a population of predator and prey cells with chemotactic mechanisms and demonstrate how the selection of a given mechanism depends on the entire population. We finally explore this concept in different frameworks and postulate that the identification of predictive mechanisms is only successful with a small elasticity modulus.

  9. Computing by physical interaction in neurons.

    PubMed

    Aur, Dorian; Jog, Mandar; Poznanski, Roman R

    2011-12-01

    The electrodynamics of action potentials represents the fundamental level where information is integrated and processed in neurons. The Hodgkin-Huxley model cannot explain the non-stereotyped spatial charge density dynamics that occur during action potential propagation. Revealed in experiments as spike directivity, the non-uniform charge density dynamics within neurons carry meaningful information and suggest that fragments of information regarding our memories are endogenously stored in structural patterns at a molecular level and are revealed only during spiking activity. The main conceptual idea is that under the influence of electric fields, efficient computation by interaction occurs between charge densities embedded within molecular structures and the transient developed flow of electrical charges. This process of computation underlying electrical interactions and molecular mechanisms at the subcellular level is dissimilar from spiking neuron models that are completely devoid of physical interactions. Computation by interaction describes a more powerful continuous model of computation than the one that consists of discrete steps as represented in Turing machines.

  10. Modeling Reality - How Computers Mirror Life

    NASA Astrophysics Data System (ADS)

    Bialynicki-Birula, Iwo; Bialynicka-Birula, Iwona

    2005-01-01

    The book Modeling Reality covers a wide range of fascinating subjects, accessible to anyone who wants to learn about the use of computer modeling to solve a diverse range of problems, but who does not possess a specialized training in mathematics or computer science. The material presented is pitched at the level of high-school graduates, even though it covers some advanced topics (cellular automata, Shannon's measure of information, deterministic chaos, fractals, game theory, neural networks, genetic algorithms, and Turing machines). These advanced topics are explained in terms of well known simple concepts: Cellular automata - Game of Life, Shannon's formula - Game of twenty questions, Game theory - Television quiz, etc. The book is unique in explaining in a straightforward, yet complete, fashion many important ideas, related to various models of reality and their applications. Twenty-five programs, written especially for this book, are provided on an accompanying CD. They greatly enhance its pedagogical value and make learning of even the more complex topics a pleasure.
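The cellular-automaton topic the book introduces via the Game of Life fits in a few lines of code. A minimal sketch of one update step of Conway's rules (on a small toroidal grid; this is an illustration, not one of the book's CD programs):

```python
# One step of Conway's Game of Life on a toroidal grid: a live cell
# survives with 2 or 3 live neighbours; a dead cell is born with exactly 3.

def life_step(grid):
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            n = sum(grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            new[r][c] = 1 if n == 3 or (grid[r][c] and n == 2) else 0
    return new

# A "blinker" oscillates between vertical and horizontal with period 2.
blinker = [[0, 0, 0, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 0, 0, 0]]
once = life_step(blinker)
twice = life_step(once)
print(twice == blinker)  # True: period-2 oscillation
```

Despite rules this simple, the Game of Life is Turing-complete, which is the bridge the book draws between cellular automata and Turing machines.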

  11. Science and Human Experience

    NASA Astrophysics Data System (ADS)

    Cooper, Leon N.

    2015-01-01

    Part I. Science and Society: 1. Science and human experience; 2. Does science undermine our values?; 3. Can science serve mankind?; 4. Modern science and contemporary discomfort: metaphor and reality; 5. Faith and science; 6. Art and science; 7. Fraud in science; 8. Why study science? The keys to the cathedral; 9. Is evolution a theory? A modest proposal; 10. The silence of the second; 11. Introduction to Copenhagen; 12. The unpaid debt; Part II. Thought and Consciousness: 13. Source and limits of human intellect; 14. Neural networks; 15. Thought and mental experience: the Turing test; 16. Mind as machine: will we rubbish human experience?; 17. Memory and memories: a physicist's approach to the brain; 18. On the problem of consciousness; Part III. On the Nature and Limits of Science: 19. What is a good theory?; 20. Shall we deconstruct science?; 21. Visible and invisible in physical theory; 22. Experience and order; 23. The language of physics; 24. The structure of space; 25. Superconductivity and other insoluble problems; 26. From gravity to light and consciousness: does science have limits?

  12. Science and Human Experience

    NASA Astrophysics Data System (ADS)

    Cooper, Leon N.

    2014-12-01

    Part I. Science and Society: 1. Science and human experience; 2. Does science undermine our values?; 3. Can science serve mankind?; 4. Modern science and contemporary discomfort: metaphor and reality; 5. Faith and science; 6. Art and science; 7. Fraud in science; 8. Why study science? The keys to the cathedral; 9. Is evolution a theory? A modest proposal; 10. The silence of the second; 11. Introduction to Copenhagen; 12. The unpaid debt; Part II. Thought and Consciousness: 13. Source and limits of human intellect; 14. Neural networks; 15. Thought and mental experience: the Turing test; 16. Mind as machine: will we rubbish human experience?; 17. Memory and memories: a physicist's approach to the brain; 18. On the problem of consciousness; Part III. On the Nature and Limits of Science: 19. What is a good theory?; 20. Shall we deconstruct science?; 21. Visible and invisible in physical theory; 22. Experience and order; 23. The language of physics; 24. The structure of space; 25. Superconductivity and other insoluble problems; 26. From gravity to light and consciousness: does science have limits?

  13. Nature as a network of morphological infocomputational processes for cognitive agents

    NASA Astrophysics Data System (ADS)

    Dodig-Crnkovic, Gordana

    2017-01-01

    This paper presents a view of nature as a network of infocomputational agents organized in a dynamical hierarchy of levels. It provides a framework for unification of currently disparate understandings of natural, formal, technical, behavioral and social phenomena based on information as a structure, differences in one system that cause the differences in another system, and computation as its dynamics, i.e. physical process of morphological change in the informational structure. We address some of the frequent misunderstandings regarding the natural/morphological computational models and their relationships to physical systems, especially cognitive systems such as living beings. Natural morphological infocomputation as a conceptual framework necessitates generalization of models of computation beyond the traditional Turing machine model presenting symbol manipulation, and requires agent-based concurrent resource-sensitive models of computation in order to be able to cover the whole range of phenomena from physics to cognition. The central role of agency, particularly material vs. cognitive agency is highlighted.

  14. Algorithmic-Reducibility = Renormalization-Group Fixed-Points; ``Noise''-Induced Phase-Transitions (NITs) to Accelerate Algorithmics (``NIT-Picking'') Replacing CRUTCHES!!!: Gauss Modular/Clock-Arithmetic Congruences = Signal X Noise PRODUCTS..

    NASA Astrophysics Data System (ADS)

    Siegel, J.; Siegel, Edward Carl-Ludwig

    2011-03-01

    Cook-Levin computational-"complexity"(C-C) algorithmic-equivalence reduction-theorem reducibility equivalence to renormalization-(semi)-group phase-transitions critical-phenomena statistical-physics universality-classes fixed-points, is exploited with Gauss modular/clock-arithmetic/model congruences = signal X noise PRODUCT reinterpretation. Siegel-Baez FUZZYICS=CATEGORYICS(SON of ``TRIZ''): Category-Semantics(C-S) tabular list-format truth-table matrix analytics predicts and implements "noise"-induced phase-transitions (NITs) to accelerate versus to decelerate Harel [Algorithmics(1987)]-Sipser[Intro. Theory Computation(1997) algorithmic C-C: "NIT-picking" to optimize optimization-problems optimally(OOPO). Versus iso-"noise" power-spectrum quantitative-only amplitude/magnitude-only variation stochastic-resonance, this "NIT-picking" is "noise" power-spectrum QUALitative-type variation via quantitative critical-exponents variation. Computer-"science" algorithmic C-C models: Turing-machine, finite-state-models/automata, are identified as early-days once-workable but NOW ONLY LIMITING CRUTCHES IMPEDING latter-days new-insights!!!

  15. Self-organization in the limb: a Turing mechanism for digit development.

    PubMed

    Cooper, Kimberly L

    2015-06-01

    The statistician George E. P. Box stated, 'Essentially all models are wrong, but some are useful.' (Box GEP, Draper NR: Empirical Model-Building and Response Surfaces. Wiley; 1987). Modeling biological processes is challenging for many of the reasons classically trained developmental biologists often resist the idea that black and white equations can explain the grayscale subtleties of living things. Although a simplified mathematical model of development will undoubtedly fall short of precision, a good model is exceedingly useful if it raises at least as many testable questions as it answers. Self-organizing Turing models that simulate the pattern of digits in the hand replicate events that have not yet been explained by classical approaches. The union of theory and experimentation has recently identified and validated the minimal components of a Turing network for digit pattern and triggered a cascade of questions that will undoubtedly be well-served by the continued merging of disciplines. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. On the Computational Power of Spiking Neural P Systems with Self-Organization.

    PubMed

    Wang, Xun; Song, Tao; Gong, Faming; Zheng, Pan

    2016-06-10

    Neural-like computing models are versatile computing mechanisms in the field of artificial intelligence. Spiking neural P systems (SN P systems for short) are one of the recently developed spiking neural network models inspired by the way neurons communicate. The communications among neurons are essentially achieved by spikes, i.e. short electrical pulses. In terms of motivation, SN P systems fall into the third generation of neural network models. In this study, a novel variant of SN P systems, namely SN P systems with self-organization, is introduced, and the computational power of the system is investigated and evaluated. It is proved that SN P systems with self-organization are capable of computing and accepting the family of sets of Turing-computable natural numbers. Moreover, with 87 neurons the system can compute any Turing-computable recursive function, thus achieving Turing universality. These results demonstrate promising initiatives to solve an open problem raised by Gh. Păun.

  17. On the Computational Power of Spiking Neural P Systems with Self-Organization

    PubMed Central

    Wang, Xun; Song, Tao; Gong, Faming; Zheng, Pan

    2016-01-01

    Neural-like computing models are versatile computing mechanisms in the field of artificial intelligence. Spiking neural P systems (SN P systems for short) are one of the recently developed spiking neural network models inspired by the way neurons communicate. The communications among neurons are essentially achieved by spikes, i.e. short electrical pulses. In terms of motivation, SN P systems fall into the third generation of neural network models. In this study, a novel variant of SN P systems, namely SN P systems with self-organization, is introduced, and the computational power of the system is investigated and evaluated. It is proved that SN P systems with self-organization are capable of computing and accepting the family of sets of Turing-computable natural numbers. Moreover, with 87 neurons the system can compute any Turing-computable recursive function, thus achieving Turing universality. These results demonstrate promising initiatives to solve an open problem raised by Gh. Păun. PMID:27283843

  18. Spatiotemporal pattern formation in a prey-predator model under environmental driving forces

    NASA Astrophysics Data System (ADS)

    Sirohi, Anuj Kumar; Banerjee, Malay; Chakraborti, Anirban

    2015-09-01

    Many existing studies on pattern formation in the reaction-diffusion systems rely on deterministic models. However, environmental noise is often a major factor which leads to significant changes in the spatiotemporal dynamics. In this paper, we focus on the spatiotemporal patterns produced by the predator-prey model with ratio-dependent functional response and density dependent death rate of predator. We get the reaction-diffusion equations incorporating the self-diffusion terms, corresponding to random movement of the individuals within two dimensional habitats, into the growth equations for the prey and predator population. In order to have the noise added model, small amplitude heterogeneous perturbations to the linear intrinsic growth rates are introduced using uncorrelated Gaussian white noise terms. For the noise added system, we then observe spatial patterns for the parameter values lying outside the Turing instability region. With thorough numerical simulations we characterize the patterns corresponding to Turing and Turing-Hopf domain and study their dependence on different system parameters like noise-intensity, etc.
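The "Turing instability region" the abstract refers to is delimited by a standard linear-stability test: the uniform steady state must be stable without diffusion but some spatial mode must grow once unequal diffusion is switched on. A generic sketch of that test, using an illustrative activator-inhibitor Jacobian rather than the paper's ratio-dependent predator-prey kinetics:

```python
# Generic diffusion-driven (Turing) instability test for a two-species
# reaction-diffusion system with kinetics Jacobian J = [[fu, fv], [gu, gv]]
# and diffusivities Du, Dv. The example Jacobian is an assumption for
# illustration, not the paper's model.

def dispersion_det(J, Du, Dv, k2):
    """det of (J - diag(Du, Dv) * k^2); negative at some k^2 means that
    spatial mode grows even though the uniform state is stable."""
    (fu, fv), (gu, gv) = J
    return (fu - Du * k2) * (gv - Dv * k2) - fv * gu

def turing_unstable(J, Du, Dv):
    (fu, fv), (gu, gv) = J
    tr, det = fu + gv, fu * gv - fv * gu
    if not (tr < 0 and det > 0):        # uniform state must be stable
        return False
    s = Dv * fu + Du * gv               # differential-diffusion term
    # Instability iff min over k^2 of dispersion_det < 0,
    # attained at k2* = s / (2 Du Dv).
    return s > 0 and s * s > 4 * Du * Dv * det

J = [[1.0, -1.0], [3.0, -2.0]]          # activator u, inhibitor v (assumed)
print(turing_unstable(J, 1.0, 10.0))    # fast-diffusing inhibitor
print(turing_unstable(J, 1.0, 1.0))     # equal diffusion: no instability
```

In the paper's noisy setting, patterns can appear even for parameters where `turing_unstable` is False, which is exactly the noise-induced effect the authors study.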

  19. Helical Turing patterns in the Lengyel-Epstein model in thin cylindrical layers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bánsági, T.; Taylor, A. F., E-mail: A.F.Taylor@sheffield.ac.uk

    2015-06-15

    The formation of Turing patterns was investigated in thin cylindrical layers using the Lengyel-Epstein model of the chlorine dioxide-iodine-malonic acid reaction. The influence of the width of the layer W and the diameter D of the inner cylinder on the pattern with intrinsic wavelength l was determined in simulations with initial random noise perturbations to the uniform state for W < l/2 and D ∼ l or lower. We show that the geometric constraints of the reaction domain may result in the formation of helical Turing patterns with parameters that give stripes (b = 0.2) or spots (b = 0.37) in two dimensions. For b = 0.2, the helices were composed of lamellae and defects were likely as the diameter of the cylinder increased. With b = 0.37, the helices consisted of semi-cylinders and the orientation of stripes on the outer surface (and hence winding number) increased with increasing diameter until a new stripe appeared.
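For reference, the commonly used dimensionless Lengyel-Epstein kinetics have a uniform steady state at u* = a/5, v* = 1 + (a/5)^2, which can be checked by integrating the well-mixed equations. The parameter values below are illustrative choices, not those of the paper:

```python
# Well-mixed (no diffusion) Lengyel-Epstein kinetics in the standard
# dimensionless form; we verify numerically that trajectories near the
# uniform steady state u* = a/5, v* = 1 + (a/5)^2 relax onto it.
# Parameter values a, b, sigma are illustrative assumptions.

def le_rhs(u, v, a, b, sigma):
    f = a - u - 4.0 * u * v / (1.0 + u * u)
    g = sigma * b * (u - u * v / (1.0 + u * u))
    return f, g

a, b, sigma = 10.0, 4.0, 1.0     # steady state: u* = 2, v* = 5
u, v = 2.5, 5.5                  # start near the fixed point
dt = 0.005
for _ in range(200_000):         # forward-Euler integration to t = 1000
    f, g = le_rhs(u, v, a, b, sigma)
    u, v = u + dt * f, v + dt * g

print(u, v)
```

In the paper's simulations it is diffusion added on top of these kinetics (with the CIMA-specific complexation rescaling) that destabilizes this uniform state into stripes or spots depending on b.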

  20. On the Computational Power of Spiking Neural P Systems with Self-Organization

    NASA Astrophysics Data System (ADS)

    Wang, Xun; Song, Tao; Gong, Faming; Zheng, Pan

    2016-06-01

    Neural-like computing models are versatile computing mechanisms in the field of artificial intelligence. Spiking neural P systems (SN P systems for short) are one of the recently developed spiking neural network models inspired by the way neurons communicate. The communications among neurons are essentially achieved by spikes, i.e. short electrical pulses. In terms of motivation, SN P systems fall into the third generation of neural network models. In this study, a novel variant of SN P systems, namely SN P systems with self-organization, is introduced, and the computational power of the system is investigated and evaluated. It is proved that SN P systems with self-organization are capable of computing and accepting the family of sets of Turing-computable natural numbers. Moreover, with 87 neurons the system can compute any Turing-computable recursive function, thus achieving Turing universality. These results demonstrate promising initiatives to solve an open problem raised by Gh. Păun.

  1. Predicate calculus for an architecture of multiple neural networks

    NASA Astrophysics Data System (ADS)

    Consoli, Robert H.

    1990-08-01

    Future projects with neural networks will require multiple individual network components. Current efforts along these lines are ad hoc. This paper relates the neural network to a classical device and derives a multi-part architecture from that model. Further, it provides a Predicate Calculus variant for describing the location and nature of the trainings and suggests Resolution Refutation as a method for determining the performance of the system as well as the location of needed trainings for specific proofs. 2. THE NEURAL NETWORK AND A CLASSICAL DEVICE. Recently investigators have been making reports about architectures of multiple neural networks [1-4]. These efforts are appearing at an early stage in neural network investigations; they are characterized by architectures suggested directly by the problem space. Touretzky and Hinton suggest an architecture for processing logical statements [1]; the design of this architecture arises from the syntax of a restricted class of logical expressions and exhibits syntactic limitations. In similar fashion, a multiple neural network arises out of a control problem [2], from the sequence learning problem [3], and from the domain of machine learning [4]. But a general theory of multiple neural devices is missing. More general attempts to relate single or multiple neural networks to classical computing devices are not common, although an attempt is made to relate single neural devices to a Turing machine, and Sun et al. develop a multiple neural architecture that performs pattern classification.

  2. Directing three-dimensional multicellular morphogenesis by self-organization of vascular mesenchymal cells in hyaluronic acid hydrogels.

    PubMed

    Zhu, Xiaolu; Gojgini, Shiva; Chen, Ting-Hsuan; Fei, Peng; Dong, Siyan; Ho, Chih-Ming; Segura, Tatiana

    2017-01-01

    Physical scaffolds are useful for supporting cells to form three-dimensional (3D) tissue. However, it is non-trivial to develop a scheme that can robustly guide cells to self-organize into a tissue with the desired 3D spatial structures. To achieve this goal, the rational regulation of cellular self-organization in 3D extracellular matrix (ECM) such as hydrogel is needed. In this study, we integrated the Turing reaction-diffusion mechanism with the self-organization process of cells and produced multicellular 3D structures with the desired configurations in a rational manner. By optimizing the components of the hydrogel and applying exogenous morphogens, a variety of multicellular 3D architectures composed of multipotent vascular mesenchymal cells (VMCs) were formed inside hyaluronic acid (HA) hydrogels. These 3D architectures could mimic the features of trabecular bones and multicellular nodules. Based on the Turing reaction-diffusion instability of morphogens and cells, a theoretical model was proposed to predict the variations observed in 3D multicellular structures in response to exogenous factors. It enabled the feasibility to obtain diverse types of 3D multicellular structures by addition of Noggin and/or BMP2. The morphological consistency between the simulation prediction and experimental results probably revealed a Turing-type mechanism underlying the 3D self-organization of VMCs in HA hydrogels. Our study has provided new ways to create a variety of self-organized 3D multicellular architectures for regenerating biomaterial and tissues in a Turing mechanism-based approach.

  3. Assisted closed-loop optimization of SSVEP-BCI efficiency

    PubMed Central

    Fernandez-Vargas, Jacobo; Pfaff, Hanns U.; Rodríguez, Francisco B.; Varona, Pablo

    2012-01-01

    We designed a novel assisted closed-loop optimization protocol to improve the efficiency of brain-computer interfaces (BCI) based on steady state visually evoked potentials (SSVEP). In traditional paradigms, the control over the BCI-performance completely depends on the subjects' ability to learn from the given feedback cues. By contrast, in the proposed protocol both the subject and the machine share information and control over the BCI goal. Generally, the innovative assistance consists in the delivery of online information together with the online adaptation of BCI stimuli properties. In our case, this adaptive optimization process is realized by (1) a closed-loop search for the best set of SSVEP flicker frequencies and (2) feedback of actual SSVEP magnitudes to both the subject and the machine. These closed-loop interactions between subject and machine are evaluated in real-time by continuous measurement of their efficiencies, which are used as online criteria to adapt the BCI control parameters. The proposed protocol aims to compensate for variability in possibly unknown subjects' state and trait dimensions. In a study with N = 18 subjects, we found significant evidence that our protocol outperformed classic SSVEP-BCI control paradigms. Evidence is presented that it takes indeed into account interindividual variabilities: e.g., under the new protocol, baseline resting state EEG measures predict subjects' BCI performances. This paper illustrates the promising potential of assisted closed-loop protocols in BCI systems. Probably their applicability might be expanded to innovative uses, e.g., as possible new diagnostic/therapeutic tools for clinical contexts and as new paradigms for basic research. PMID:23443214

  4. Assisted closed-loop optimization of SSVEP-BCI efficiency.

    PubMed

    Fernandez-Vargas, Jacobo; Pfaff, Hanns U; Rodríguez, Francisco B; Varona, Pablo

    2013-01-01

    We designed a novel assisted closed-loop optimization protocol to improve the efficiency of brain-computer interfaces (BCI) based on steady state visually evoked potentials (SSVEP). In traditional paradigms, the control over the BCI-performance completely depends on the subjects' ability to learn from the given feedback cues. By contrast, in the proposed protocol both the subject and the machine share information and control over the BCI goal. Generally, the innovative assistance consists in the delivery of online information together with the online adaptation of BCI stimuli properties. In our case, this adaptive optimization process is realized by (1) a closed-loop search for the best set of SSVEP flicker frequencies and (2) feedback of actual SSVEP magnitudes to both the subject and the machine. These closed-loop interactions between subject and machine are evaluated in real-time by continuous measurement of their efficiencies, which are used as online criteria to adapt the BCI control parameters. The proposed protocol aims to compensate for variability in possibly unknown subjects' state and trait dimensions. In a study with N = 18 subjects, we found significant evidence that our protocol outperformed classic SSVEP-BCI control paradigms. Evidence is presented that it takes indeed into account interindividual variabilities: e.g., under the new protocol, baseline resting state EEG measures predict subjects' BCI performances. This paper illustrates the promising potential of assisted closed-loop protocols in BCI systems. Probably their applicability might be expanded to innovative uses, e.g., as possible new diagnostic/therapeutic tools for clinical contexts and as new paradigms for basic research.
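The closed-loop search in step (1), probing candidate flicker frequencies, measuring the evoked magnitude, and keeping the best-responding stimuli, can be illustrated with a toy simulation. The subject's response curve and noise model below are invented for illustration; in the real protocol the magnitudes come from online EEG analysis:

```python
import math
import random

# Toy closed-loop SSVEP frequency search: average a (synthetic, noisy)
# response magnitude over repeated trials per candidate flicker frequency,
# then keep the best-responding pair. Response peak, width and noise level
# are hypothetical assumptions.

random.seed(0)

def ssvep_magnitude(freq, best=12.0, width=4.0):
    """Hypothetical subject response, peaked at `best` Hz, plus noise."""
    signal = math.exp(-((freq - best) / width) ** 2)
    return signal + random.gauss(0.0, 0.05)

candidates = [6.0, 8.0, 10.0, 12.0, 15.0, 20.0]
scores = {f: sum(ssvep_magnitude(f) for _ in range(20)) / 20
          for f in candidates}                 # average over 20 trials
ranked = sorted(candidates, key=scores.get, reverse=True)
print("chosen flicker frequencies:", ranked[:2])
```

Averaging over trials is what makes the selection robust to the single-trial noise; the real protocol additionally feeds the measured magnitudes back to the subject (step 2).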

  5. Predictors of return rate discrimination in slot machine play.

    PubMed

    Coates, Ewan; Blaszczynski, Alex

    2014-09-01

    The purpose of this study was to investigate the extent to which accurate estimates of payback percentages and volatility combined with prior learning, enabled players to successfully discriminate between multi-line/multi-credit slot machines that provided differing rates of reinforcement. The aim was to determine if the capacity to discriminate structural characteristics of gaming machines influenced player choices in selecting 'favourite' slot machines. Slot machine gambling history, gambling beliefs and knowledge, impulsivity, illusions of control, and problem solving style were assessed in a sample of 48 first year undergraduate psychology students. Participants were subsequently exposed to a choice paradigm where they could freely select to play either of two concurrently presented PC-simulated slot machines programmed to randomly differ in expected player return rates (payback percentage) and win frequency (volatility). Results suggest that prior learning and cognitions (particularly gambler's fallacy) but not payback, were major contributors to the ability of a player to discriminate volatility between slot machines. Participants displayed a general tendency to discriminate payback, but counter-intuitively placed more bets on the slot machine with lower payback percentage rates.
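The two-machine choice paradigm can be sketched by simulating slot machines with different programmed payback percentages and win frequencies (volatility). All numbers below are illustrative assumptions, not the study's actual machine parameters:

```python
import random

# Sketch of the choice paradigm: two virtual slot machines differing in
# win frequency (volatility) and programmed payback percentage. Parameter
# values are illustrative, not the study's.

random.seed(1)

def spin(win_prob, payout_mult):
    """One 1-credit bet: credits paid back on this spin."""
    return payout_mult if random.random() < win_prob else 0.0

def empirical_payback(win_prob, payout_mult, n_spins=200_000):
    paid = sum(spin(win_prob, payout_mult) for _ in range(n_spins))
    return paid / n_spins            # average payback per credit bet

# Machine A: frequent small wins (low volatility), 90% programmed payback.
# Machine B: rare large wins (high volatility), 85% programmed payback.
pb_a = empirical_payback(win_prob=0.30, payout_mult=3.0)
pb_b = empirical_payback(win_prob=0.05, payout_mult=17.0)
print(round(pb_a, 3), round(pb_b, 3))
```

The programmed payback is the product win_prob x payout_mult (0.90 vs. 0.85 here); note how many spins it takes before the empirical averages separate cleanly, which is one reason payback percentage is hard for players to discriminate while win frequency is not.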

  6. Validation and Refinement of the DELFIC Cloud Rise Module

    DTIC Science & Technology

    1977-01-15

    [Garbled DTIC record; only a table-of-contents excerpt and abstract fragments survive:] 2.4.1 Explosion Energy Fraction in the Cloud, f; 2.4.2 Temperature of Condensed-Phase Matter; 2.4.3 Altitude; 2.4.4 Rise Velocity; 2.4.5 Mass and Volume. 2.4.1 Explosion Energy Fraction in the Cloud, f: The original NRDL water-surface burst model used an energy fraction of 33%. For the first DELFIC... (fraction of explosion energy) is used to heat soil and air to their respective initial temperatures. The soil mass and both initial temperatures are...

  7. Machine learning and data science in soft materials engineering

    NASA Astrophysics Data System (ADS)

    Ferguson, Andrew L.

    2018-01-01

    In many branches of materials science it is now routine to generate data sets of such large size and dimensionality that conventional methods of analysis fail. Paradigms and tools from data science and machine learning can provide scalable approaches to identify and extract trends and patterns within voluminous data sets, perform guided traversals of high-dimensional phase spaces, and furnish data-driven strategies for inverse materials design. This topical review provides an accessible introduction to machine learning tools in the context of soft and biological materials by ‘de-jargonizing’ data science terminology, presenting a taxonomy of machine learning techniques, and surveying the mathematical underpinnings and software implementations of popular tools, including principal component analysis, independent component analysis, diffusion maps, support vector machines, and relative entropy. We present illustrative examples of machine learning applications in soft matter, including inverse design of self-assembling materials, nonlinear learning of protein folding landscapes, high-throughput antimicrobial peptide design, and data-driven materials design engines. We close with an outlook on the challenges and opportunities for the field.

  8. Machine learning and data science in soft materials engineering.

    PubMed

    Ferguson, Andrew L

    2018-01-31

    In many branches of materials science it is now routine to generate data sets of such large size and dimensionality that conventional methods of analysis fail. Paradigms and tools from data science and machine learning can provide scalable approaches to identify and extract trends and patterns within voluminous data sets, perform guided traversals of high-dimensional phase spaces, and furnish data-driven strategies for inverse materials design. This topical review provides an accessible introduction to machine learning tools in the context of soft and biological materials by 'de-jargonizing' data science terminology, presenting a taxonomy of machine learning techniques, and surveying the mathematical underpinnings and software implementations of popular tools, including principal component analysis, independent component analysis, diffusion maps, support vector machines, and relative entropy. We present illustrative examples of machine learning applications in soft matter, including inverse design of self-assembling materials, nonlinear learning of protein folding landscapes, high-throughput antimicrobial peptide design, and data-driven materials design engines. We close with an outlook on the challenges and opportunities for the field.
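Of the tools surveyed above, principal component analysis is the simplest to de-jargonize in code: center the data, form the covariance matrix, and find its leading eigenvector. A pure-Python sketch on a tiny 2-D data set (in practice one would use a linear-algebra library):

```python
# PCA sketch: leading principal component of 2-D data via power iteration
# on the covariance matrix. The toy data set is an illustrative assumption.

def leading_pc(data, iters=200):
    n = len(data)
    means = [sum(col) / n for col in zip(*data)]
    X = [[x - m for x, m in zip(row, means)] for row in data]
    # 2x2 sample covariance matrix
    C = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
          for b in range(2)] for a in range(2)]
    v = [1.0, 0.0]
    for _ in range(iters):                  # power iteration
        w = [C[0][0] * v[0] + C[0][1] * v[1],
             C[1][0] * v[0] + C[1][1] * v[1]]
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = [w[0] / norm, w[1] / norm]
    return v

# Strongly correlated data: the first PC should point near (1,1)/sqrt(2).
data = [(x, x + 0.1 * ((-1) ** i)) for i, x in enumerate(range(10))]
pc = leading_pc(data)
print(pc)
```

Projecting the data onto the first few such directions is the dimensionality reduction step that the review's materials-design examples rely on.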

  9. PCA-based polling strategy in machine learning framework for coronary artery disease risk assessment in intravascular ultrasound: A link between carotid and coronary grayscale plaque morphology.

    PubMed

    Araki, Tadashi; Ikeda, Nobutaka; Shukla, Devarshi; Jain, Pankaj K; Londhe, Narendra D; Shrivastava, Vimal K; Banchhor, Sumit K; Saba, Luca; Nicolaides, Andrew; Shafique, Shoaib; Laird, John R; Suri, Jasjit S

    2016-05-01

    Percutaneous coronary interventional procedures need advance planning prior to stenting or an endarterectomy. Cardiologists use intravascular ultrasound (IVUS) for screening, risk assessment and stratification of coronary artery disease (CAD). We hypothesize that plaque components are vulnerable to rupture due to plaque progression. Currently, there are no standard grayscale IVUS tools for risk assessment of plaque rupture. This paper presents a novel strategy for risk stratification based on plaque morphology embedded with principal component analysis (PCA) for plaque feature dimensionality reduction and dominant feature selection. The risk assessment utilizes 56 grayscale coronary features in a machine learning framework while linking information from carotid and coronary plaque burdens due to their common genetic makeup. The system consists of a machine learning paradigm which uses a support vector machine (SVM) combined with PCA for optimal and dominant coronary artery morphological feature extraction. The proven carotid intima-media thickness (cIMT) biomarker is adopted as the gold standard during the training phase of the machine learning system. For performance evaluation, a K-fold cross-validation protocol is adopted with 20 trials per fold. For choosing the dominant features out of the 56 grayscale features, a PCA polling strategy is adopted in which the original values of the features are unaltered. Different protocols are designed for establishing the stability and reliability criteria of the coronary risk assessment system (cRAS). Using the PCA-based machine learning paradigm and cross-validation protocol, a classification accuracy of 98.43% (AUC 0.98) with K=10 folds using an SVM radial basis function (RBF) kernel was achieved. A reliability index of 97.32% and a machine learning stability criterion of 5% were met for the cRAS. This is the first computer-aided diagnosis (CADx) system of its kind able to demonstrate coronary risk assessment and stratification while demonstrating a successful design of the machine learning system based on our assumptions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
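The polling idea, ranking the original features by their PCA loadings rather than replacing them with projections, can be sketched generically. This is not the paper's code; the variance-weighted vote and the toy data below are illustrative assumptions, with NumPy assumed available.

```python
import numpy as np

def dominant_features_by_pca(X, n_keep=5, n_components=3):
    """Rank the ORIGINAL features by their total loading on the top
    principal components and return the indices of the strongest ones.
    Unlike projecting onto the components, this keeps the feature
    values unaltered -- only a subset of columns is selected."""
    Xc = X - X.mean(axis=0)
    _, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    # Weight each component's loadings by the variance it explains,
    # then "poll" across components for every original feature.
    weights = (S[:n_components] ** 2)[:, None]
    score = np.abs(Vt[:n_components]) * weights
    votes = score.sum(axis=0)
    return np.argsort(votes)[::-1][:n_keep]

# Toy data: features 0 and 1 carry large, correlated variance; the
# remaining eight features are low-variance noise, so the poll should
# rank features 0 and 1 first.
rng = np.random.default_rng(1)
signal = rng.normal(scale=5.0, size=(300, 1))
noise = rng.normal(scale=0.2, size=(300, 8))
X = np.hstack([signal, signal + rng.normal(scale=0.2, size=(300, 1)), noise])

top = dominant_features_by_pca(X, n_keep=2)
```

The selected columns can then be fed unchanged into an SVM, which is the spirit of the paper's PCA-plus-SVM pipeline.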

  10. An online hybrid brain-computer interface combining multiple physiological signals for webpage browse.

    PubMed

    Long Chen; Zhongpeng Wang; Feng He; Jiajia Yang; Hongzhi Qi; Peng Zhou; Baikun Wan; Dong Ming

    2015-08-01

    A hybrid brain-computer interface (hBCI) can provide a higher information transfer rate than classical BCIs because it combines more than one brain-computer or human-machine interaction paradigm, such as the P300 and SSVEP paradigms. We first constructed independent subsystems for three different paradigms and tested each of them in online experiments. We then combined these paradigms into a serial hybrid BCI system that supports typing letters, moving and clicking a cursor, and switching among these functions for the purpose of browsing webpages. Five subjects were involved in this study, and all of them successfully performed these functions in the online tests. After training, the subjects achieved accuracies above 90%, which meets the requirement for operating the system efficiently. The results demonstrate an efficient and robust system, and provide an approach for clinical application.

  11. Communication Studies of DMP and SMP Machines

    NASA Technical Reports Server (NTRS)

    Sohn, Andrew; Biswas, Rupak; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    Understanding the interplay between machines and problems is key to obtaining high performance on parallel machines. This paper investigates the interplay between programming paradigms and communication capabilities of parallel machines. In particular, we explicate the communication capabilities of the IBM SP-2 distributed-memory multiprocessor and the SGI PowerCHALLENGEarray symmetric multiprocessor. Two benchmark problems, bitonic sorting and the Fast Fourier Transform (FFT), are selected for experiments. Communication-efficient algorithms are developed to exploit the overlapping capabilities of the machines. Programs are written using the Message-Passing Interface (MPI) for portability, and identical codes are used for both machines. Various data sizes and message sizes are used to test the machines' communication capabilities. Experimental results indicate that the communication performance of the multiprocessors is consistent with the size of messages. The SP-2 is sensitive to message size but yields a much higher communication overlap because of its communication co-processor. The PowerCHALLENGEarray is not highly sensitive to message size and yields a low communication overlap. Bitonic sorting yields lower performance than FFT due to its smaller computation-to-communication ratio.
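The sensitivity to the computation-to-communication ratio reported above can be illustrated with a toy overlap model. This sketch is not from the paper, and all timings and the overlap fraction are made-up numbers.

```python
def step_time(t_compute, t_comm, overlap):
    """Time per iteration when a fraction `overlap` of the communication
    is hidden behind computation (0 = fully serialized, 1 = fully hidden,
    limited by the available compute time)."""
    hidden = min(overlap * t_comm, t_compute)
    return t_compute + t_comm - hidden

# FFT-like step: large computation-to-communication ratio
fft = step_time(t_compute=8.0, t_comm=2.0, overlap=0.9)
# Bitonic-sort-like step: small computation-to-communication ratio
sort = step_time(t_compute=2.0, t_comm=2.0, overlap=0.9)

# With the same 90% overlap, the FFT step runs in 8 + 2 - 1.8 = 8.2
# (2.5% over pure compute), while the sorting step takes 2 + 2 - 1.8
# = 2.2 (10% over pure compute), so sorting pays a larger relative
# communication penalty even on overlap-friendly hardware.
```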

  12. Knowledge-based requirements analysis for automating software development

    NASA Technical Reports Server (NTRS)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention takes the form of providing problem- and domain-specific engineering knowledge, not writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  13. Long-Term Training with a Brain-Machine Interface-Based Gait Protocol Induces Partial Neurological Recovery in Paraplegic Patients.

    PubMed

    Donati, Ana R C; Shokur, Solaiman; Morya, Edgard; Campos, Debora S F; Moioli, Renan C; Gitti, Claudia M; Augusto, Patricia B; Tripodi, Sandra; Pires, Cristhiane G; Pereira, Gislaine A; Brasil, Fabricio L; Gallo, Simone; Lin, Anthony A; Takigami, Angelo K; Aratanha, Maria A; Joshi, Sanjay; Bleuler, Hannes; Cheng, Gordon; Rudolph, Alan; Nicolelis, Miguel A L

    2016-08-11

    Brain-machine interfaces (BMIs) provide a new assistive strategy aimed at restoring mobility in severely paralyzed patients. Yet, no study in animals or in human subjects has indicated that long-term BMI training could induce any type of clinical recovery. Eight chronic (3-13 years) spinal cord injury (SCI) paraplegics were subjected to long-term training (12 months) with a multi-stage BMI-based gait neurorehabilitation paradigm aimed at restoring locomotion. This paradigm combined intense immersive virtual reality training, enriched visual-tactile feedback, and walking with two EEG-controlled robotic actuators, including a custom-designed lower limb exoskeleton capable of delivering tactile feedback to subjects. Following 12 months of training with this paradigm, all eight patients experienced neurological improvements in somatic sensation (pain localization, fine/crude touch, and proprioceptive sensing) in multiple dermatomes. Patients also regained voluntary motor control in key muscles below the SCI level, as measured by EMGs, resulting in marked improvement in their walking index. As a result, 50% of these patients were upgraded to an incomplete paraplegia classification. Neurological recovery was paralleled by the reemergence of lower limb motor imagery at cortical level. We hypothesize that this unprecedented neurological recovery results from both cortical and spinal cord plasticity triggered by long-term BMI usage.

  14. Long-Term Training with a Brain-Machine Interface-Based Gait Protocol Induces Partial Neurological Recovery in Paraplegic Patients

    PubMed Central

    Donati, Ana R. C.; Shokur, Solaiman; Morya, Edgard; Campos, Debora S. F.; Moioli, Renan C.; Gitti, Claudia M.; Augusto, Patricia B.; Tripodi, Sandra; Pires, Cristhiane G.; Pereira, Gislaine A.; Brasil, Fabricio L.; Gallo, Simone; Lin, Anthony A.; Takigami, Angelo K.; Aratanha, Maria A.; Joshi, Sanjay; Bleuler, Hannes; Cheng, Gordon; Rudolph, Alan; Nicolelis, Miguel A. L.

    2016-01-01

    Brain-machine interfaces (BMIs) provide a new assistive strategy aimed at restoring mobility in severely paralyzed patients. Yet, no study in animals or in human subjects has indicated that long-term BMI training could induce any type of clinical recovery. Eight chronic (3–13 years) spinal cord injury (SCI) paraplegics were subjected to long-term training (12 months) with a multi-stage BMI-based gait neurorehabilitation paradigm aimed at restoring locomotion. This paradigm combined intense immersive virtual reality training, enriched visual-tactile feedback, and walking with two EEG-controlled robotic actuators, including a custom-designed lower limb exoskeleton capable of delivering tactile feedback to subjects. Following 12 months of training with this paradigm, all eight patients experienced neurological improvements in somatic sensation (pain localization, fine/crude touch, and proprioceptive sensing) in multiple dermatomes. Patients also regained voluntary motor control in key muscles below the SCI level, as measured by EMGs, resulting in marked improvement in their walking index. As a result, 50% of these patients were upgraded to an incomplete paraplegia classification. Neurological recovery was paralleled by the reemergence of lower limb motor imagery at cortical level. We hypothesize that this unprecedented neurological recovery results from both cortical and spinal cord plasticity triggered by long-term BMI usage. PMID:27513629

  15. Local control of globally competing patterns in coupled Swift-Hohenberg equations

    NASA Astrophysics Data System (ADS)

    Becker, Maximilian; Frenzel, Thomas; Niedermayer, Thomas; Reichelt, Sina; Mielke, Alexander; Bär, Markus

    2018-04-01

    We present analytical and numerical investigations of two anti-symmetrically coupled 1D Swift-Hohenberg equations (SHEs) with cubic nonlinearities. The SHE provides a generic formulation for pattern formation at a characteristic length scale. A linear stability analysis of the homogeneous state reveals a wave instability in addition to the usual Turing instability of uncoupled SHEs. We performed a weakly nonlinear analysis in the vicinity of the codimension-two point of the Turing-wave instability, resulting in a set of coupled amplitude equations for the Turing pattern as well as left- and right-traveling waves. In particular, these complex Ginzburg-Landau-type equations predict two major things: that there exists a parameter regime where multiple different patterns are stable with respect to each other, and that the amplitudes of different patterns interact by local mutual suppression. In consequence, different patterns can coexist in distinct spatial regions, separated by localized interfaces. We identified specific mechanisms for controlling the position of these interfaces, which depend on which kinds of patterns the interface connects and thus allow for global pattern selection. Extensive simulations of the original SHEs confirm our results.

  16. Additive noise-induced Turing transitions in spatial systems with application to neural fields and the Swift-Hohenberg equation

    NASA Astrophysics Data System (ADS)

    Hutt, Axel; Longtin, Andre; Schimansky-Geier, Lutz

    2008-05-01

    This work studies the spatio-temporal dynamics of a generic integral-differential equation subject to additive random fluctuations. It introduces a combination of the stochastic center manifold approach for stochastic differential equations and the adiabatic elimination for Fokker-Planck equations, and studies analytically the system's stability near Turing bifurcations. In addition, two types of fluctuations are studied, namely fluctuations uncorrelated in space and time, and global fluctuations, which are constant in space but uncorrelated in time. We show that the global fluctuations shift the Turing bifurcation threshold. This shift is proportional to the fluctuation variance. Applications to a neural field equation and the Swift-Hohenberg equation reveal the shift of the bifurcation to larger control parameters, which represents a stabilization of the system. All analytical results are confirmed by numerical simulations of the occurring mode equations and the full stochastic integral-differential equation. To gain some insight into experimental manifestations, the sum of uncorrelated and global additive fluctuations is studied numerically and the analytical results on global fluctuations are confirmed qualitatively.
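For the Swift-Hohenberg case, the deterministic linear stability analysis underlying such results is compact enough to sketch. The growth rate below is the standard SH dispersion relation; the threshold-shift function is only a placeholder for the variance-proportional shift described above, with a hypothetical proportionality constant `c`.

```python
def sh_growth_rate(k, eps):
    """Linear growth rate of a Fourier mode exp(ikx) for the
    Swift-Hohenberg equation u_t = eps*u - (1 + d^2/dx^2)^2 u + ..."""
    return eps - (1.0 - k**2) ** 2

# The fastest-growing mode sits at k = 1, so the deterministic Turing
# bifurcation occurs when eps crosses 0.
ks = [i / 100.0 for i in range(0, 201)]
rates = [sh_growth_rate(k, eps=0.05) for k in ks]
k_max = ks[rates.index(max(rates))]

def shifted_threshold(variance, c=1.0):
    """Placeholder for the noise-shifted bifurcation threshold:
    proportional to the fluctuation variance (c is hypothetical)."""
    return c * variance
```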

  17. Turing mechanism for homeostatic control of synaptic density during C. elegans growth

    NASA Astrophysics Data System (ADS)

    Brooks, Heather A.; Bressloff, Paul C.

    2017-07-01

    We propose a mechanism for the homeostatic control of synapses along the ventral cord of Caenorhabditis elegans during development, based on a form of Turing pattern formation on a growing domain. C. elegans is an important animal model for understanding cellular mechanisms underlying learning and memory. Our mathematical model consists of two interacting chemical species, where one is passively diffusing and the other is actively trafficked by molecular motors, which switch between forward and backward moving states (bidirectional transport). This differs significantly from the standard mechanism for Turing pattern formation based on the interaction between fast and slow diffusing species. We derive evolution equations for the chemical concentrations on a slowly growing one-dimensional domain, and use numerical simulations to demonstrate the insertion of new concentration peaks as the length increases. Taking the passive component to be the protein kinase CaMKII and the active component to be the glutamate receptor GLR-1, we interpret the concentration peaks as sites of new synapses along the length of C. elegans, and thus show how the density of synaptic sites can be maintained.

  18. Instability of Turing patterns in reaction-diffusion-ODE systems.

    PubMed

    Marciniak-Czochra, Anna; Karch, Grzegorz; Suzuki, Kanako

    2017-02-01

    The aim of this paper is to contribute to the understanding of the pattern formation phenomenon in reaction-diffusion equations coupled with ordinary differential equations. Such systems of equations arise, for example, from modeling of interactions between cellular processes such as cell growth, differentiation or transformation and diffusing signaling factors. We focus on stability analysis of solutions of a prototype model consisting of a single reaction-diffusion equation coupled to an ordinary differential equation. We show that such systems are very different from classical reaction-diffusion models. They exhibit diffusion-driven instability (Turing instability) under a condition of autocatalysis of the non-diffusing component. However, the same mechanism which destabilizes constant solutions of such models also destabilizes all continuous spatially heterogeneous stationary solutions; consequently, there exist no stable Turing patterns in such reaction-diffusion-ODE systems. We provide a rigorous result on the nonlinear instability, which involves the analysis of a continuous spectrum of a linear operator induced by the lack of diffusion in the destabilizing equation. These results are extended to discontinuous patterns for a class of nonlinearities.

  19. Humanizing machines: Anthropomorphization of slot machines increases gambling.

    PubMed

    Riva, Paolo; Sacchi, Simona; Brambilla, Marco

    2015-12-01

    Do people gamble more on slot machines if they think that they are playing against humanlike minds rather than mathematical algorithms? Research has shown that people have a strong cognitive tendency to imbue humanlike mental states to nonhuman entities (i.e., anthropomorphism). The present research tested whether anthropomorphizing slot machines would increase gambling. Four studies manipulated slot machine anthropomorphization and found that exposing people to an anthropomorphized description of a slot machine increased gambling behavior and reduced gambling outcomes. Such findings emerged using tasks that focused on gambling behavior (Studies 1 to 3) as well as in experimental paradigms that included gambling outcomes (Studies 2 to 4). We found that gambling outcomes decrease because participants primed with the anthropomorphic slot machine gambled more (Study 4). Furthermore, we found that high-arousal positive emotions (e.g., feeling excited) played a role in the effect of anthropomorphism on gambling behavior (Studies 3 and 4). Our research indicates that the psychological process of gambling-machine anthropomorphism can be advantageous for the gaming industry; however, this may come at great expense for gamblers' (and their families') economic resources and psychological well-being. (c) 2015 APA, all rights reserved.

  20. Using human brain activity to guide machine learning.

    PubMed

    Fong, Ruth C; Scheirer, Walter J; Cox, David D

    2018-03-29

    Machine learning is a field of computer science that builds algorithms that learn. In many cases, machine learning algorithms are used to recreate a human ability like adding a caption to a photo, driving a car, or playing a game. While the human brain has long served as a source of inspiration for machine learning, little effort has been made to directly use data collected from working brains as a guide for machine learning algorithms. Here we demonstrate a new paradigm of "neurally-weighted" machine learning, which takes fMRI measurements of human brain activity from subjects viewing images, and infuses these data into the training process of an object recognition learning algorithm to make it more consistent with the human brain. After training, these neurally-weighted classifiers are able to classify images without requiring any additional neural data. We show that our neural-weighting approach can lead to large performance gains when used with traditional machine vision features, as well as to significant improvements with already high-performing convolutional neural network features. The effectiveness of this approach points to a path forward for a new class of hybrid machine learning algorithms which take both inspiration and direct constraints from neuronal data.
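A minimal sketch of the neurally-weighted idea, assuming that per-sample weights derived from neural data simply scale the training updates. This is an illustrative stand-in, not the authors' fMRI-based method: the classifier is a plain perceptron and the "neural" scores are invented.

```python
def train_weighted_perceptron(samples, labels, weights, epochs=20, lr=0.1):
    """Perceptron whose update magnitude is scaled per sample.
    `weights` stands in for (hypothetical) fMRI-derived scores that make
    the classifier attend more to stimuli the brain separates well."""
    w = [0.0] * (len(samples[0]) + 1)          # last entry is the bias
    for _ in range(epochs):
        for x, y, s in zip(samples, labels, weights):
            act = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]
            pred = 1 if act >= 0 else -1
            if pred != y:                      # scaled mistake-driven update
                for i, xi in enumerate(x):
                    w[i] += lr * s * y * xi
                w[-1] += lr * s * y
    return w

def predict(w, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + w[-1] >= 0 else -1

# Linearly separable toy data: class +1 near (2, 2), class -1 near (-2, -2)
X = [(2.0, 2.1), (1.8, 2.4), (-2.0, -1.9), (-2.2, -2.3)]
y = [1, 1, -1, -1]
neural = [1.0, 0.5, 1.0, 0.5]                  # hypothetical neural scores
w = train_weighted_perceptron(X, y, neural)
```

Once trained, the weighted classifier needs no further neural data at prediction time, mirroring the property emphasized in the abstract.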

  1. Feature Selection for Speech Emotion Recognition in Spanish and Basque: On the Use of Machine Learning to Improve Human-Computer Interaction

    PubMed Central

    Arruti, Andoni; Cearreta, Idoia; Álvarez, Aitor; Lazkano, Elena; Sierra, Basilio

    2014-01-01

    Study of emotions in human–computer interaction is a growing research area. This paper presents an attempt to select the most significant features for emotion recognition in spoken Basque and Spanish using different methods for feature selection. The RekEmozio database was used as the experimental data set. Several Machine Learning paradigms were used for the emotion classification task. Experiments were executed in three phases, using different sets of features as classification variables in each phase. Moreover, feature subset selection was applied at each phase in order to seek the most relevant feature subset. The three-phase approach was selected to check the validity of the proposed approach. The results show that an instance-based learning algorithm using feature subset selection techniques based on evolutionary algorithms is the best Machine Learning paradigm for automatic emotion recognition across all feature sets, obtaining a mean emotion recognition rate of 80.05% in Basque and 74.82% in Spanish. In order to check the goodness of the proposed process, a greedy search approach (FSS-Forward) has been applied and a comparison between them is provided. Based on the achieved results, a set of the most relevant non-speaker-dependent features is proposed for both languages and new perspectives are suggested. PMID:25279686
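The greedy FSS-Forward baseline mentioned above can be sketched in a few lines: repeatedly add the single feature that most improves a wrapper score, stopping when no feature helps. The leave-one-out nearest-centroid scorer and the toy data are illustrative assumptions, not the paper's classifier or features.

```python
def accuracy(X, y, feats):
    """Leave-one-out accuracy of a nearest-centroid classifier
    restricted to the feature subset `feats`."""
    correct = 0
    for i in range(len(X)):
        cents = {}
        for j in range(len(X)):
            if j == i:
                continue
            c = cents.setdefault(y[j], [[0.0] * len(feats), 0])
            for k, f in enumerate(feats):
                c[0][k] += X[j][f]
            c[1] += 1
        def dist(lbl):
            s, n = cents[lbl]
            return sum((X[i][f] - s[k] / n) ** 2 for k, f in enumerate(feats))
        if min(cents, key=dist) == y[i]:
            correct += 1
    return correct / len(X)

def fss_forward(X, y, max_feats):
    """Greedy forward selection: add the best single feature at each
    step, stop when accuracy no longer improves."""
    selected, best = [], 0.0
    for _ in range(max_feats):
        gains = [(accuracy(X, y, selected + [f]), f)
                 for f in range(len(X[0])) if f not in selected]
        acc, f = max(gains)
        if acc <= best:
            break
        selected, best = selected + [f], acc
    return selected, best

# Toy data: feature 0 separates the classes, feature 1 is noise.
X = [[0.0, 5.0], [0.2, -3.0], [0.1, 1.0], [5.0, 4.0], [5.2, -2.0], [4.9, 0.0]]
y = [0, 0, 0, 1, 1, 1]
sel, acc = fss_forward(X, y, max_feats=2)
```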

  2. The eXperience Induction Machine: A New Paradigm for Mixed-Reality Interaction Design and Psychological Experimentation

    NASA Astrophysics Data System (ADS)

    Bernardet, Ulysses; Bermúdez I Badia, Sergi; Duff, Armin; Inderbitzin, Martin; Le Groux, Sylvain; Manzolli, Jônatas; Mathews, Zenon; Mura, Anna; Väljamäe, Aleksander; Verschure, Paul F. M. J.

    The eXperience Induction Machine (XIM) is one of the most advanced mixed-reality spaces available today. XIM is an immersive space that consists of physical sensors and effectors and which is conceptualized as a general-purpose infrastructure for research in the field of psychology and human-artifact interaction. In this chapter, we set out the epistemological rationale behind XIM by putting the installation in the context of psychological research. The design and implementation of XIM are based on principles and technologies of neuromorphic control. We give a detailed description of the hardware infrastructure and software architecture, including the logic of the overall behavioral control. To illustrate the approach toward psychological experimentation, we discuss a number of practical applications of XIM. These include the so-called persistent virtual community, the application in the research of the relationship between human experience and multi-modal stimulation, and an investigation of a mixed-reality social interaction paradigm.

  3. Evolving optimised decision rules for intrusion detection using particle swarm paradigm

    NASA Astrophysics Data System (ADS)

    Sivatha Sindhu, Siva S.; Geetha, S.; Kannan, A.

    2012-12-01

    The aim of this article is to construct a practical intrusion detection system (IDS) that properly analyses the statistics of network traffic patterns and classifies them as normal or anomalous. The objective of this article is to show that the choice of effective network traffic features and a proficient machine-learning paradigm enhances the detection accuracy of an IDS. In this article, a rule-based approach with a family of six decision tree classifiers, namely Decision Stump, C4.5, Naive Bayes Tree, Random Forest, Random Tree and Representative Tree models, is introduced to perform the detection of anomalous network patterns. In particular, the proposed swarm optimisation-based approach selects the instances that compose the training set, and the optimised decision trees operating over this training set produce classification rules with improved coverage, classification capability and generalisation ability. Experiments with the Knowledge Discovery and Data Mining (KDD) data set, which contains information on traffic patterns during normal and intrusive behaviour, show that the proposed algorithm produces optimised decision rules and outperforms other machine-learning algorithms.
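The swarm half of the approach can be illustrated with a bare-bones particle swarm optimiser. This is the generic PSO update, not the article's instance-selection variant; the inertia and pull coefficients are conventional choices and the sphere objective is a stand-in for the article's rule-quality criterion.

```python
import random

def pso(f, dim, n_particles=20, iters=100, seed=42):
    """Minimal particle swarm optimisation: each particle tracks its own
    best position and is pulled toward it and the swarm-wide best."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5                  # inertia and pull strengths
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:             # personal best improved
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:            # swarm best improved
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimise the sphere function; the swarm should settle near the origin.
best, best_val = pso(lambda x: sum(xi * xi for xi in x), dim=3)
```

In the article's setting, each particle would instead encode a candidate training-instance subset and `f` would score the decision rules induced from it.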

  4. Reasoning about real-time systems with temporal interval logic constraints on multi-state automata

    NASA Technical Reports Server (NTRS)

    Gabrielian, Armen

    1991-01-01

    Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to model formally a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises instead of merely depending of what is true in a system.

  5. Adaptive displays and controllers using alternative feedback.

    PubMed

    Repperger, D W

    2004-12-01

    Investigations on the design of haptic (force-reflecting joystick or force display) controllers were conducted by viewing the display of force information within the context of several different paradigms. First, using analogies from electrical and mechanical systems, certain schemes for the haptic interface were hypothesized that may improve human-machine interaction with respect to various criteria. A discussion is given of how this interaction benefits the electrical and mechanical system. To generalize this concept to the design of human-machine interfaces, three studies with haptic mechanisms were then synthesized and analyzed.

  6. Case-Based Reasoning in Mixed Paradigm Settings and with Learning

    DTIC Science & Technology

    1994-04-30

    Learning Prototypical Cases: OFF-BROADWAY, MCI and RMHC-* are three CBR-ML systems that learn case prototypes. We feel that methods that enable the ... at Irvine Machine Learning Repository, including heart disease and breast cancer databases. OFF-BROADWAY, MCI and RMHC-* made the following notable

  7. Heterogeneity induces spatiotemporal oscillations in reaction-diffusion systems

    NASA Astrophysics Data System (ADS)

    Krause, Andrew L.; Klika, Václav; Woolley, Thomas E.; Gaffney, Eamonn A.

    2018-05-01

    We report on an instability arising in activator-inhibitor reaction-diffusion (RD) systems with a simple spatial heterogeneity. This instability gives rise to periodic creation, translation, and destruction of spike solutions that are commonly formed due to Turing instabilities. While this behavior is oscillatory in nature, it occurs purely within the Turing space such that no region of the domain would give rise to a Hopf bifurcation for the homogeneous equilibrium. We use the shadow limit of the Gierer-Meinhardt system to show that the speed of spike movement can be predicted from well-known asymptotic theory, but that this theory is unable to explain the emergence of these spatiotemporal oscillations. Instead, we numerically explore this system and show that the oscillatory behavior is caused by the destabilization of a steady spike pattern due to the creation of a new spike arising from endogenous activator production. We demonstrate that on the edge of this instability, the period of the oscillations goes to infinity, although it does not fit the profile of any well-known bifurcation of a limit cycle. We show that nearby stationary states are either Turing unstable or undergo saddle-node bifurcations near the onset of the oscillatory instability, suggesting that the periodic motion does not emerge from a local equilibrium. We demonstrate the robustness of this spatiotemporal oscillation by exploring small localized heterogeneity and showing that this behavior also occurs in the Schnakenberg RD model. Our results suggest that this phenomenon is ubiquitous in spatially heterogeneous RD systems, but that current tools, such as stability of spike solutions and shadow-limit asymptotics, do not provide a full understanding of it. This opens several avenues for further mathematical analysis and highlights difficulties in explaining how robust patterning emerges from Turing's mechanism in the presence of even small spatial heterogeneity.
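Membership in the Turing space invoked above is decided by the classical two-species linear-stability conditions. The sketch below evaluates them for the Schnakenberg kinetics named in the abstract; the parameter values are illustrative, not taken from the paper.

```python
def schnakenberg_jacobian(a, b):
    """Jacobian entries of the Schnakenberg kinetics f = a - u + u^2 v,
    g = b - u^2 v at the homogeneous steady state u* = a + b,
    v* = b / (a + b)^2."""
    u = a + b
    fu = -1.0 + 2.0 * b / (a + b)
    fv = u * u
    gu = -2.0 * b / (a + b)
    gv = -u * u
    return fu, fv, gu, gv

def turing_unstable(a, b, d):
    """Classical conditions for diffusion-driven instability when the
    inhibitor diffuses d times faster than the activator: the steady
    state is stable without diffusion (trace < 0, det > 0) but some
    finite wavenumber grows once diffusion is switched on."""
    fu, fv, gu, gv = schnakenberg_jacobian(a, b)
    det = fu * gv - fv * gu
    return (fu + gv < 0 and det > 0
            and d * fu + gv > 0
            and (d * fu + gv) ** 2 > 4.0 * d * det)

# A standard Turing-space point: stable without diffusion, unstable
# for a sufficiently large diffusion ratio but not for equal diffusion.
inside = turing_unstable(a=0.1, b=0.9, d=40.0)
equal = turing_unstable(a=0.1, b=0.9, d=1.0)
```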

  8. An imperialist competitive algorithm for virtual machine placement in cloud computing

    NASA Astrophysics Data System (ADS)

    Jamali, Shahram; Malektaji, Sepideh; Analoui, Morteza

    2017-05-01

    Cloud computing, the recently emerged revolution in the IT industry, is empowered by virtualisation technology. In this paradigm, the user's applications run on virtual machines (VMs). The process of selecting proper physical machines to host these virtual machines is called virtual machine placement. It plays an important role in the resource utilisation and power efficiency of a cloud computing environment. In this paper, we propose an imperialist competitive-based algorithm for the virtual machine placement problem called ICA-VMPLC. ICA is chosen as the base optimisation algorithm because of its ease of neighbourhood movement, good convergence rate and suitable terminology. The proposed algorithm investigates the search space in a unique manner to efficiently obtain an optimal placement solution that simultaneously minimises power consumption and total resource wastage. Its final solution performance is compared with several existing methods, such as grouping genetic and ant colony-based algorithms, as well as a bin packing heuristic. The simulation results show that the proposed method is superior to the other tested algorithms in terms of power consumption, resource wastage, CPU usage efficiency and memory usage efficiency.
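The bin-packing heuristic used as a comparison baseline above can be sketched as first-fit decreasing on a single resource. This is a simplification (the paper's setting involves multiple resources and power models), and the VM names and demands below are made up.

```python
def first_fit_decreasing(vms, capacity):
    """Place VMs (CPU demands) onto hosts greedily: sort demands in
    descending order, put each VM on the first host with room, and
    open a new host when none fits."""
    hosts = []                                 # remaining capacity per host
    placement = {}
    for vm, demand in sorted(vms.items(), key=lambda kv: -kv[1]):
        for h, free in enumerate(hosts):
            if demand <= free:
                hosts[h] -= demand
                placement[vm] = h
                break
        else:                                  # no host had room
            hosts.append(capacity - demand)
            placement[vm] = len(hosts) - 1
    return placement, len(hosts)

# CPU demands in integer units out of a host capacity of 100
demands = {"vm1": 50, "vm2": 70, "vm3": 30, "vm4": 20, "vm5": 30}
placement, n_hosts = first_fit_decreasing(demands, capacity=100)
```

Here the total demand is 200 units, so the two hosts found by the heuristic are optimal; metaheuristics such as ICA-VMPLC aim to beat this greedy baseline on harder, multi-resource instances.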

  9. Exploring the color feature power for psoriasis risk stratification and classification: A data mining paradigm.

    PubMed

    Shrivastava, Vimal K; Londhe, Narendra D; Sonawane, Rajendra S; Suri, Jasjit S

    2015-10-01

    A large percentage of a dermatologist's decision in psoriasis disease assessment is based on color, yet current computer-aided diagnosis systems for psoriasis risk stratification and classification lack the vigor of the color paradigm. This paper presents an automated psoriasis computer-aided diagnosis (pCAD) system for classification of psoriasis skin images into psoriatic lesion and healthy skin, which addresses two major challenges: (i) fulfilling the color feature requirements and (ii) selecting the powerful dominant color features while retaining high classification accuracy. Fourteen color spaces are explored for psoriasis disease analysis, leading to 86 color features. The pCAD system is implemented in a support vector-based machine learning framework in which the offline image data set is used to compute the offline color machine learning parameters. These are then used to transform the online color features and predict the class labels for healthy vs. diseased cases. This paradigm uses principal component analysis for selection of dominant color features, keeping the original color features unaltered. Using a cross-validation protocol, the above machine learning approach is compared against a standalone set of 60 grayscale features and against a combined grayscale and color feature set of 146 features. Using a fixed data size of 540 images with equal numbers of healthy and diseased cases, a 10-fold cross-validation protocol, and an SVM with a polynomial kernel of type two, the pCAD system shows an accuracy of 99.94% with sensitivity and specificity of 99.93% and 99.96%. Using a varying data size protocol, the mean classification accuracies for the color, grayscale, and combined scenarios are 92.85%, 93.83% and 93.99%, respectively. The reliability of the system in these three scenarios is 94.42%, 97.39% and 96.00%, respectively. We conclude that the pCAD system using color space alone is comparable to the grayscale space or the combined color and grayscale spaces. We validated our pCAD system against facial color databases and the results are consistent in accuracy and reliability. Copyright © 2015 Elsevier Ltd. All rights reserved.
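Multi-color-space feature extraction of this kind can be sketched with only the standard library's colorsys module. The paper's 86 features across 14 color spaces are far richer; the three spaces (RGB, HSV, YIQ), the mean-value features, and the "lesion-like" vs "skin-like" patch values below are all illustrative assumptions.

```python
import colorsys

def color_features(rgb_pixels):
    """Mean channel values of a pixel list in three colour spaces:
    RGB, HSV, and YIQ (9 features total, indices 0-2, 3-5, 6-8)."""
    n = len(rgb_pixels)
    feats = [0.0] * 9
    for r, g, b in rgb_pixels:
        hsv = colorsys.rgb_to_hsv(r, g, b)
        yiq = colorsys.rgb_to_yiq(r, g, b)
        for i, v in enumerate((r, g, b) + hsv + yiq):
            feats[i] += v / n
    return feats

# Reddish "lesion-like" patch vs a neutral "skin-like" patch (values in [0, 1])
lesion = [(0.8, 0.2, 0.2), (0.7, 0.25, 0.2), (0.75, 0.2, 0.25)]
healthy = [(0.8, 0.7, 0.6), (0.85, 0.7, 0.65), (0.8, 0.75, 0.6)]
f_lesion, f_healthy = color_features(lesion), color_features(healthy)
# The mean saturation channel (index 4) separates the two patches,
# which is the kind of discriminative power color features add over
# grayscale ones.
```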

  10. A computer architecture for intelligent machines

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Saridis, G. N.

    1991-01-01

    The Theory of Intelligent Machines proposes a hierarchical organization for the functions of an autonomous robot based on the Principle of Increasing Precision With Decreasing Intelligence. An analytic formulation of this theory using information-theoretic measures of uncertainty for each level of the intelligent machine has been developed in recent years. A computer architecture that implements the lower two levels of the intelligent machine is presented. The architecture supports an event-driven programming paradigm that is independent of the underlying computer architecture and operating system. Details of Execution Level controllers for motion and vision systems are addressed, as well as the Petri net transducer software used to implement Coordination Level functions. Extensions to UNIX and VxWorks operating systems which enable the development of a heterogeneous, distributed application are described. A case study illustrates how this computer architecture integrates real-time and higher-level control of manipulator and vision systems.

  11. A Collaboration-Oriented M2M Messaging Mechanism for the Collaborative Automation between Machines in Future Industrial Networks

    PubMed Central

    Gray, John

    2017-01-01

    Machine-to-machine (M2M) communication is a key enabling technology for industrial internet of things (IIoT)-empowered industrial networks, where machines communicate with one another for collaborative automation and intelligent optimisation. This new industrial computing paradigm features high-quality connectivity, ubiquitous messaging, and interoperable interactions between machines. However, manufacturing IIoT applications have specificities that distinguish them from many other internet of things (IoT) scenarios in machine communications. By highlighting the key requirements and the major technical gaps of M2M in industrial applications, this article describes a collaboration-oriented M2M (CoM2M) messaging mechanism focusing on flexible connectivity and discovery, ubiquitous messaging, and semantic interoperability that are well suited for production-line-scale interoperability in manufacturing applications. The designs toward machine collaboration and data interoperability at both the communication and semantic levels are presented. Then, the application scenarios of the presented methods are illustrated with a proof-of-concept implementation in the PicknPack food packaging line. Finally, the advantages and some potential issues are discussed based on the PicknPack practice. PMID:29165347

  12. Brittleness index of machinable dental materials and its relation to the marginal chipping factor.

    PubMed

    Tsitrou, Effrosyni A; Northeast, Simon E; van Noort, Richard

    2007-12-01

    The machinability of a material can be measured by calculating its brittleness index (BI). It is possible that materials with different BI could produce restorations with varied marginal integrity. The degree of marginal chipping of a milled restoration can be estimated by calculating the marginal chipping factor (CF). The aim of this study is to investigate any possible correlation between the BI of machinable dental materials and the CF of the final restorations. The CEREC system was used to mill a wide range of materials used with that system, namely Paradigm MZ100 (3M/ESPE), Vita Mark II (VITA), ProCAD (Ivoclar-Vivadent), and IPS e.max CAD (Ivoclar-Vivadent). A Vickers hardness tester was used for the calculation of BI, while the CF was obtained by estimating the percentage of marginal chipping of crowns prepared with bevelled marginal angulations. The results of this study showed that Paradigm MZ100 had the lowest BI and CF, while IPS e.max CAD demonstrated the highest BI and CF; Vita Mark II and ProCAD had similar BI and CF and lay between these two materials. Statistical analysis of the results showed a perfect positive correlation between BI and CF for all the materials. The BI and CF could both be regarded as indicators of a material's machinability. Within the limitations of this study, it was shown that as the BI increases so does the potential for marginal chipping, indicating that the BI of a material can be used as a predictor of the CF.
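    For illustration, the brittleness index is commonly computed as Vickers hardness over fracture toughness (BI = H/K_IC). The sketch below uses hypothetical H, K_IC, and chipping-factor values (not the study's measurements) to show how BI is derived and how its correlation with CF can be checked.

```python
import numpy as np

# BI = H / K_IC (hardness over fracture toughness); units are illustrative
# (GPa and MPa·m^0.5) and all numbers below are hypothetical, not the
# measured values from the study.
materials = {
    "Paradigm MZ100": {"H": 1.0, "K_IC": 1.5},
    "Vita Mark II":   {"H": 6.0, "K_IC": 1.3},
    "ProCAD":         {"H": 6.2, "K_IC": 1.3},
    "IPS e.max CAD":  {"H": 5.8, "K_IC": 0.6},
}
bi = {name: p["H"] / p["K_IC"] for name, p in materials.items()}

# Hypothetical chipping factors (% of chipped margin length), for illustration.
cf = {"Paradigm MZ100": 2.0, "Vita Mark II": 6.5,
      "ProCAD": 7.0, "IPS e.max CAD": 18.0}

names = list(materials)
r = np.corrcoef([bi[n] for n in names], [cf[n] for n in names])[0, 1]
print({n: round(bi[n], 2) for n in names}, "r =", round(r, 3))
```

    A Pearson r close to 1 on such data mirrors the "perfect positive correlation" reported above.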

  13. Digital optical processing of optical communications: towards an Optical Turing Machine

    NASA Astrophysics Data System (ADS)

    Touch, Joe; Cao, Yinwen; Ziyadi, Morteza; Almaiman, Ahmed; Mohajerin-Ariaei, Amirhossein; Willner, Alan E.

    2017-01-01

    Optical computing is needed to support Tb/s in-network processing in a way that unifies communication and computation using a single data representation that supports in-transit network packet processing, security, and big data filtering. Support for optical computation of this sort requires leveraging the native properties of optical wave mixing to enable computation and switching for programmability. As a consequence, data must be encoded digitally as phase (M-PSK), semantics-preserving regeneration is the key to high-order computation, and data processing at Tb/s rates requires mixing. Experiments have demonstrated viable approaches to phase squeezing and power restoration. This work led our team to develop the first serial, optical Internet hop-count decrement, and to design and simulate optical circuits for calculating the Internet checksum and multiplexing Internet packets. The current exploration focuses on limited-lookback computational models to reduce the need for permanent storage and hybrid nanophotonic circuits that combine phase-aligned comb sources, non-linear mixing, and switching on the same substrate to avoid the macroscopic effects that hamper benchtop prototypes.

  14. Rosen's (M,R) system in process algebra.

    PubMed

    Gatherer, Derek; Galpin, Vashti

    2013-11-17

    Robert Rosen's Metabolism-Replacement, or (M,R), system can be represented as a compact network structure with a single source and three products derived from that source in three consecutive reactions. (M,R) has been claimed to be non-reducible to its components and algorithmically non-computable, in the sense of not being evaluable as a function by a Turing machine. If (M,R)-like structures are present in real biological networks, this suggests that many biological networks will be non-computable, with implications for those branches of systems biology that rely on in silico modelling for predictive purposes. We instantiate (M,R) using the process algebra Bio-PEPA, and discuss the extent to which our model represents a true realization of (M,R). We observe that under some starting conditions and parameter values, stable states can be achieved. Although formal demonstration of algorithmic computability remains elusive for (M,R), we discuss the extent to which our Bio-PEPA representation of (M,R) allows us to sidestep Rosen's fundamental objections to computational systems biology. We argue that the behaviour of (M,R) in Bio-PEPA shows life-like properties.

  15. Incremental Inductive Learning in a Constructivist Agent

    NASA Astrophysics Data System (ADS)

    Perotto, Filipo Studzinski; Álvares, Luís Otávio

    The constructivist paradigm in Artificial Intelligence was definitively inaugurated in the early 1990s by Drescher's pioneering work [10]. He faced the challenge of designing an alternative model for machine learning, founded on the human cognitive developmental process described by Piaget [x]. His effort has inspired many other researchers.

  16. Adaptive Automation Design and Implementation

    DTIC Science & Technology

    2015-09-17

    Excerpts: a case study ("Space Navigator") demonstrates the player modeling paradigm, focusing specifically on the response generation section of the player model. The report describes a human-machine system, a real-time player modeling framework for imitating a specific person's task performance, and the Adaptive Automation System; its contents include a section on clustering-based real-time player modeling.

  17. Synchrony-induced modes of oscillation of a neural field model

    NASA Astrophysics Data System (ADS)

    Esnaola-Acebes, Jose M.; Roxin, Alex; Avitabile, Daniele; Montbrió, Ernest

    2017-11-01

    We investigate the modes of oscillation of heterogeneous ring networks of quadratic integrate-and-fire (QIF) neurons with nonlocal, space-dependent coupling. Perturbations of the equilibrium state with a particular wave number produce transient standing waves with a specific temporal frequency, analogously to those in a tense string. In the neuronal network, the equilibrium corresponds to a spatially homogeneous, asynchronous state. Perturbations of this state excite the network's oscillatory modes, which reflect the interplay of episodes of synchronous spiking with the excitatory-inhibitory spatial interactions. In the thermodynamic limit, an exact low-dimensional neural field model describing the macroscopic dynamics of the network is derived. This allows us to obtain formulas for the Turing eigenvalues of the spatially homogeneous state and hence to obtain its stability boundary. We find that the frequency of each Turing mode depends on the corresponding Fourier coefficient of the synaptic pattern of connectivity. The decay rate instead is identical for all oscillation modes as a consequence of the heterogeneity-induced desynchronization of the neurons. Finally, we numerically compute the spectrum of spatially inhomogeneous solutions branching from the Turing bifurcation, showing that similar oscillatory modes operate in neural bump states and are maintained away from onset.

  18. Synchrony-induced modes of oscillation of a neural field model.

    PubMed

    Esnaola-Acebes, Jose M; Roxin, Alex; Avitabile, Daniele; Montbrió, Ernest

    2017-11-01

    We investigate the modes of oscillation of heterogeneous ring networks of quadratic integrate-and-fire (QIF) neurons with nonlocal, space-dependent coupling. Perturbations of the equilibrium state with a particular wave number produce transient standing waves with a specific temporal frequency, analogously to those in a tense string. In the neuronal network, the equilibrium corresponds to a spatially homogeneous, asynchronous state. Perturbations of this state excite the network's oscillatory modes, which reflect the interplay of episodes of synchronous spiking with the excitatory-inhibitory spatial interactions. In the thermodynamic limit, an exact low-dimensional neural field model describing the macroscopic dynamics of the network is derived. This allows us to obtain formulas for the Turing eigenvalues of the spatially homogeneous state and hence to obtain its stability boundary. We find that the frequency of each Turing mode depends on the corresponding Fourier coefficient of the synaptic pattern of connectivity. The decay rate instead is identical for all oscillation modes as a consequence of the heterogeneity-induced desynchronization of the neurons. Finally, we numerically compute the spectrum of spatially inhomogeneous solutions branching from the Turing bifurcation, showing that similar oscillatory modes operate in neural bump states and are maintained away from onset.

  19. Cross-Diffusion Induced Turing Instability and Amplitude Equation for a Toxic-Phytoplankton-Zooplankton Model with Nonmonotonic Functional Response

    NASA Astrophysics Data System (ADS)

    Han, Renji; Dai, Binxiang

    2017-06-01

    The spatiotemporal pattern induced by cross-diffusion of a toxic-phytoplankton-zooplankton model with nonmonotonic functional response is investigated in this paper. The linear stability analysis shows that cross-diffusion is the key mechanism for the formation of spatial patterns. By taking cross-diffusion rate as bifurcation parameter, we derive amplitude equations near the Turing bifurcation point for the excited modes in the framework of a weakly nonlinear theory, and the stability analysis of the amplitude equations interprets the structural transitions and stability of various forms of Turing patterns. Furthermore, we illustrate the theoretical results via numerical simulations. It is shown that the spatiotemporal distribution of the plankton is homogeneous in the absence of cross-diffusion. However, when the cross-diffusivity is greater than the critical value, the spatiotemporal distribution of all the plankton species becomes inhomogeneous in spaces and results in different kinds of patterns: spot, stripe, and the mixture of spot and stripe patterns depending on the cross-diffusivity. Simultaneously, the impact of toxin-producing rate of toxic-phytoplankton (TPP) species and natural death rate of zooplankton species on pattern selection is also explored.
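    The linear stability analysis behind such results can be sketched for a generic two-species reaction-diffusion system: the homogeneous state is Turing-unstable when the Jacobian J is stable at wavenumber k = 0 but J - k^2 D has an eigenvalue with positive real part for some k > 0. The Jacobian and (cross-)diffusion entries below are illustrative numbers, not the model's parameters.

```python
import numpy as np

# Generic Turing test for du/dt = f(u) + D * Laplacian(u): the homogeneous
# state is stable without diffusion but destabilized at some wavenumber k
# when cross-diffusion (the off-diagonal entry of D) is strong enough.
# J and D are illustrative numbers, not the model's fitted parameters.
J = np.array([[0.5, -1.0],
              [1.0, -1.0]])      # stable at k = 0: trace < 0, det > 0
D = np.array([[0.1, 0.0],
              [1.5, 1.0]])       # D[1, 0] is the cross-diffusion coefficient

def growth_rate(k):
    """Largest real part among the eigenvalues of J - k^2 D."""
    return np.linalg.eigvals(J - k**2 * D).real.max()

ks = np.linspace(0.0, 5.0, 501)
rates = np.array([growth_rate(k) for k in ks])
print("stable at k=0:", growth_rate(0.0) < 0,
      "| Turing unstable:", rates.max() > 0,
      "| fastest-growing k ~", round(ks[rates.argmax()], 2))
```

    Sweeping the cross-diffusion entry D[1, 0] as the bifurcation parameter locates the Turing threshold, the starting point for the amplitude-equation analysis described above.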

  20. Machine learning methods for credibility assessment of interviewees based on posturographic data.

    PubMed

    Saripalle, Sashi K; Vemulapalli, Spandana; King, Gregory W; Burgoon, Judee K; Derakhshani, Reza

    2015-01-01

    This paper discusses the advantages of using posturographic signals from force plates for non-invasive credibility assessment. The contributions of our work are twofold: first, the proposed method is highly efficient and non-invasive; second, the feasibility of creating an autonomous credibility assessment system using machine-learning algorithms is studied. This study employs an interview paradigm in which subjects respond with truthful and deceptive intent while their center of pressure (COP) signal is being recorded. Classification models utilizing sets of COP features for deceptive responses are derived, and a best test-interval accuracy of 93.5% is reported.
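    The feature-then-classify scheme can be sketched as follows. The COP descriptors here (RMS sway, range, mean absolute velocity) are generic posturographic features, not necessarily the study's set, and the signals are synthetic random walks in which deception is simulated as larger sway variance.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def cop_features(cop):
    # Generic descriptors of a 1-D COP trace: RMS sway, sway range,
    # and mean absolute velocity.
    return [np.sqrt(np.mean(cop**2)), np.ptp(cop), np.mean(np.abs(np.diff(cop)))]

def trial(deceptive):
    # Synthetic interview response: deception simulated as larger sway variance.
    scale = 1.5 if deceptive else 1.0
    return np.cumsum(rng.normal(0, scale, 500))   # random-walk-like COP signal

labels = [0] * 60 + [1] * 60                       # truthful vs. deceptive
X = np.array([cop_features(trial(d)) for d in labels])
y = np.array(labels)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
acc = cross_val_score(clf, X, y, cv=5).mean()
print(round(acc, 3))
```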

  1. Integration of element specific persistent homology and machine learning for protein-ligand binding affinity prediction.

    PubMed

    Cang, Zixuan; Wei, Guo-Wei

    2018-02-01

    Protein-ligand binding is a fundamental biological process that is paramount to many other biological processes, such as signal transduction, metabolic pathways, enzyme construction, cell secretion, and gene expression. Accurate prediction of protein-ligand binding affinities is vital to rational drug design and to the understanding of protein-ligand binding and binding-induced function. Existing binding affinity prediction methods are inundated with geometric detail and involve excessively high dimensions, which undermines their predictive power on massive binding data. Topology provides the ultimate level of abstraction and thus incurs too much reduction in geometric information. Persistent homology embeds geometric information into topological invariants and bridges the gap between complex geometry and abstract topology. However, it oversimplifies biological information. This work introduces element-specific persistent homology (ESPH), or multicomponent persistent homology, to retain crucial biological information during topological simplification. The combination of ESPH and machine learning gives rise to a powerful paradigm for macromolecular analysis. Tests on two large data sets indicate that the proposed topology-based machine-learning paradigm outperforms other existing methods in protein-ligand binding affinity prediction. ESPH reveals protein-ligand binding mechanisms that cannot be attained with other conventional techniques. The present approach reveals that protein-ligand hydrophobic interactions extend to 40 Å away from the binding site, which has significant ramifications for drug and protein design. Copyright © 2017 John Wiley & Sons, Ltd.
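    The 0-dimensional part of persistent homology admits a compact sketch: for a Vietoris-Rips filtration, the death times of the H0 bars coincide with the minimum-spanning-tree edge lengths of the point cloud, and element specificity amounts to restricting the cloud to chosen atom types. The coordinates and binning below are invented for illustration; full ESPH also uses higher-dimensional homology.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import cdist

def h0_death_times(points):
    # In a Vietoris-Rips filtration, each H0 bar dies when two clusters merge,
    # i.e. at an edge length of the minimum spanning tree (n - 1 deaths).
    d = cdist(points, points)
    return np.sort(minimum_spanning_tree(d).data)

rng = np.random.default_rng(2)
carbon_atoms = rng.uniform(0, 10, (20, 3))   # hypothetical C-atom coordinates

bars = h0_death_times(carbon_atoms)
# Binned bar counts give a fixed-length feature vector for machine learning.
features, _ = np.histogram(bars, bins=5, range=(0, 20))
print(len(bars), int(features.sum()))
```

    Repeating this per element pair (e.g. protein carbons vs. ligand nitrogens) and concatenating the binned barcodes yields element-specific topological features for a learner.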

  2. A Hybrid Brain-Computer Interface Based on the Fusion of P300 and SSVEP Scores.

    PubMed

    Yin, Erwei; Zeyl, Timothy; Saab, Rami; Chau, Tom; Hu, Dewen; Zhou, Zongtan

    2015-07-01

    The present study proposes a hybrid brain-computer interface (BCI) with 64 selectable items based on the fusion of P300 and steady-state visually evoked potential (SSVEP) brain signals. With this approach, row/column (RC) P300 and two-step SSVEP paradigms were integrated to create two hybrid paradigms, which we denote as the double RC (DRC) and 4-D spellers. In each hybrid paradigm, the target is simultaneously detected based on both P300 and SSVEP potentials as measured by the electroencephalogram. We further proposed a maximum-probability estimation (MPE) fusion approach to combine the P300 and SSVEP at the score level and compared this approach to approaches based on linear discriminant analysis, a naïve Bayes classifier, and support vector machines. The experimental results obtained from thirteen participants indicated that the 4-D hybrid paradigm outperformed the DRC paradigm and that the MPE fusion achieved higher accuracy than the other approaches. Importantly, 12 of the 13 participants using the 4-D paradigm achieved an accuracy of over 90%, and the average accuracy was 95.18%. These promising results suggest that the proposed hybrid BCI system could be used in the design of a high-performance BCI-based keyboard.
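    Score-level fusion of the two modalities can be illustrated as follows: convert each modality's per-target scores to probabilities, then pick the target with the maximum joint probability (independence assumed). This is a simplified stand-in for the MPE fusion, with made-up scores rather than recorded EEG outputs.

```python
import numpy as np

def softmax(s):
    # Normalize raw scores to a probability distribution over targets.
    e = np.exp(s - s.max())
    return e / e.sum()

# Made-up per-target evidence from each modality (4 targets for brevity).
p300_scores  = np.array([0.2, 1.5, 0.9, 0.1])
ssvep_scores = np.array([0.8, 1.1, 1.4, 0.2])

# Joint probability under an independence assumption; the detected target
# is the one maximizing it.
joint = softmax(p300_scores) * softmax(ssvep_scores)
target = int(np.argmax(joint))
print(target)
```

    Note how the fused decision (target 1) follows the P300 peak here but is weighted by the SSVEP evidence, which is the point of combining the two scores.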

  3. VISUAL AND AUDIO PRESENTATION IN MACHINE PROGRAMED INSTRUCTION. FINAL REPORT.

    ERIC Educational Resources Information Center

    ALLEN, WILLIAM H.

    This study was part of a larger research program aimed toward development of paradigms of message design. Objectives of three parallel experiments were to evaluate interactions of presentation mode, program type, and content as they affect learner characteristics. Each experiment used 18 treatments in a factorial design with randomly selected…

  4. High-Speed Photonic Reservoir Computing Using a Time-Delay-Based Architecture: Million Words per Second Classification

    NASA Astrophysics Data System (ADS)

    Larger, Laurent; Baylón-Fuentes, Antonio; Martinenghi, Romain; Udaltsov, Vladimir S.; Chembo, Yanne K.; Jacquot, Maxime

    2017-01-01

    Reservoir computing, originally referred to as an echo state network or a liquid state machine, is a brain-inspired paradigm for processing temporal information. It involves learning a "read-out" interpretation of the nonlinear transients developed by a high-dimensional dynamical system when the latter is excited by the information signal to be processed. This novel computational paradigm is derived from recurrent neural network and machine learning techniques. It has recently been implemented in photonic hardware, which opens the path to ultrafast brain-inspired computing. We report on a novel implementation involving an electro-optic phase-delay dynamics designed with off-the-shelf optoelectronic telecom devices, thus providing the targeted wide bandwidth. Computational efficiency is demonstrated experimentally with speech-recognition tasks. State-of-the-art speed performance reaches one million words per second, with a very low word error rate. Additionally, beyond this record processing speed, our investigations have revealed computing-efficiency improvements through yet-unexplored temporal-information-processing techniques, such as simultaneous multisample injection and sampling at the read-out pitched relative to the information "write-in".
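    A small software echo state network makes the "read-out learning" idea concrete: a fixed random recurrent reservoir produces nonlinear transients, and only a linear read-out is trained, here by ridge regression on a toy 3-step-delay recall task. All sizes and constants are illustrative choices, not parameters of the photonic implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
n_res, n_steps, delay = 200, 2000, 3
W_in = rng.uniform(-0.5, 0.5, n_res)            # fixed input weights
W = rng.normal(0, 1, (n_res, n_res))            # fixed recurrent weights
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # spectral radius 0.9

u = rng.uniform(-1, 1, n_steps)                 # input signal to process
x = np.zeros(n_res)
states = np.empty((n_steps, n_res))
for t in range(n_steps):
    x = np.tanh(W @ x + W_in * u[t])            # nonlinear reservoir transient
    states[t] = x

y = np.roll(u, delay)                           # target: input 3 steps ago
S, Y = states[100:], y[100:]                    # drop the warm-up
# Train only the linear read-out (ridge regression).
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ Y)
err = np.sqrt(np.mean((S @ W_out - Y) ** 2))
print(round(err, 3))
```

    The reservoir weights are never trained; all task adaptation lives in `W_out`, which is what makes hardware implementations with fixed physical dynamics attractive.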

  5. A machine learning approach for automated wide-range frequency tagging analysis in embedded neuromonitoring systems.

    PubMed

    Montagna, Fabio; Buiatti, Marco; Benatti, Simone; Rossi, Davide; Farella, Elisabetta; Benini, Luca

    2017-10-01

    EEG is a standard non-invasive technique used in neural disease diagnostics and the neurosciences. Frequency tagging is an increasingly popular experimental paradigm that efficiently tests brain function by measuring EEG responses to periodic stimulation. Recently, frequency-tagging paradigms have proven successful with low stimulation frequencies (0.5-6 Hz), but the EEG signal is intrinsically noisy in this frequency range, requiring heavy signal processing and significant human intervention for response estimation. This limits the possibility of processing the EEG on resource-constrained systems and of designing smart EEG-based devices for automated diagnostics. We propose an algorithm for artifact removal and automated detection of frequency-tagging responses over a wide range of stimulation frequencies, which we test on a visual stimulation protocol. The algorithm is rooted in machine-learning-based pattern recognition techniques and is tailored for a new-generation parallel ultra-low-power processing platform (PULP), reaching more than 90% accuracy in frequency detection even for very low stimulation frequencies (<1 Hz) with a power budget of 56 mW. Copyright © 2017 Elsevier Inc. All rights reserved.
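    The detection problem can be illustrated with a toy frequency-tagging detector: the tagged response appears as a spectral peak at the stimulation frequency, so power in that FFT bin is compared against neighboring bins. The sampling rate, 0.8 Hz tag, and noise level are invented; the actual algorithm adds artifact removal and learned pattern recognition on top of this.

```python
import numpy as np

fs, dur, f_tag = 250.0, 60.0, 0.8          # Hz, seconds, stimulation frequency
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(4)
# Synthetic "EEG": a weak 0.8 Hz tagged response buried in white noise.
eeg = 0.5 * np.sin(2 * np.pi * f_tag * t) + rng.normal(0, 1.0, t.size)

spec = np.abs(np.fft.rfft(eeg)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)
i = np.argmin(np.abs(freqs - f_tag))       # FFT bin of the tag frequency
# Compare tag-bin power with nearby bins (skipping immediate neighbors).
neighbors = np.r_[spec[max(i - 6, 1):i - 1], spec[i + 2:i + 7]]
snr = spec[i] / neighbors.mean()
print("tag detected:", snr > 3)
```

    A long recording window is what makes the sub-1 Hz bin resolvable: 60 s of data gives a 1/60 Hz frequency resolution, placing the 0.8 Hz tag exactly on a bin.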

  6. Non-Linear Pattern Formation in Bone Growth and Architecture

    PubMed Central

    Salmon, Phil

    2014-01-01

    The three-dimensional morphology of bone arises through adaptation to its required engineering performance. Genetically and adaptively bone travels along a complex spatiotemporal trajectory to acquire optimal architecture. On a cellular, micro-anatomical scale, what mechanisms coordinate the activity of osteoblasts and osteoclasts to produce complex and efficient bone architectures? One mechanism is examined here – chaotic non-linear pattern formation (NPF) – which underlies in a unifying way natural structures as disparate as trabecular bone, swarms of birds flying, island formation, fluid turbulence, and others. At the heart of NPF is the fact that simple rules operating between interacting elements, and Turing-like interaction between global and local signals, lead to complex and structured patterns. The study of “group intelligence” exhibited by swarming birds or shoaling fish has led to an embodiment of NPF called “particle swarm optimization” (PSO). This theoretical model could be applicable to the behavior of osteoblasts, osteoclasts, and osteocytes, seeing them operating “socially” in response simultaneously to both global and local signals (endocrine, cytokine, mechanical), resulting in their clustered activity at formation and resorption sites. This represents problem-solving by social intelligence, and could potentially add further realism to in silico computer simulation of bone modeling. What insights has NPF provided to bone biology? One example concerns the genetic disorder juvenile Paget's disease, or idiopathic hyperphosphatasia, where the anomalous parallel trabecular architecture characteristic of this pathology is consistent with an NPF paradigm by analogy with known experimental NPF systems. Here, coupling or “feedback” between osteoblasts and osteoclasts is the critical element. This NPF paradigm implies a profound link between bone regulation and its architecture: in bone the architecture is the regulation; the former is the emergent consequence of the latter. PMID:25653638

  7. Non-linear pattern formation in bone growth and architecture.

    PubMed

    Salmon, Phil

    2014-01-01

    The three-dimensional morphology of bone arises through adaptation to its required engineering performance. Genetically and adaptively bone travels along a complex spatiotemporal trajectory to acquire optimal architecture. On a cellular, micro-anatomical scale, what mechanisms coordinate the activity of osteoblasts and osteoclasts to produce complex and efficient bone architectures? One mechanism is examined here - chaotic non-linear pattern formation (NPF) - which underlies in a unifying way natural structures as disparate as trabecular bone, swarms of birds flying, island formation, fluid turbulence, and others. At the heart of NPF is the fact that simple rules operating between interacting elements, and Turing-like interaction between global and local signals, lead to complex and structured patterns. The study of "group intelligence" exhibited by swarming birds or shoaling fish has led to an embodiment of NPF called "particle swarm optimization" (PSO). This theoretical model could be applicable to the behavior of osteoblasts, osteoclasts, and osteocytes, seeing them operating "socially" in response simultaneously to both global and local signals (endocrine, cytokine, mechanical), resulting in their clustered activity at formation and resorption sites. This represents problem-solving by social intelligence, and could potentially add further realism to in silico computer simulation of bone modeling. What insights has NPF provided to bone biology? One example concerns the genetic disorder juvenile Paget's disease, or idiopathic hyperphosphatasia, where the anomalous parallel trabecular architecture characteristic of this pathology is consistent with an NPF paradigm by analogy with known experimental NPF systems. Here, coupling or "feedback" between osteoblasts and osteoclasts is the critical element. This NPF paradigm implies a profound link between bone regulation and its architecture: in bone the architecture is the regulation; the former is the emergent consequence of the latter.
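    Particle swarm optimization, the NPF embodiment named above, is compact enough to sketch: each particle moves under the simultaneous pull of a local signal (its own best position) and a global signal (the swarm's best), by analogy with cells responding to local and global cues. The objective function and constants are standard textbook choices, not parameters from the article.

```python
import numpy as np

rng = np.random.default_rng(6)

def sphere(x):
    # Simple test objective with its minimum at the origin.
    return np.sum(x**2, axis=-1)

n, dim, w, c1, c2 = 30, 2, 0.7, 1.5, 1.5   # swarm size, dims, inertia, pulls
pos = rng.uniform(-5, 5, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()                          # local signal: personal best
gbest = pos[np.argmin(sphere(pos))]         # global signal: swarm best

for _ in range(200):
    r1, r2 = rng.random((2, n, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    better = sphere(pos) < sphere(pbest)
    pbest[better] = pos[better]
    gbest = pbest[np.argmin(sphere(pbest))]

print(bool(sphere(gbest) < 1e-3))           # swarm converges near the optimum
```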

  8. Mathematically guided approaches to distinguish models of periodic patterning

    PubMed Central

    Hiscock, Tom W.; Megason, Sean G.

    2015-01-01

    How periodic patterns are generated is an open question. A number of mechanisms have been proposed – most famously, Turing's reaction-diffusion model. However, many theoretical and experimental studies focus on the Turing mechanism while ignoring other possible mechanisms. Here, we use a general model of periodic patterning to show that different types of mechanism (molecular, cellular, mechanical) can generate qualitatively similar final patterns. Observation of final patterns is therefore not sufficient to favour one mechanism over others. However, we propose that a mathematical approach can help to guide the design of experiments that can distinguish between different mechanisms, and illustrate the potential value of this approach with specific biological examples. PMID:25605777

  9. Theory of Turing Patterns on Time Varying Networks.

    PubMed

    Petit, Julien; Lauwens, Ben; Fanelli, Duccio; Carletti, Timoteo

    2017-10-06

    The process of pattern formation for a multispecies model anchored on a time varying network is studied. A nonhomogeneous perturbation superposed on a homogeneous stable fixed point can be amplified following the Turing mechanism of instability, instigated solely by the network dynamics. By properly tuning the frequency of the imposed network evolution, one can make the examined system behave as its averaged counterpart over a finite time window. This is the key observation from which a closed analytical prediction for the onset of the instability in the time-dependent framework is derived. Continuously and piecewise constant periodic time varying networks are analyzed, setting the framework for the proposed approach. The extension to nonperiodic settings is also discussed.

  10. FreeTure: A Free software to capTure meteors for FRIPON

    NASA Astrophysics Data System (ADS)

    Audureau, Yoan; Marmo, Chiara; Bouley, Sylvain; Kwon, Min-Kyung; Colas, François; Vaubaillon, Jérémie; Birlan, Mirel; Zanda, Brigitte; Vernazza, Pierre; Caminade, Stephane; Gattecceca, Jérôme

    2014-02-01

    The Fireball Recovery and Interplanetary Observation Network (FRIPON) is a French project started in 2014 that will monitor the sky with 100 all-sky cameras to detect meteors and retrieve the related meteorites on the ground. Several meteor detection software packages already exist, but some are proprietary and some are hardware dependent. We present here the open source meteor detection software to be installed on the FRIPON network's stations. The software runs on Linux with gigabit Ethernet cameras, and we plan to make it cross-platform. This paper focuses on the meteor detection method used in the pipeline and on its present capabilities.

  11. Two-qubit quantum cloning machine and quantum correlation broadcasting

    NASA Astrophysics Data System (ADS)

    Kheirollahi, Azam; Mohammadi, Hamidreza; Akhtarshenas, Seyed Javad

    2016-11-01

    Due to the axioms of quantum mechanics, perfect cloning of an unknown quantum state is impossible. But since imperfect cloning is still possible, a question arises: "Is there an optimal quantum cloning machine?" Buzek and Hillery answered this question and constructed their famous B-H quantum cloning machine. The B-H machine clones the state of an arbitrary single qubit in an optimal manner and hence it is universal. Generalizing this machine for a two-qubit system is straightforward, but during this procedure, except for product states, this machine loses its universality and becomes a state-dependent cloning machine. In this paper, we propose some classes of optimal universal local quantum state cloners for a particular class of two-qubit systems, more precisely, for a class of states with known Schmidt basis. We then extend our machine to the case that the Schmidt basis of the input state is deviated from the local computational basis of the machine. We show that more local quantum coherence existing in the input state corresponds to less fidelity between the input and output states. Also we present two classes of a state-dependent local quantum copying machine. Furthermore, we investigate local broadcasting of two aspects of quantum correlations, i.e., quantum entanglement and quantum discord, defined, respectively, within the entanglement-separability paradigm and from an information-theoretic perspective. The results show that although quantum correlation is, in general, very fragile during the broadcasting procedure, quantum discord is broadcasted more robustly than quantum entanglement.

  12. Intelligence-Augmented Rat Cyborgs in Maze Solving.

    PubMed

    Yu, Yipeng; Pan, Gang; Gong, Yongyue; Xu, Kedi; Zheng, Nenggan; Hua, Weidong; Zheng, Xiaoxiang; Wu, Zhaohui

    2016-01-01

    Cyborg intelligence is an emerging kind of intelligence paradigm. It aims to deeply integrate machine intelligence with biological intelligence by connecting machines and living beings via neural interfaces, enhancing strength by combining the biological cognition capability with the machine computational capability. Cyborg intelligence is considered to be a new way to augment living beings with machine intelligence. In this paper, we build rat cyborgs to demonstrate how they can expedite the maze escape task with integration of machine intelligence. We compare the performance of maze solving by computer, by individual rats, and by computer-aided rats (i.e. rat cyborgs). They were asked to find their way from a constant entrance to a constant exit in fourteen diverse mazes. Performance of maze solving was measured by steps, coverage rates, and time spent. The experimental results with six rats and their intelligence-augmented rat cyborgs show that rat cyborgs have the best performance in escaping from mazes. These results provide a proof-of-principle demonstration for cyborg intelligence. In addition, our novel cyborg intelligent system (rat cyborg) has great potential in various applications, such as search and rescue in complex terrains.

  13. Intelligence-Augmented Rat Cyborgs in Maze Solving

    PubMed Central

    Yu, Yipeng; Pan, Gang; Gong, Yongyue; Xu, Kedi; Zheng, Nenggan; Hua, Weidong; Zheng, Xiaoxiang; Wu, Zhaohui

    2016-01-01

    Cyborg intelligence is an emerging kind of intelligence paradigm. It aims to deeply integrate machine intelligence with biological intelligence by connecting machines and living beings via neural interfaces, enhancing strength by combining the biological cognition capability with the machine computational capability. Cyborg intelligence is considered to be a new way to augment living beings with machine intelligence. In this paper, we build rat cyborgs to demonstrate how they can expedite the maze escape task with integration of machine intelligence. We compare the performance of maze solving by computer, by individual rats, and by computer-aided rats (i.e. rat cyborgs). They were asked to find their way from a constant entrance to a constant exit in fourteen diverse mazes. Performance of maze solving was measured by steps, coverage rates, and time spent. The experimental results with six rats and their intelligence-augmented rat cyborgs show that rat cyborgs have the best performance in escaping from mazes. These results provide a proof-of-principle demonstration for cyborg intelligence. In addition, our novel cyborg intelligent system (rat cyborg) has great potential in various applications, such as search and rescue in complex terrains. PMID:26859299

  14. Decoder calibration with ultra small current sample set for intracortical brain-machine interface

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Ma, Xuan; Chen, Luyao; Zhou, Jin; Wang, Changyong; Li, Wei; He, Jiping

    2018-04-01

    Objective. Intracortical brain-machine interfaces (iBMIs) aim to restore efficient communication and movement ability to paralyzed patients. However, frequent recalibration is required for consistency and reliability, and every recalibration requires a relatively large current sample set. The aim of this study is to develop an effective decoder calibration method that achieves good performance while minimizing recalibration time. Approach. Two rhesus macaques implanted with intracortical microelectrode arrays were trained separately on movement and sensory paradigms. Neural signals were recorded to decode reaching positions or grasping postures. A novel principal component analysis-based domain adaptation (PDA) method was proposed to recalibrate the decoder with only an ultra-small current sample set by taking advantage of large historical data, and its decoding performance was compared with three other calibration methods for evaluation. Main results. The PDA method effectively closed the gap between historical and current data, making it possible to take advantage of large historical data when decoding current data. Using only an ultra-small current sample set (five trials of each category), the decoder calibrated with the PDA method achieved much better and more robust performance in all sessions than the three other calibration methods, in both monkeys. Significance. (1) This study brings transfer learning theory into iBMI decoder calibration for the first time. (2) Unlike most transfer learning studies, the target data here are an ultra-small sample set and are transferred to the source data. (3) By taking advantage of historical data, the PDA method was demonstrated to reduce recalibration time for both the movement and the sensory paradigm, indicating viable generalization. By reducing the demand for large current training data, this new method may facilitate the application of intracortical brain-machine interfaces in clinical practice.
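    The abstract does not spell out the PDA algorithm, so the following is only a hedged toy sketch of one PCA-based alignment idea: project both the large historical set and the ultra-small current set onto the principal subspace of the historical data, after centering each domain on its own mean so a constant drift between sessions drops out. All data, dimensions, and the drift model are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a large historical session and an ultra-small
# current session whose features have drifted (constant shift + noise).
hist = rng.normal(size=(500, 8))
curr = hist[:5] + rng.normal(0.0, 0.1, size=(5, 8)) + 2.0  # drifted copy

# Principal subspace of the historical data (via SVD).
hist_mean = hist.mean(axis=0)
_, _, vt = np.linalg.svd(hist - hist_mean, full_matrices=False)
components = vt[:3]                     # top-3 principal directions

def project(x, domain_mean):
    """Center on the domain's own mean, then project onto the
    historical principal subspace -- a crude domain alignment."""
    return (x - domain_mean) @ components.T

hist_proj = project(hist, hist_mean)
curr_proj = project(curr, curr.mean(axis=0))

# The constant drift between domains is removed by the alignment.
gap_before = np.linalg.norm(hist[:5].mean(axis=0) - curr.mean(axis=0))
gap_after = np.linalg.norm(hist_proj[:5].mean(axis=0) - curr_proj.mean(axis=0))
```

    After alignment, a decoder trained on the pooled historical data can be applied to the projected current trials, which is the general shape of the calibration-time saving the paper reports.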

  15. Using virtual machine monitors to overcome the challenges of monitoring and managing virtualized cloud infrastructures

    NASA Astrophysics Data System (ADS)

    Bamiah, Mervat Adib; Brohi, Sarfraz Nawaz; Chuprat, Suriayati

    2012-01-01

    Virtualization is one of the hottest research topics nowadays. Several academic researchers and developers from the IT industry are designing approaches to solve the security and manageability issues of Virtual Machines (VMs) residing on virtualized cloud infrastructures. Moving an application from a physical to a virtual platform increases efficiency and flexibility and reduces management cost and effort. Cloud computing adopts the paradigm of virtualization: using this technique, memory, CPU, and computational power are provided to clients' VMs from the underlying physical hardware. Besides these advantages, there are a few challenges in adopting virtualization, such as management of VMs and network traffic, unexpected additional costs, and resource allocation. The Virtual Machine Monitor (VMM), or hypervisor, is the tool used by cloud providers to manage VMs in the cloud. There are several heterogeneous hypervisors provided by various vendors, including VMware, Hyper-V, Xen, and Kernel Virtual Machine (KVM). Considering the challenge of VM management, this paper describes several techniques to monitor and manage virtualized cloud infrastructures.

  16. Prediction based proactive thermal virtual machine scheduling in green clouds.

    PubMed

    Kinger, Supriya; Kumar, Rajesh; Sharma, Anju

    2014-01-01

    Cloud computing has rapidly emerged as a widely accepted computing paradigm, but research on Cloud computing is still at an early stage. Cloud computing provides many advanced features, but it still has some shortcomings, such as relatively high operating costs and environmental hazards like an increasing carbon footprint. These hazards can be reduced to some extent by efficient scheduling of Cloud resources. The working temperature at which a machine is currently running can be taken as a criterion for Virtual Machine (VM) scheduling. This paper proposes a new proactive technique that considers the current and maximum threshold temperatures of Server Machines (SMs) before making scheduling decisions, with the help of a temperature predictor, so that the maximum temperature is never reached. Different workload scenarios have been taken into consideration. The results obtained show that the proposed system is better than existing VM scheduling systems, which do not consider the current temperature of nodes before making scheduling decisions. Thus, a reduction in the need for cooling systems in a Cloud environment has been obtained and validated.
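    The scheduling rule described above (never place a VM on an SM whose predicted temperature would cross its threshold) can be sketched as follows; the linear temperature predictor and all numbers are invented stand-ins, not the paper's predictor:

```python
def predict_temperature(current_temp, vm_load, heating_rate=0.5):
    """Toy linear predictor: temperature of a server machine (SM)
    after placing a VM with the given load on it."""
    return current_temp + heating_rate * vm_load

def schedule_vm(servers, vm_load):
    """Pick the SM with the lowest *predicted* temperature among those
    that stay below their threshold; return its name, or None if every
    SM would overheat.  servers: {name: (current_temp, max_threshold)}."""
    candidates = [
        (predict_temperature(temp, vm_load), name)
        for name, (temp, limit) in servers.items()
        if predict_temperature(temp, vm_load) < limit
    ]
    return min(candidates)[1] if candidates else None

servers = {"sm1": (62.0, 70.0), "sm2": (55.0, 70.0), "sm3": (68.0, 70.0)}
choice = schedule_vm(servers, vm_load=10.0)   # predicted: 67, 60, 73
```

    Here sm3 is excluded because its predicted temperature would exceed the threshold, and the coolest remaining machine is chosen; the paper's contribution is a proper predictor in place of the toy linear one.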

  17. Particle-based simulations of polarity establishment reveal stochastic promotion of Turing pattern formation

    PubMed Central

    Ramirez, Samuel A.; Elston, Timothy C.

    2018-01-01

    Polarity establishment, the spontaneous generation of asymmetric molecular distributions, is a crucial component of many cellular functions. Saccharomyces cerevisiae (yeast) undergoes directed growth during budding and mating, and is an ideal model organism for studying polarization. In yeast and many other cell types, the Rho GTPase Cdc42 is the key molecular player in polarity establishment. During yeast polarization, multiple patches of Cdc42 initially form, then resolve into a single front. Because polarization relies on strong positive feedback, it is likely that the amplification of molecular-level fluctuations underlies the generation of multiple nascent patches. In the absence of spatial cues, these fluctuations may be key to driving polarization. Here we used particle-based simulations to investigate the role of stochastic effects in a Turing-type model of yeast polarity establishment. In the model, reactions take place either between two molecules on the membrane, or between a cytosolic and a membrane-bound molecule. Thus, we developed a computational platform that explicitly simulates molecules at and near the cell membrane, and implicitly handles molecules away from the membrane. To evaluate stochastic effects, we compared particle simulations to deterministic reaction-diffusion equation simulations. Defining macroscopic rate constants that are consistent with the microscopic parameters for this system is challenging, because diffusion occurs in two dimensions and particles exchange between the membrane and cytoplasm. We address this problem by empirically estimating macroscopic rate constants from appropriately designed particle-based simulations. Ultimately, we find that stochastic fluctuations speed polarity establishment and permit polarization in parameter regions predicted to be Turing stable. These effects can operate at Cdc42 abundances expected of yeast cells, and promote polarization on timescales consistent with experimental results. To our knowledge, our work represents the first particle-based simulations of a model for yeast polarization that is based on a Turing mechanism. PMID:29529021
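    A far cruder illustration of why stochastic fluctuations matter in such systems is a well-mixed Gillespie simulation of a positive-feedback membrane-recruitment scheme: cytosolic molecules C bind the membrane at a rate that grows with the membrane-bound count M. The species and rate constants below are hypothetical and much simpler than the paper's spatial particle model:

```python
import random

def gillespie(total=100, k_on=0.1, k_fb=0.05, k_off=1.0,
              t_end=50.0, seed=1):
    """Stochastic simulation of C <-> M with positive feedback:
    C -> M at rate C*(k_on + k_fb*M), M -> C at rate k_off*M.
    Returns the final membrane-bound count M."""
    rng = random.Random(seed)
    m, t = 0, 0.0
    while t < t_end:
        c = total - m
        rate_on = c * (k_on + k_fb * m)   # recruitment with feedback
        rate_off = k_off * m              # detachment
        rate_sum = rate_on + rate_off
        if rate_sum == 0.0:
            break
        t += rng.expovariate(rate_sum)    # time to the next event
        if rng.random() < rate_on / rate_sum:
            m += 1
        else:
            m -= 1
    return m

m_final = gillespie()
```

    Unlike the deterministic rate equation, individual runs of such a simulation fluctuate around (and can jump between) steady states, which is the qualitative point about noise promoting transitions that the paper examines in a spatial setting.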

  18. Elucidating the role of D4 receptors in mediating attributions of salience to incentive stimuli on Pavlovian conditioned approach and conditioned reinforcement paradigms.

    PubMed

    Cocker, P J; Vonder Haar, C; Winstanley, C A

    2016-10-01

    The power of drug-associated cues to instigate drug 'wanting', and consequently to promote drug seeking, is a cornerstone of contemporary theories of addiction. Gambling disorder has recently been added to the pantheon of addictive disorders due to the phenomenological similarities between the disorders. However, the neurobiological mechanism that may mediate increased sensitivity towards conditioned stimuli in addictive disorders is unclear. We have previously demonstrated, using a rodent analogue of a simple slot machine, that the dopamine D4 receptor is critically engaged in controlling animals' attribution of salience to stimuli associated with reward in this paradigm, and consequently may represent a target for the treatment of gambling disorder. Here, we investigated the role of acute administration of a D4 receptor agonist on animals' responsivity to conditioned stimuli in both a Pavlovian conditioned approach (autoshaping) and a conditioned reinforcement paradigm. Following training on one of the two tasks, separate cohorts of rats (male and female) were administered a dose of PD168077 shown to be maximally effective at precipitating errors in reward expectancy on the rat slot machine task (10 mg/kg). However, augmenting the activity of D4 receptors in this manner did not alter behaviour on either task. These data therefore provide novel evidence that the D4 receptor does not alter incentive motivation in response to cues on simple behavioural tasks. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Availability of Alternatives and the Processing of Scalar Implicatures: A Visual World Eye-tracking Study

    ERIC Educational Resources Information Center

    Degen, Judith; Tanenhaus, Michael K.

    2016-01-01

    Two visual world experiments investigated the processing of the implicature associated with "some" using a "gumball paradigm." On each trial, participants saw an image of a gumball machine with an upper chamber with orange and blue gumballs and an empty lower chamber. Gumballs dropped to the lower chamber, creating a contrast…

  20. On the Efficient Exploitation of Speculation under Data flow Paradigms of Control

    DTIC Science & Technology

    1989-05-19

    use. "Une maison est une machine à habiter" ("A house is a machine for living in"). - LE CORBUSIER, Vers une architecture. Chapter 5, Experiments and Results: We will journey in this chapter... occupation. He replied: "a speculator." - A. MICHAEL LIPPER, Back to the Future: The Case for Speculation, Baruch-Style. Chapter 2, Language and System

  1. "Your Model Is Predictive-- but Is It Useful?" Theoretical and Empirical Considerations of a New Paradigm for Adaptive Tutoring Evaluation

    ERIC Educational Resources Information Center

    González-Brenes, José P.; Huang, Yun

    2015-01-01

    Classification evaluation metrics are often used to evaluate adaptive tutoring systems-- programs that teach and adapt to humans. Unfortunately, it is not clear how intuitive these metrics are for practitioners with little machine learning background. Moreover, our experiments suggest that existing convention for evaluating tutoring systems may…

  2. The Visual Uncertainty Paradigm for Controlling Screen-Space Information in Visualization

    ERIC Educational Resources Information Center

    Dasgupta, Aritra

    2012-01-01

    The information visualization pipeline serves as a lossy communication channel for presentation of data on a screen-space of limited resolution. The lossy communication is not just a machine-only phenomenon due to information loss caused by translation of data, but also a reflection of the degree to which the human user can comprehend visual…

  3. Prediction of user preference over shared-control paradigms for a robotic wheelchair.

    PubMed

    Erdogan, Ahmetcan; Argall, Brenna D

    2017-07-01

    The design of intelligent powered wheelchairs has traditionally focused heavily on providing effective and efficient navigation assistance. Significantly less attention has been given to the end-user's preference between different assistance paradigms. It is possible to include these subjective evaluations in the design process, for example by soliciting feedback in post-experiment questionnaires. However, constantly querying the user for feedback during real-world operation is not practical. In this paper, we present a model that correlates objective performance metrics and subjective evaluations of autonomous wheelchair control paradigms. Using off-the-shelf machine learning techniques, we show that it is possible to build a model that can predict the most preferred shared-control method from task execution metrics such as effort, safety, performance and utilization. We further characterize the relative contributions of each of these metrics to the individual choice of most preferred assistance paradigm. Our evaluation includes Spinal Cord Injured (SCI) and uninjured subject groups. The results show that our proposed correlation model enables the continuous tracking of user preference and offers the possibility of autonomy that is customized to each user.
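    The core of such a preference model, an off-the-shelf classifier mapping task-execution metrics to a preferred/not-preferred label, can be sketched with synthetic data; the feature semantics and labeling rule below are invented stand-ins for the study's questionnaire data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical task-execution metrics per trial:
# columns = effort, safety, performance, utilization (all scaled 0..1).
X = rng.uniform(size=(200, 4))
# Synthetic ground truth: the user "prefers" the paradigm (label 1)
# when safety and performance are high -- a stand-in for questionnaires.
y = (0.6 * X[:, 1] + 0.4 * X[:, 2] > 0.5).astype(int)

model = LogisticRegression().fit(X[:150], y[:150])
accuracy = model.score(X[150:], y[150:])
```

    Once trained, such a model can track preference continuously from objective metrics alone, which is the practical point the paper makes about avoiding constant questionnaires; the learned coefficients also expose each metric's relative contribution.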

  4. Spatiotemporal Patterns in a Predator-Prey Model with Cross-Diffusion Effect

    NASA Astrophysics Data System (ADS)

    Sambath, M.; Balachandran, K.; Guin, L. N.

    The present research deals with the emergence of spatiotemporal patterns in a two-dimensional (2D) continuous predator-prey system with a cross-diffusion effect. First, we work out the critical lines of the Hopf and Turing bifurcations of the model system in a 2D spatial domain by means of bifurcation theory. More specifically, the exact Turing region is specified in a two-parameter space. In effect, by choosing the cross-diffusion coefficient as a key parameter, we demonstrate that the model system undergoes a sequence of spatiotemporal patterns in a homogeneous environment through diffusion-driven instability. Our numerical simulations confirm that cross-diffusion is able to create stationary patterns, which enriches the findings on pattern formation in ecosystems.
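    The diffusion-driven (Turing) instability test behind such bifurcation diagrams is: the reaction Jacobian J is stable on its own, yet J - k^2 D acquires an eigenvalue with positive real part for some wavenumber k. The sketch below uses an illustrative Jacobian and cross-diffusion matrix, not the paper's model, and shows a case where only the cross-diffusion entry triggers the instability:

```python
import numpy as np

def is_turing_unstable(J, D, k_values):
    """True if the homogeneous state is stable without diffusion
    (all Re eig(J) < 0) yet some spatial mode grows:
    max Re eig(J - k^2 D) > 0 for some wavenumber k."""
    if np.linalg.eigvals(J).real.max() >= 0:
        return False   # already unstable without diffusion (e.g. Hopf)
    return bool(max(np.linalg.eigvals(J - k * k * D).real.max()
                    for k in k_values) > 0)

# Illustrative stable activator-inhibitor Jacobian (eigenvalues -1, -2).
J = np.array([[1.0, -2.0],
              [3.0, -4.0]])
ks = np.linspace(0.05, 5.0, 200)

D_diag = np.eye(2)                     # equal diffusion, no cross terms
D_cross = np.array([[1.0, 0.0],
                    [20.0, 1.0]])      # strong cross-diffusion entry

no_cross = is_turing_unstable(J, D_diag, ks)     # False: no pattern
with_cross = is_turing_unstable(J, D_cross, ks)  # True: Turing pattern
```

    With equal self-diffusion alone the eigenvalues of J - k^2 I are just those of J shifted left, so no mode can grow; the off-diagonal cross-diffusion term changes the determinant of J - k^2 D and opens a band of unstable wavenumbers.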

  5. Modelling and formation of spatiotemporal patterns of fractional predation system in subdiffusion and superdiffusion scenarios

    NASA Astrophysics Data System (ADS)

    Owolabi, Kolade M.; Atangana, Abdon

    2018-02-01

    This paper primarily focuses on the question of how population diffusion can affect the formation of spatial patterns in a spatial fractional predator-prey system via Turing mechanisms. Our numerical findings assert that modeling with fractional reaction-diffusion equations should be considered an appropriate tool for studying the fundamental mechanisms of complex spatiotemporal dynamics. We observe that pure Hopf instability gives rise to the formation of spiral patterns in 2D, and that pure Turing instability destroys the spiral pattern and results in the formation of chaotic or spatiotemporal patterns. Existence and permanence of the species are also guaranteed, as shown in the 3D simulations at some instants of time for the subdiffusive and superdiffusive scenarios.

  6. Turing instability in reaction-diffusion models on complex networks

    NASA Astrophysics Data System (ADS)

    Ide, Yusuke; Izuhara, Hirofumi; Machida, Takuya

    2016-09-01

    In this paper, the Turing instability in reaction-diffusion models defined on complex networks is studied. Here, we focus on three types of models which generate complex networks, i.e. the Erdős-Rényi, the Watts-Strogatz, and the threshold network models. From analysis of the Laplacian matrices of graphs generated by these models, we numerically reveal that stable and unstable regions of a homogeneous steady state on the parameter space of two diffusion coefficients completely differ, depending on the network architecture. In addition, we theoretically discuss the stable and unstable regions in the cases of regular enhanced ring lattices which include regular circles, and networks generated by the threshold network model when the number of vertices is large enough.
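    On a network, the continuous wavenumber k^2 is replaced by the eigenvalues of the graph Laplacian: a mode grows when J - Lambda*D has an eigenvalue with positive real part for some Laplacian eigenvalue Lambda. A sketch on an Erdős-Rényi graph, with an illustrative Jacobian and diffusivities rather than the paper's model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Erdős-Rényi random graph G(n, p) as a symmetric adjacency matrix.
n, p = 60, 0.1
upper = np.triu(rng.random((n, n)) < p, 1)
A = (upper | upper.T).astype(float)
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian

# Stable local Jacobian and diagonal diffusivities (illustrative).
J = np.array([[1.0, -2.0],
              [3.0, -4.0]])             # eigenvalues -1, -2: stable
D = np.diag([0.05, 1.5])                # slow activator, fast inhibitor

# Replace k^2 by the Laplacian eigenvalues and test each mode.
lam = np.linalg.eigvalsh(L)             # ascending; lam[0] ~ 0
growth = np.array([np.linalg.eigvals(J - l * D).real.max() for l in lam])
unstable_modes = int((growth > 0).sum())
```

    The uniform mode (Lambda = 0) reduces to J itself and is stable; whether any higher mode falls inside the unstable band depends on the Laplacian spectrum, which is exactly how the network architecture reshapes the stable and unstable regions the paper maps out.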

  7. Robust artifactual independent component classification for BCI practitioners.

    PubMed

    Winkler, Irene; Brandl, Stephanie; Horn, Franziska; Waldburger, Eric; Allefeld, Carsten; Tangermann, Michael

    2014-06-01

    EEG artifacts of non-neural origin can be separated from neural signals by independent component analysis (ICA). It is unclear (1) how robustly recently proposed artifact classifiers transfer to novel users, novel paradigms, or changed electrode setups, and (2) how artifact cleaning by a machine learning classifier impacts the performance of brain-computer interfaces (BCIs). Addressing (1), the robustness of a recently proposed classifier, and of different transfer strategies across paradigms and electrode setups, is investigated on offline data from 35 users and 3 EEG paradigms, containing 6303 expert-labeled components from two ICA and preprocessing variants. Addressing (2), the effect of artifact removal on single-trial BCI classification is estimated on BCI trials from 101 users and 3 paradigms. We show that (1) the proposed artifact classifier generalizes to completely different EEG paradigms. To obtain similar results under massively reduced electrode setups, a proposed novel strategy improves artifact classification. Addressing (2), ICA artifact cleaning has little influence on average BCI performance when analyzed by state-of-the-art BCI methods. When slow motor-related features are exploited, performance varies strongly between individuals, as artifacts may obstruct relevant neural activity or be inadvertently used for BCI control. The robustness of the proposed strategies can be reproduced by EEG practitioners, as the method is made available as an EEGLAB plug-in.
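    The ICA separation step itself is standard. A minimal sketch with scikit-learn's FastICA on synthetic two-channel mixtures (a sinusoidal "neural" source plus a sparse spiky "artifact"), using excess kurtosis as a crude artifact score; this is not the authors' pipeline or classifier:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

neural = np.sin(7 * t)                       # oscillatory "neural" source
artifact = (rng.random(2000) < 0.02) * 5.0   # sparse blink-like spikes
S = np.c_[neural, artifact]

# Linear mixing into two channels, as in EEG volume conduction.
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = S @ A.T

ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(X)            # estimated sources

def excess_kurtosis(x):
    """Crude artifact score: spiky components have high kurtosis."""
    z = (x - x.mean()) / x.std()
    return (z ** 4).mean() - 3.0

scores = [excess_kurtosis(components[:, i]) for i in range(2)]
artifact_idx = int(np.argmax(scores))        # component flagged for removal
```

    The paper's classifier replaces this single hand-crafted score with a learned combination of many component features, which is what makes the transfer across paradigms and electrode setups non-trivial.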

  8. Emerging memories

    NASA Astrophysics Data System (ADS)

    Baldi, Livio; Bez, Roberto; Sandhu, Gurtej

    2014-12-01

    Memory is a key component of any data processing system. Following the classical Turing machine approach, memories hold both the data to be processed and the rules for processing them. In the history of microelectronics, the distinction has rather been between working memory, exemplified by DRAM, and storage memory, exemplified by NAND. These two types of memory devices now represent 90% of the memory market and 25% of the total semiconductor market, and have been the technology drivers of the last decades. Even if radically different in characteristics, they are however based on the same storage mechanism, charge storage, and this mechanism seems to be nearing its physical limits. The search for alternative memory approaches, based on more scalable mechanisms, has therefore gained new momentum. The status of incumbent memory technologies and their scaling limitations will be discussed. Emerging memory technologies will be analyzed, starting from those that are already present in niche applications and are getting new attention thanks to recent technology breakthroughs. Maturity level, physical limitations, and potential for scaling will be compared to existing memories. Finally, the possible future composition of memory systems will be discussed.

  9. Reversibility and measurement in quantum computing

    NASA Astrophysics Data System (ADS)

    Leão, J. P.

    1998-03-01

    The relation between computation and measurement at a fundamental physical level is yet to be understood. Rolf Landauer was perhaps the first to stress the strong analogy between these two concepts. His early queries have regained pertinence with the recent efforts to develop realizable models of quantum computers. In this context the irreversibility of quantum measurement appears in conflict with the requirement of reversibility of the overall computation associated with the unitary dynamics of quantum evolution. The latter, in turn, is responsible for the features of superposition and entanglement which make some quantum algorithms superior to classical ones for the same task in speed and resource demand. In this article we advocate an approach to this question which relies on a model of computation designed to enforce the analogy between the two concepts instead of demarcating them, as has been the case so far. The model is introduced as a symmetrization of the classical Turing machine model and is then carried over to quantum mechanics, first as an abstract local interaction scheme (symbolic measurement) and finally in a nonlocal, noninteractive implementation based on Aharonov-Bohm potentials and modular variables. It is suggested that this implementation leads to the most ubiquitous of quantum algorithms: the Discrete Fourier Transform.
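    The reversibility at stake is concrete for the Discrete Fourier Transform: the normalized DFT matrix is unitary, so the operation is exactly invertible (the inverse is the conjugate transpose), which is what allows it to sit inside a unitary quantum evolution. A quick numerical check:

```python
import numpy as np

def dft_matrix(n):
    """Unitary DFT matrix: F[j, k] = exp(-2*pi*i*j*k/n) / sqrt(n)."""
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return np.exp(-2j * np.pi * j * k / n) / np.sqrt(n)

F = dft_matrix(8)

# Unitarity: F @ F.conj().T == I, so the transform is reversible.
identity_err = np.abs(F @ F.conj().T - np.eye(8)).max()

# Reversibility in action: transform a random state and invert it.
psi = np.random.default_rng(0).normal(size=8) + 0j
psi /= np.linalg.norm(psi)
recovered = F.conj().T @ (F @ psi)
roundtrip_err = np.abs(recovered - psi).max()
```

    No information is lost in the transform itself; in the quantum setting irreversibility enters only when the transformed state is measured.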

  10. BaffleText: a Human Interactive Proof

    NASA Astrophysics Data System (ADS)

    Chew, Monica; Baird, Henry S.

    2003-01-01

    Internet services designed for human use are being abused by programs. We present a defense against such attacks in the form of a CAPTCHA (Completely Automatic Public Turing test to tell Computers and Humans Apart) that exploits the difference in ability between humans and machines in reading images of text. CAPTCHAs are a special case of 'human interactive proofs,' a broad class of security protocols that allow people to identify themselves over networks as members of given groups. We point out vulnerabilities of reading-based CAPTCHAs to dictionary and computer-vision attacks. We also draw on the literature on the psychophysics of human reading, which suggests fresh defenses available to CAPTCHAs. Motivated by these considerations, we propose BaffleText, a CAPTCHA which uses non-English pronounceable words to defend against dictionary attacks, and Gestalt-motivated image-masking degradations to defend against image restoration attacks. Experiments on human subjects confirm the human legibility and user acceptance of BaffleText images. We have found an image-complexity measure that correlates well with user acceptance and assists in engineering the generation of challenges to fit the ability gap. Recent computer-vision attacks, run independently by Mori and Malik, suggest that BaffleText is stronger than two existing CAPTCHAs.

  11. Principals' Perceptions on the Necessity to Prepare Students for Careers in Advanced Manufacturing

    ERIC Educational Resources Information Center

    Lee, Matthew

    2015-01-01

    The United States (U.S.) is undergoing a paradigm shift in manufacturing as it progresses from an era of low-skill employees who stood in one place controlling machines that drilled, stamped, cut, and milled products that passed through the effective and efficient assembly line, to one that is derived from scientific inquiry and technological…

  12. Voxel-based automated detection of focal cortical dysplasia lesions using diffusion tensor imaging and T2-weighted MRI data.

    PubMed

    Wang, Yanming; Zhou, Yawen; Wang, Huijuan; Cui, Jin; Nguchu, Benedictor Alexander; Zhang, Xufei; Qiu, Bensheng; Wang, Xiaoxiao; Zhu, Mingwang

    2018-05-21

    The aim of this study was to automatically detect focal cortical dysplasia (FCD) lesions in patients with extratemporal lobe epilepsy by relying on diffusion tensor imaging (DTI) and T2-weighted magnetic resonance imaging (MRI) data. We implemented an automated classifier using voxel-based multimodal features to identify gray and white matter abnormalities of FCD in patient cohorts. In addition to the commonly used T2-weighted image intensity feature, DTI-based features were also utilized. A Gaussian processes for machine learning (GPML) classifier was tested on 12 patients with FCD (8 with histologically confirmed FCD) scanned at 1.5 T and cross-validated using a leave-one-out strategy. Moreover, we compared the multimodal GPML paradigm's performance with that of single-modal GPML and a classical support vector machine (SVM). Our results demonstrated that GPML performance on DTI-based features (mean AUC = 0.63) matches GPML performance on the T2-weighted image intensity feature (mean AUC = 0.64). More promisingly, GPML yielded significantly improved performance (mean AUC = 0.76) when DTI-based features were applied in the multimodal paradigm. Based on the results, it can also be clearly stated that the proposed GPML strategy performed better and is robust to unbalanced datasets, in contrast to SVM, which performed poorly (AUC = 0.69). Therefore, the GPML paradigm using multimodal MRI data containing the DTI modality yields promising results for detection of FCD lesions and provides an effective direction for future research. Copyright © 2018 Elsevier Inc. All rights reserved.
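    The evaluation protocol (a Gaussian process classifier cross-validated leave-one-out on a small cohort) can be sketched with scikit-learn on synthetic features standing in for the voxel-based T2/DTI features; the data here are invented:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.model_selection import LeaveOneOut

# Synthetic stand-in for voxel-wise multimodal features
# (e.g. T2 intensity plus DTI measures); not the study's data.
X, y = make_classification(n_samples=24, n_features=4, n_informative=3,
                           n_redundant=0, random_state=0)

correct = 0
for train_idx, test_idx in LeaveOneOut().split(X):
    clf = GaussianProcessClassifier(random_state=0)  # default RBF kernel
    clf.fit(X[train_idx], y[train_idx])
    correct += int(clf.predict(X[test_idx])[0] == y[test_idx][0])
loo_accuracy = correct / len(X)
```

    Leave-one-out is the natural choice at cohort sizes like 12 patients, where holding out a larger validation split would be too wasteful; the probabilistic output of the GP classifier is also what makes AUC a natural metric here.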

  13. Molecular graph convolutions: moving beyond fingerprints

    NASA Astrophysics Data System (ADS)

    Kearnes, Steven; McCloskey, Kevin; Berndl, Marc; Pande, Vijay; Riley, Patrick

    2016-08-01

    Molecular "fingerprints" encoding structural information are the workhorse of cheminformatics and machine learning in drug discovery applications. However, fingerprint representations necessarily emphasize particular aspects of the molecular structure while ignoring others, rather than allowing the model to make data-driven decisions. We describe molecular graph convolutions, a machine learning architecture for learning from undirected graphs, specifically small molecules. Graph convolutions use a simple encoding of the molecular graph—atoms, bonds, distances, etc.—which allows the model to take greater advantage of information in the graph structure. Although graph convolutions do not outperform all fingerprint-based methods, they (along with other graph-based methods) represent a new paradigm in ligand-based virtual screening with exciting opportunities for future improvement.
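    One step of such a graph convolution (each atom aggregates its neighbours' features through the bond structure, then applies a learned weight matrix) can be sketched in plain NumPy; the toy molecule, features, and weights below are illustrative, not the paper's architecture:

```python
import numpy as np

def graph_conv(H, A, W):
    """One graph-convolution step: each atom averages its own and its
    neighbours' features (self-loops added), then applies a learned
    weight matrix W followed by a ReLU nonlinearity."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1)
    A_norm = A_hat / deg[:, None]           # row-normalised averaging
    return np.maximum(A_norm @ H @ W, 0.0)

# Toy molecule: propane's heavy-atom graph C-C-C (hydrogens implicit).
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.array([[6.0, 4.0],                   # per-atom features:
              [6.0, 4.0],                   # atomic number, valence
              [6.0, 4.0]])
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 5))                 # learned weights (random here)

H1 = graph_conv(H, A, W)                    # updated per-atom features
readout = H1.sum(axis=0)                    # molecule-level descriptor
```

    Stacking such layers and training W end-to-end is what lets the model make data-driven decisions about which structural aspects matter, instead of committing to a fixed fingerprint.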

  14. Molecular graph convolutions: moving beyond fingerprints.

    PubMed

    Kearnes, Steven; McCloskey, Kevin; Berndl, Marc; Pande, Vijay; Riley, Patrick

    2016-08-01

    Molecular "fingerprints" encoding structural information are the workhorse of cheminformatics and machine learning in drug discovery applications. However, fingerprint representations necessarily emphasize particular aspects of the molecular structure while ignoring others, rather than allowing the model to make data-driven decisions. We describe molecular graph convolutions, a machine learning architecture for learning from undirected graphs, specifically small molecules. Graph convolutions use a simple encoding of the molecular graph-atoms, bonds, distances, etc.-which allows the model to take greater advantage of information in the graph structure. Although graph convolutions do not outperform all fingerprint-based methods, they (along with other graph-based methods) represent a new paradigm in ligand-based virtual screening with exciting opportunities for future improvement.

  15. INTERDISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Superlattice Patterns in Coupled Turing Systems

    NASA Astrophysics Data System (ADS)

    Liu, Fu-Cheng; He, Ya-Feng; Pan, Yu-Yang

    2010-05-01

    In this paper, superlattice patterns are investigated using two linearly coupled Brusselator models. It is found that superlattice patterns can only be induced in the sub-system with the shorter wavelength. Three different coupling methods are used to investigate the interaction between the two Turing modes. The simulations prove that interaction between the activators of the two sub-systems leads to spontaneous formation of black-eye and/or white-eye patterns, while interaction between the inhibitors leads to spontaneous formation of a super-hexagonal pattern. It is also demonstrated that the two modes must have the same symmetry and a suitable wavelength ratio to form superlattice patterns.

  16. Model of sustainability of vernacular kampongs within Ngadha culture, Flores

    NASA Astrophysics Data System (ADS)

    Susetyarto, M. B.

    2018-01-01

    Among the indigenous Ngadha people of Flores (8°52’40.45”South, 120°59’8.18”East), the phenomenon of sustainability can be seen in very interesting architectural traces shaped by local factors. Sustainability is highly valued in their life and is clearly evident in daily activities, whether of farmers, weavers, or carpenters. The phenomenon is unique and has been successfully formalized as a model. The research was conducted with a qualitative method within an inductive paradigm. Data collection and comprehensive analysis were carried out in the field, with occasional discussions with Ngadha traditional experts, vernacular architecture researchers, sociologists, anthropologists, and others. The result is a model of the sustainability of vernacular kampongs within Ngadha culture, namely Tuku nunga lo’a ghera adha Ngadha. In this concept, sustainability is a cultural process that continuously synergizes five supporting factors until an optimum momentum of sustainability occurs under those synergistic conditions. The five factors are the natural environment (one nua), the indigenous community (mesu mora), vernacular architecture (sa’o bhaga ngadhu ture), the economy (ngo ngani), and Ngadha culture (adha Ngadha). The significance and impact of the research lie in contributing to knowledge of sustainability, especially a sustainability model for vernacular kampongs.

  17. Autonomous biomorphic robots as platforms for sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tilden, M.; Hasslacher, B.; Mainieri, R.

    1996-10-01

    The idea of building autonomous robots that can carry out complex and nonrepetitive tasks is an old one, so far unrealized in any meaningful hardware. Tilden has shown recently that there are simple, processor-free solutions to building autonomous mobile machines that continuously adapt to unknown and hostile environments, are designed primarily to survive, and are extremely resistant to damage. These devices use smart mechanics and simple (low component count) electronic neuron control structures having the functionality of biological organisms from simple invertebrates to sophisticated members of the insect and crab family. These devices are paradigms for the development of autonomous machines that can carry out directed goals. The machine then becomes a robust survivalist platform that can carry sensors or instruments. These autonomous roving machines, now in an early stage of development (several proof-of-concept prototype walkers have been built), can be developed so that they are inexpensive, robust, and versatile carriers for a variety of instrument packages. Applications are immediate and many, in areas as diverse as prosthetics, medicine, space, construction, nanoscience, defense, remote sensing, environmental cleanup, and biotechnology.

  18. Interface Metaphors for Interactive Machine Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jasper, Robert J.; Blaha, Leslie M.

    To promote more interactive and dynamic machine learning, we revisit the notion of user-interface metaphors. User-interface metaphors provide intuitive constructs for supporting user needs through interface design elements. A user-interface metaphor provides a visual or action pattern that leverages a user’s knowledge of another domain. Metaphors suggest both the visual representations that should be used in a display as well as the interactions that should be afforded to the user. We argue that user-interface metaphors can also offer a method of extracting interaction-based user feedback for use in machine learning. Metaphors offer indirect, context-based information that can be used in addition to explicit user inputs, such as user-provided labels. Implicit information from user interactions with metaphors can augment explicit user input for active learning paradigms. Or it might be leveraged in systems where explicit user inputs are more challenging to obtain. Each interaction with the metaphor provides an opportunity to gather data and learn. We argue this approach is especially important in streaming applications, where we desire machine learning systems that can adapt to dynamic, changing data.

  19. Prediction Based Proactive Thermal Virtual Machine Scheduling in Green Clouds

    PubMed Central

    Kinger, Supriya; Kumar, Rajesh; Sharma, Anju

    2014-01-01

    Cloud computing has rapidly emerged as a widely accepted computing paradigm, but research on Cloud computing is still at an early stage. Cloud computing provides many advanced features, but it still has some shortcomings, such as relatively high operating cost and environmental hazards like an increasing carbon footprint. These hazards can be reduced to some extent by efficient scheduling of Cloud resources. The working temperature at which a machine is currently running can be taken as a criterion for Virtual Machine (VM) scheduling. This paper proposes a new proactive technique that considers the current and maximum threshold temperature of Server Machines (SMs) before making scheduling decisions, with the help of a temperature predictor, so that the maximum temperature is never reached. Different workload scenarios have been taken into consideration. The results obtained show that the proposed system is better than existing VM scheduling systems, which do not consider the current temperature of nodes before making scheduling decisions. Thus, a reduction in the need for cooling systems in a Cloud environment has been obtained and validated. PMID:24737962
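    The scheduling idea can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: the `ServerMachine` class, its fields, and the crude linear temperature predictor are all invented stand-ins for the paper's temperature-prediction model.

```python
# Hypothetical sketch of proactive, temperature-aware VM placement.
# All names and the linear predictor are illustrative, not from the paper.

class ServerMachine:
    def __init__(self, name, current_temp, max_threshold, temp_per_vm=2.0):
        self.name = name
        self.current_temp = current_temp      # degrees C
        self.max_threshold = max_threshold    # must never be reached
        self.temp_per_vm = temp_per_vm        # crude stand-in predictor: heat added per VM

    def predicted_temp(self, extra_vms=1):
        # Stand-in for the paper's temperature predictor.
        return self.current_temp + extra_vms * self.temp_per_vm

def schedule_vm(machines):
    """Place the VM on the coolest machine whose *predicted* temperature
    stays strictly below its threshold; return None if none qualify."""
    candidates = [m for m in machines if m.predicted_temp() < m.max_threshold]
    if not candidates:
        return None
    return min(candidates, key=lambda m: m.predicted_temp())

machines = [
    ServerMachine("sm1", current_temp=68.0, max_threshold=70.0),
    ServerMachine("sm2", current_temp=55.0, max_threshold=70.0),
    ServerMachine("sm3", current_temp=69.5, max_threshold=70.0),
]
chosen = schedule_vm(machines)   # sm1 and sm3 would cross their thresholds
```

    The key property mirrored here is that placement decisions are made proactively on predicted rather than current temperature.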

  20. A self-paced brain-computer interface for controlling a robot simulator: an online event labelling paradigm and an extended Kalman filter based algorithm for online training.

    PubMed

    Tsui, Chun Sing Louis; Gan, John Q; Roberts, Stephen J

    2009-03-01

    Due to the non-stationarity of EEG signals, online training and adaptation are essential to EEG-based brain-computer interface (BCI) systems. Self-paced BCIs offer more natural human-machine interaction than synchronous BCIs, but it is a great challenge to train and adapt a self-paced BCI online because the user's control intention and timing are usually unknown. This paper proposes a novel motor imagery based self-paced BCI paradigm for controlling a simulated robot in a specifically designed environment which is able to provide the user's control intention and timing during online experiments, so that online training and adaptation of the motor imagery based self-paced BCI can be effectively investigated. We demonstrate the usefulness of the proposed paradigm with an extended Kalman filter based method for adapting the BCI classifier parameters, with experimental results of online self-paced BCI training with four subjects.

  1. Research and application of a novel hybrid decomposition-ensemble learning paradigm with error correction for daily PM10 forecasting

    NASA Astrophysics Data System (ADS)

    Luo, Hongyuan; Wang, Deyun; Yue, Chenqiang; Liu, Yanling; Guo, Haixiang

    2018-03-01

    In this paper, a hybrid decomposition-ensemble learning paradigm incorporating error correction is proposed for improving the forecast accuracy of daily PM10 concentration. The proposed learning paradigm consists of two sub-models: (1) a PM10 concentration forecasting model; (2) an error correction model. In the proposed model, fast ensemble empirical mode decomposition (FEEMD) and variational mode decomposition (VMD) are applied to decompose the original PM10 concentration series and the error sequence, respectively. An extreme learning machine (ELM) model optimized by the cuckoo search (CS) algorithm is utilized to forecast the components generated by FEEMD and VMD. In order to prove the effectiveness and accuracy of the proposed model, two real-world PM10 concentration series, collected from Beijing and Harbin in China, are adopted to conduct an empirical study. The results show that the proposed model performs remarkably better than all other considered models without error correction, which indicates the superior performance of the proposed model.
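    The forecasting component, an extreme learning machine, is simple enough to sketch: a random fixed hidden layer with the output weights solved by least squares. The FEEMD/VMD decomposition and cuckoo-search tuning of the paper are omitted, and the synthetic series, lag count, and hidden-layer size below are illustrative assumptions.

```python
import numpy as np

# Minimal extreme learning machine (ELM) regressor: random, fixed hidden
# weights; only the output layer is fitted, by least squares.
class ELM:
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.RandomState(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# One-step-ahead forecasting of a synthetic smooth series from lagged inputs.
t = np.arange(400)
series = np.sin(0.1 * t) + 0.5 * np.sin(0.037 * t)
lags = 8
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]
model = ELM().fit(X[:300], y[:300])
rmse = np.sqrt(np.mean((model.predict(X[300:]) - y[300:]) ** 2))
```

    Because only the output layer is solved, training is a single least-squares problem, which is what makes ELMs attractive for repeated fitting inside decomposition-ensemble pipelines.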

  2. Proxy-equation paradigm: A strategy for massively parallel asynchronous computations

    NASA Astrophysics Data System (ADS)

    Mittal, Ankita; Girimaji, Sharath

    2017-09-01

    Massively parallel simulations of transport equation systems call for a paradigm change in algorithm development to achieve efficient scalability. Traditional approaches require time synchronization of processing elements (PEs), which severely restricts scalability. Relaxing the synchronization requirement introduces error and slows down convergence. In this paper, we propose and develop a novel "proxy equation" concept for a general transport equation that (i) tolerates asynchrony with minimal added error, (ii) preserves convergence order, and thus (iii) is expected to scale efficiently on massively parallel machines. The central idea is to modify a priori the transport equation at the PE boundaries to offset asynchrony errors. Proof-of-concept computations are performed using a one-dimensional advection (convection) diffusion equation. The results demonstrate the promise and advantages of the present strategy.
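    For reference, a standard synchronous finite-difference step for the 1-D advection-diffusion equation used in the proof-of-concept might look as follows. The proxy modification at PE boundaries is not reproduced here; the periodic domain, scheme, and coefficients are assumptions for illustration.

```python
import numpy as np

# One explicit finite-difference step for u_t + c*u_x = nu*u_xx on a
# periodic domain: central differences in space, forward Euler in time.
def step(u, c=1.0, nu=0.1, dx=0.1, dt=0.001):
    up = np.roll(u, -1)   # u[i+1] (periodic wrap)
    um = np.roll(u, 1)    # u[i-1]
    advection = -c * (up - um) / (2 * dx)          # central difference
    diffusion = nu * (up - 2 * u + um) / dx**2
    return u + dt * (advection + diffusion)

x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
u = np.sin(x)
for _ in range(100):
    u = step(u, dx=x[1] - x[0])
```

    In the synchronous setting, the `np.roll` neighbour values are always current; the asynchrony the paper addresses arises when those boundary values are stale, and the proxy equation modifies the discrete operator near PE boundaries to offset the resulting error.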

  3. Biomimetics--a review.

    PubMed

    Vincent, J F V

    2009-11-01

    Biology can inform technology at all levels (materials, structures, mechanisms, machines, and control) but there is still a gap between biology and technology. This review itemizes examples of biomimetic products and concludes that the Russian system for inventive problem solving (teoriya resheniya izobreatatelskikh zadatch (TRIZ)) is the best system to underpin the technology transfer. Biomimetics also challenges the current paradigm of technology and suggests more sustainable ways to manipulate the world.

  4. Functional networks inference from rule-based machine learning models.

    PubMed

    Lazzarini, Nicola; Widera, Paweł; Williamson, Stuart; Heer, Rakesh; Krasnogor, Natalio; Bacardit, Jaume

    2016-01-01

    Functional networks play an important role in the analysis of biological processes and systems. The inference of these networks from high-throughput (-omics) data is an area of intense research. So far, the similarity-based inference paradigm (e.g. gene co-expression) has been the most popular approach. It assumes a functional relationship between genes which are expressed at similar levels across different samples. An alternative to this paradigm is the inference of relationships from the structure of machine learning models. These models are able to capture complex relationships between variables that are often different from, and complementary to, those found by similarity-based methods. We propose a protocol to infer functional networks from machine learning models, called FuNeL. It assumes that genes used together within a rule-based machine learning model to classify the samples might also be functionally related at a biological level. The protocol is first tested on synthetic datasets and then evaluated on a test suite of 8 real-world datasets related to human cancer. The networks inferred from the real-world data are compared against gene co-expression networks of equal size, generated with 3 different methods. The comparison is performed from two different points of view. We analyse the enriched biological terms in the set of network nodes and the relationships between known disease-associated genes in the context of the network topology. The comparison confirms both the biological relevance and the complementary character of the knowledge captured by the FuNeL networks in relation to similarity-based methods, and demonstrates its potential to identify known disease associations as core elements of the network. Finally, using a prostate cancer dataset as a case study, we confirm that the biological knowledge captured by our method is relevant to the disease and consistent with the specialised literature and with an independent dataset not used in the inference process. 
The implementation of our network inference protocol is available at: http://ico2s.org/software/funel.html.

  5. Delay-induced Turing-like waves for one-species reaction-diffusion model on a network

    NASA Astrophysics Data System (ADS)

    Petit, Julien; Carletti, Timoteo; Asllani, Malbor; Fanelli, Duccio

    2015-09-01

    A one-species time-delay reaction-diffusion system defined on a complex network is studied. Traveling waves are predicted to occur following a symmetry-breaking instability of a homogeneous stationary stable solution, subject to an external nonhomogeneous perturbation. These are generalized Turing-like waves that materialize in a single-species population dynamics model, as the unexpected byproduct of the imposed delay in the diffusion part. Sufficient conditions for the onset of the instability are mathematically provided by performing a linear stability analysis adapted to time-delayed differential equations. The method developed here exploits the properties of the Lambert W-function. The predictions of the theory are confirmed by direct numerical simulation carried out for a modified version of the classical Fisher model, defined on a Watts-Strogatz network and with the inclusion of the delay.
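    The Lambert W machinery can be illustrated on the simplest delay equation. For the scalar delay ODE x'(t) = a·x(t − τ), the characteristic equation λ = a·e^(−λτ) has branch solutions λ_k = W_k(aτ)/τ. This scalar case is an illustration only; the paper applies the same tool to the full linearized reaction-diffusion operator on a network.

```python
import numpy as np
from scipy.special import lambertw

# Characteristic roots of x'(t) = a*x(t - tau): lambda = a*exp(-lambda*tau),
# solved branch-by-branch via the Lambert W function.
def delay_roots(a, tau, branches=(0, -1)):
    return [complex(lambertw(a * tau, k)) / tau for k in branches]

a, tau = -1.0, 0.3          # a*tau = -0.3 > -1/e, so both branches are real
roots = delay_roots(a, tau)

# Each root must satisfy the characteristic equation.
residuals = [abs(lam - a * np.exp(-lam * tau)) for lam in roots]
```

    Stability then reduces to checking the sign of the real parts of the rightmost roots, which is exactly the kind of condition the paper derives for the onset of the Turing-like waves.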

  6. An experimental design method leading to chemical Turing patterns.

    PubMed

    Horváth, Judit; Szalai, István; De Kepper, Patrick

    2009-05-08

    Chemical reaction-diffusion patterns often serve as prototypes for pattern formation in living systems, but only two isothermal single-phase reaction systems have produced sustained stationary reaction-diffusion patterns so far. We designed an experimental method to search for additional systems on the basis of three steps: (i) generate spatial bistability by operating autoactivated reactions in open spatial reactors; (ii) use an independent negative-feedback species to produce spatiotemporal oscillations; and (iii) induce a space-scale separation of the activatory and inhibitory processes with a low-mobility complexing agent. We successfully applied this method to a hydrogen-ion autoactivated reaction, the thiourea-iodate-sulfite (TuIS) reaction, and noticeably produced stationary hexagonal arrays of spots and parallel stripes of pH patterns attributed to a Turing bifurcation. This method could be extended to biochemical reactions.

  7. Periodic stripe formation by a Turing-mechanism operating at growth zones in the mammalian palate

    PubMed Central

    Economou, Andrew D.; Ohazama, Atsushi; Porntaveetus, Thantrira; Sharpe, Paul T.; Kondo, Shigeru; Basson, M. Albert; Gritli-Linde, Amel; Cobourne, Martyn T.; Green, Jeremy B.A.

    2012-01-01

    We present direct evidence of an activator-inhibitor system in the generation of the regularly spaced transverse ridges of the palate. We show that new ridges, or rugae, marked by stripes of Sonic hedgehog (Shh) expression, appear at two growth zones where the space between previously laid-down rugae increases. However, inter-rugal growth is not absolutely required: new stripes still appear when growth is inhibited. Furthermore, when a ruga is excised new Shh expression appears, not at the cut edge but as bifurcating stripes branching from the neighbouring Shh stripe, diagnostic of a Turing-type reaction-diffusion mechanism. Genetic and inhibitor experiments identify Fibroblast Growth Factor (FGF) and Shh as an activator-inhibitor pair in this system. These findings demonstrate a reaction-diffusion mechanism likely to be widely relevant in vertebrate development. PMID:22344222

  8. Spatio-temporal dynamics induced by competing instabilities in two asymmetrically coupled nonlinear evolution equations.

    PubMed

    Schüler, D; Alonso, S; Torcini, A; Bär, M

    2014-12-01

    Pattern formation often occurs in spatially extended physical, biological, and chemical systems due to an instability of the homogeneous steady state. The type of the instability usually prescribes the resulting spatio-temporal patterns and their characteristic length scales. However, patterns resulting from the simultaneous occurrence of instabilities cannot be expected to be simple superpositions of the patterns associated with the individual instabilities. To address this issue, we design two simple models composed of two asymmetrically coupled equations of non-conserved (Swift-Hohenberg equations) or conserved (Cahn-Hilliard equations) order parameters with different characteristic wavelengths. The patterns arising in these systems range from coexisting static patterns of different wavelengths to traveling waves. A linear stability analysis allows us to derive a two-parameter phase diagram for the studied models, in particular revealing, for the Swift-Hohenberg equations, a codimension-two bifurcation point of Turing and wave instability and a region of coexistence of stationary and traveling patterns. The nonlinear dynamics of the coupled evolution equations is investigated by performing accurate numerical simulations. These reveal more complex patterns, ranging from traveling waves with embedded Turing pattern domains to spatio-temporal chaos, and a wide hysteretic region where waves or Turing patterns coexist. For the coupled Cahn-Hilliard equations, the presence of a weak coupling is sufficient to arrest the coarsening process and to lead to the emergence of purely periodic patterns. The final states are characterized by domains with a characteristic length, which diverges logarithmically with the coupling amplitude.

  9. Hypothermic machine perfusion in kidney transplantation.

    PubMed

    De Deken, Julie; Kocabayoglu, Peri; Moers, Cyril

    2016-06-01

    This article summarizes novel developments in hypothermic machine perfusion (HMP) as an organ preservation modality for kidneys recovered from deceased donors. HMP has undergone a renaissance in recent years. This renewed interest has arisen in parallel with a shift in paradigms: not only is optimal preservation of an often marginal-quality graft required, but improved graft function, and tools to predict it, are also expected from HMP. The focus of attention in this field is currently drawn to the protection of endothelial integrity by means of additives to the perfusion solution, improvement of the HMP solution, choice of temperature, duration of perfusion, and machine settings. HMP may offer the opportunity to assess aspects of graft viability before transplantation, which can potentially aid preselection of grafts based on characteristics such as perfusate biomarkers, as well as measurement of machine perfusion dynamics parameters. HMP has proven to be beneficial as a kidney preservation method for all types of renal grafts, most notably those retrieved from extended criteria donors. Large numbers of variables during HMP, such as duration, machine settings and additives to the perfusion solution are currently being investigated to improve renal function and graft survival. In addition, the search for biomarkers has become a focus of attention to predict graft function posttransplant.

  10. Resident Space Object Characterization and Behavior Understanding via Machine Learning and Ontology-based Bayesian Networks

    NASA Astrophysics Data System (ADS)

    Furfaro, R.; Linares, R.; Gaylor, D.; Jah, M.; Walls, R.

    2016-09-01

    In this paper, we present an end-to-end approach that employs machine learning techniques and Ontology-based Bayesian Networks (BN) to characterize the behavior of resident space objects. State-of-the-art machine learning architectures (e.g. Extreme Learning Machines, Convolutional Deep Networks) are trained on physical models to learn the Resident Space Object (RSO) features in the vectorized energy and momentum states and parameters. The mapping from measurements to vectorized energy and momentum states and parameters enables behavior characterization via clustering in the features space and subsequent RSO classification. Additionally, Space Object Behavioral Ontologies (SOBO) are employed to define and capture the domain knowledge-base (KB) and BNs are constructed from the SOBO in a semi-automatic fashion to execute probabilistic reasoning over conclusions drawn from trained classifiers and/or directly from processed data. Such an approach enables integrating machine learning classifiers and probabilistic reasoning to support higher-level decision making for space domain awareness applications. The innovation here is to use these methods (which have enjoyed great success in other domains) in synergy, so that they enable a "from data to discovery" paradigm by facilitating the linkage and fusion of large and disparate sources of information via a Big Data Science and Analytics framework.

  11. Harnessing Disordered-Ensemble Quantum Dynamics for Machine Learning

    NASA Astrophysics Data System (ADS)

    Fujii, Keisuke; Nakajima, Kohei

    2017-08-01

    The quantum computer has amazing potential for fast information processing. However, the realization of a digital quantum computer is still a challenging problem requiring highly accurate controls and key application strategies. Here we propose a platform, quantum reservoir computing, to solve these issues by exploiting the natural quantum dynamics of ensemble systems, which are ubiquitous in laboratories nowadays, for machine learning. This framework enables ensemble quantum systems to universally emulate nonlinear dynamical systems, including classical chaos. A number of numerical experiments show that quantum systems consisting of 5-7 qubits possess computational capabilities comparable to conventional recurrent neural networks of 100-500 nodes. This discovery opens up a paradigm for information processing with artificial intelligence powered by quantum physics.

  12. Molecular graph convolutions: moving beyond fingerprints

    PubMed Central

    Kearnes, Steven; McCloskey, Kevin; Berndl, Marc; Pande, Vijay; Riley, Patrick

    2016-01-01

    Molecular “fingerprints” encoding structural information are the workhorse of cheminformatics and machine learning in drug discovery applications. However, fingerprint representations necessarily emphasize particular aspects of the molecular structure while ignoring others, rather than allowing the model to make data-driven decisions. We describe molecular graph convolutions, a machine learning architecture for learning from undirected graphs, specifically small molecules. Graph convolutions use a simple encoding of the molecular graph—atoms, bonds, distances, etc.—which allows the model to take greater advantage of information in the graph structure. Although graph convolutions do not outperform all fingerprint-based methods, they (along with other graph-based methods) represent a new paradigm in ligand-based virtual screening with exciting opportunities for future improvement. PMID:27558503

  13. USBeSafe: Applying One Class SVM for Effective USB Event Anomaly Detection

    DTIC Science & Technology

    2016-04-25

    … One study performed in 2011 found that, in only the two-year span prior, 50% of organizations, both public and private, had sensitive … host machine. While existing solutions to the rogue-TD attack paradigm require much in the way of access control maintenance and certificate management …

  14. Sparsity and Nullity: Paradigm for Analysis Dictionary Learning

    DTIC Science & Technology

    2016-08-09

    Sparse models in dictionary learning have been successfully applied in a wide variety of machine learning and … we investigate the relation between the SNS problem and the analysis dictionary learning problem, and show that the SNS problem plays a central role … and may be utilized to solve dictionary learning problems.

  15. Spectral multi-energy CT texture analysis with machine learning for tissue classification: an investigation using classification of benign parotid tumours as a testing paradigm.

    PubMed

    Al Ajmi, Eiman; Forghani, Behzad; Reinhold, Caroline; Bayat, Maryam; Forghani, Reza

    2018-06-01

    There is a rich amount of quantitative information in spectral datasets generated from dual-energy CT (DECT). In this study, we compare the performance of texture analysis performed on multi-energy datasets to that of virtual monochromatic images (VMIs) at 65 keV only, using classification of the two most common benign parotid neoplasms as a testing paradigm. Forty-two patients with pathologically proven Warthin tumour (n = 25) or pleomorphic adenoma (n = 17) were evaluated. Texture analysis was performed on VMIs ranging from 40 to 140 keV in 5-keV increments (multi-energy analysis) or 65-keV VMIs only, which is typically considered equivalent to single-energy CT. Random forest (RF) models were constructed for outcome prediction using separate randomly selected training and testing sets or the entire patient set. Using multi-energy texture analysis, tumour classification in the independent testing set had accuracy, sensitivity, specificity, positive predictive value, and negative predictive value of 92%, 86%, 100%, 100%, and 83%, compared to 75%, 57%, 100%, 100%, and 63%, respectively, for single-energy analysis. Multi-energy texture analysis demonstrates superior performance compared to single-energy texture analysis of VMIs at 65 keV for classification of benign parotid tumours. • We present and validate a paradigm for texture analysis of DECT scans. • Multi-energy dataset texture analysis is superior to single-energy dataset texture analysis. • DECT texture analysis has high accuracy for diagnosis of benign parotid tumours. • DECT texture analysis with machine learning can enhance non-invasive diagnostic tumour evaluation.
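    The classifier side of such a pipeline is a standard random forest on per-patient feature vectors. The sketch below uses synthetic Gaussian "texture features" as a stand-in for the study's multi-energy DECT features; the class sizes, feature count, and separation are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic two-class "texture feature" data standing in for DECT features.
rng = np.random.RandomState(0)
n_per_class, n_features = 60, 20            # hypothetical sizes
class_a = rng.normal(0.0, 1.0, (n_per_class, n_features))   # e.g. Warthin
class_b = rng.normal(1.0, 1.0, (n_per_class, n_features))   # e.g. pleomorphic
X = np.vstack([class_a, class_b])
y = np.array([0] * n_per_class + [1] * n_per_class)

# Separate training and testing sets, as in the study's independent-test setup.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

    With multi-energy analysis, the feature vector simply grows to include texture measures from every VMI energy level rather than 65 keV alone; the classifier itself is unchanged.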

  16. Creating Turbulent Flow Realizations with Generative Adversarial Networks

    NASA Astrophysics Data System (ADS)

    King, Ryan; Graf, Peter; Chertkov, Michael

    2017-11-01

    Generating valid inflow conditions is a crucial, yet computationally expensive, step in unsteady turbulent flow simulations. We demonstrate a new technique for rapid generation of turbulent inflow realizations that leverages recent advances in machine learning for image generation using a deep convolutional generative adversarial network (DCGAN). The DCGAN is an unsupervised machine learning technique consisting of two competing neural networks that are trained against each other using backpropagation. One network, the generator, tries to produce samples from the true distribution of states, while the discriminator tries to distinguish between true and synthetic samples. We present results from a fully-trained DCGAN that is able to rapidly draw random samples from the full distribution of possible inflow states without needing to solve the Navier-Stokes equations, eliminating the costly process of spinning up inflow turbulence. This suggests a new paradigm in physics informed machine learning where the turbulence physics can be encoded in either the discriminator or generator. Finally, we also propose additional applications such as feature identification and subgrid scale modeling.

  17. On the effect of subliminal priming on subjective perception of images: a machine learning approach.

    PubMed

    Kumar, Parmod; Mahmood, Faisal; Mohan, Dhanya Menoth; Wong, Ken; Agrawal, Abhishek; Elgendi, Mohamed; Shukla, Rohit; Dauwels, Justin; Chan, Alice H D

    2014-01-01

    The research presented in this article investigates the influence of subliminal prime words on people's judgment about images, through electroencephalograms (EEGs). In this cross-domain priming paradigm, the participants are asked to rate how much they like the stimulus images, on a 7-point Likert scale, after being subliminally exposed to masked lexical prime words, with EEG recorded simultaneously. Statistical analysis tools are used to analyze the effect of priming on behavior, and machine learning techniques to infer the primes from EEGs. The experiment reveals strong effects of subliminal priming on the participants' explicit rating of images. The subjective judgment affected by the priming produces visible changes in event-related potentials (ERPs); results show larger ERP amplitude for the negative primes compared with positive and neutral primes. In addition, Support Vector Machine (SVM) based classifiers are proposed to infer the prime types from the average ERPs, yielding a classification rate of 70%.
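    A minimal sketch of the SVM step: a three-class classifier over "ERP amplitude" features. The class means, feature dimension, and sample counts below are invented, chosen only so that negative primes have larger amplitude, mirroring the reported effect direction.

```python
import numpy as np
from sklearn.svm import SVC

# Toy stand-in for the paper's SVM on averaged ERPs: three prime classes
# (negative / neutral / positive) with synthetic amplitude features.
rng = np.random.RandomState(1)
means = {-1: 2.0, 0: 0.0, 1: -1.0}   # larger amplitude for negative primes
X_parts, y = [], []
for label, mu in means.items():
    X_parts.append(rng.normal(mu, 0.6, (40, 8)))   # 40 trials, 8 features
    y += [label] * 40
X = np.vstack(X_parts)
y = np.array(y)

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
train_accuracy = clf.score(X, y)
```

    In the actual study the features are averaged ERP waveforms and performance is assessed on held-out data; the synthetic separation here is far cleaner than real EEG.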

  18. Machine Learning for Knowledge Extraction from PHR Big Data.

    PubMed

    Poulymenopoulou, Michaela; Malamateniou, Flora; Vassilacopoulos, George

    2014-01-01

    Cloud computing, Internet of things (IOT) and NoSQL database technologies can support a new generation of cloud-based PHR services that contain heterogeneous (unstructured, semi-structured and structured) patient data (health, social and lifestyle) from various sources, including automatically transmitted data from Internet-connected devices in the patient's living space (e.g. medical devices connected to patients at home care). The patient data stored in such PHR systems constitute big data whose analysis with the use of appropriate machine learning algorithms is expected to improve diagnosis and treatment accuracy, to cut healthcare costs and, hence, to improve the overall quality and efficiency of healthcare provided. This paper describes a health data analytics engine which uses machine learning algorithms for analyzing cloud-based PHR big health data for knowledge extraction, to support better healthcare delivery as regards disease diagnosis and prognosis. This engine comprises the data preparation, model generation and data analysis modules, and runs on the cloud, taking advantage of the map/reduce paradigm provided by Apache Hadoop.
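    As a minimal illustration of the map/reduce pattern such an engine relies on (plain Python rather than actual Hadoop; the record fields and diagnosis codes are invented): a mapper emits key-count pairs per record, and a reducer aggregates them.

```python
from collections import defaultdict

# Toy map/reduce: counting diagnosis-code occurrences across PHR records.
records = [
    {"patient": "p1", "codes": ["E11", "I10"]},
    {"patient": "p2", "codes": ["I10"]},
    {"patient": "p3", "codes": ["E11", "E11", "J45"]},
]

def mapper(record):
    # Emit one (key, 1) pair per code occurrence.
    return [(code, 1) for code in record["codes"]]

def reducer(pairs):
    # Sum counts per key.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

mapped = [pair for record in records for pair in mapper(record)]
counts = reducer(mapped)
```

    Hadoop distributes the mapper calls across nodes and shuffles pairs by key before reduction; the dataflow is the same.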

  19. Changing computing paradigms towards power efficiency

    PubMed Central

    Klavík, Pavel; Malossi, A. Cristiano I.; Bekas, Costas; Curioni, Alessandro

    2014-01-01

    Power awareness is fast becoming immensely important in computing, ranging from the traditional high-performance computing applications to the new generation of data centric workloads. In this work, we describe our efforts towards a power-efficient computing paradigm that combines low- and high-precision arithmetic. We showcase our ideas for the widely used kernel of solving systems of linear equations that finds numerous applications in scientific and engineering disciplines as well as in large-scale data analytics, statistics and machine learning. Towards this goal, we developed tools for the seamless power profiling of applications at a fine-grain level. In addition, we verify here previous work on post-FLOPS/W metrics and show that these can shed much more light in the power/energy profile of important applications. PMID:24842033
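    The low-/high-precision combination for linear solves can be sketched as classical iterative refinement: factor and solve cheaply in float32, then correct with float64 residuals. This is a generic sketch of the idea, not the paper's kernels or tooling; the test matrix is an invented well-conditioned example.

```python
import numpy as np

# Mixed-precision iterative refinement for Ax = b: low-precision (float32)
# solves do the heavy lifting, high-precision (float64) residuals restore
# full accuracy.
def refine(A, b, iterations=5):
    A32 = A.astype(np.float32)
    x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
    for _ in range(iterations):
        r = b - A @ x                                    # float64 residual
        dx = np.linalg.solve(A32, r.astype(np.float32))  # float32 correction
        x = x + dx.astype(np.float64)
    return x

rng = np.random.RandomState(0)
n = 50
A = rng.rand(n, n) + n * np.eye(n)   # diagonally dominant, well conditioned
x_true = rng.rand(n)
b = A @ x_true
x = refine(A, b)
error = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

    The power argument is that most arithmetic happens in the cheap low-precision solves, while the few high-precision residual computations keep the answer accurate.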

  20. Wall-based measurement features provides an improved IVUS coronary artery risk assessment when fused with plaque texture-based features during machine learning paradigm.

    PubMed

    Banchhor, Sumit K; Londhe, Narendra D; Araki, Tadashi; Saba, Luca; Radeva, Petia; Laird, John R; Suri, Jasjit S

    2017-12-01

    Planning of percutaneous interventional procedures involves a pre-screening and risk stratification of the coronary artery disease. Current screening tools use stand-alone plaque texture-based features and therefore lack the ability to stratify the risk. This IRB approved study presents a novel strategy for coronary artery disease risk stratification using an amalgamation of IVUS plaque texture-based and wall-based measurement features. Due to common genetic plaque makeup, carotid plaque burden was chosen as a gold standard for risk labels during the training phase of the machine learning (ML) paradigm. A cross-validation protocol was adopted to compute the accuracy of the ML framework. A set of 59 plaque texture-based features was padded with six wall-based measurement features to show the improvement in stratification accuracy. The ML system was executed using a principal component analysis-based framework for dimensionality reduction and uses a support vector machine classifier for the training and testing phases. The ML system produced a stratification accuracy of 91.28%, demonstrating an improvement of 5.69% when wall-based measurement features were combined with plaque texture-based features. The fused system showed an improvement in mean sensitivity, specificity, positive predictive value, and area under the curve by: 6.39%, 4.59%, 3.31% and 5.48%, respectively, when compared to the stand-alone system. While meeting the stability criterion of 5%, the ML system also showed a high average feature retaining power and mean reliability of 89.32% and 98.24%, respectively. The ML system showed an improvement in risk stratification accuracy when the wall-based measurement features were fused with the plaque texture-based features. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Cursor control by Kalman filter with a non-invasive body–machine interface

    PubMed Central

    Seáñez-González, Ismael; Mussa-Ivaldi, Ferdinando A

    2015-01-01

    Objective We describe a novel human–machine interface for the control of a two-dimensional (2D) computer cursor using four inertial measurement units (IMUs) placed on the user’s upper-body. Approach A calibration paradigm where human subjects follow a cursor with their body as if they were controlling it with their shoulders generates a map between shoulder motions and cursor kinematics. This map is used in a Kalman filter to estimate the desired cursor coordinates from upper-body motions. We compared cursor control performance in a centre-out reaching task performed by subjects using different amounts of information from the IMUs to control the 2D cursor. Main results Our results indicate that taking advantage of the redundancy of the signals from the IMUs improved overall performance. Our work also demonstrates the potential of non-invasive IMU-based body–machine interface systems as an alternative or complement to brain–machine interfaces for accomplishing cursor control in 2D space. Significance The present study may serve as a platform for people with high-tetraplegia to control assistive devices such as powered wheelchairs using a joystick. PMID:25242561
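    The core estimation step is a linear Kalman filter mapping redundant body-motion signals to cursor coordinates. The sketch below uses a static-position model with four noisy channels (two per axis) standing in for the IMU signals; the observation matrix and noise levels are invented, whereas the paper learns its map from a calibration phase.

```python
import numpy as np

# Minimal linear Kalman filter estimating a 2-D cursor position from four
# redundant noisy channels. State: (x, y); measurement model z = H @ state.
def kalman_track(measurements, H, Q=1e-4, R=0.05):
    n = H.shape[1]
    x = np.zeros(n)
    P = np.eye(n)
    Qm, Rm = Q * np.eye(n), R * np.eye(H.shape[0])
    estimates = []
    for z in measurements:
        P = P + Qm                                  # predict (static model)
        S = H @ P @ H.T + Rm
        K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
        x = x + K @ (z - H @ x)                     # measurement update
        P = (np.eye(n) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)

H = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], float)  # redundant channels
rng = np.random.RandomState(0)
true_pos = np.array([0.5, -0.3])
z = true_pos @ H.T + rng.normal(0, 0.05, (200, 4))
est = kalman_track(z, H)
final_error = np.linalg.norm(est[-1] - true_pos)
```

    The redundancy shows up in H having more rows than state dimensions: the filter fuses all four channels, which is the property the study found to improve performance.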

  2. Application of the SNoW machine learning paradigm to a set of transportation imaging problems

    NASA Astrophysics Data System (ADS)

    Paul, Peter; Burry, Aaron M.; Wang, Yuheng; Kozitsky, Vladimir

    2012-01-01

    Machine learning methods have been successfully applied to image object classification problems where there is clear distinction between classes and where a comprehensive set of training samples and ground truth are readily available. The transportation domain is an area where machine learning methods are particularly applicable, since the classification problems typically have well defined class boundaries and, due to high traffic volumes in most applications, massive roadway data is available. Though these classes tend to be well defined, the particular image noise and variations can be challenging. Another challenge is the extremely high accuracy typically required in most traffic applications. Incorrect assignment of fines or tolls due to imaging mistakes is not acceptable in most applications. For the front-seat vehicle occupancy detection problem, classification amounts to determining whether one face (driver only) or two faces (driver + passenger) are detected in the front seat of a vehicle on a roadway. For automatic license plate recognition, the classification problem is a type of optical character recognition problem involving multi-class classification. The SNoW machine learning classifier using local SMQT features is shown to be successful in these two transportation imaging applications.

  3. Changing clothes easily: connexin41.8 regulates skin pattern variation.

    PubMed

    Watanabe, Masakatsu; Kondo, Shigeru

    2012-05-01

    The skin patterns of animals are very important for their survival, yet the mechanisms involved in skin pattern formation remain unresolved. Turing's reaction-diffusion model presents a well-known mathematical explanation of how animal skin patterns are formed, and this model can predict various animal patterns that are observed in nature. In this study, we used transgenic zebrafish to generate various artificial skin patterns including a narrow stripe with a wide interstripe, a narrow stripe with a narrow interstripe, a labyrinth, and a 'leopard' pattern (or donut-like ring pattern). In this process, connexin41.8 (or its mutant form) was ectopically expressed using the mitfa promoter. Specifically, the leopard pattern was generated as predicted by Turing's model. Our results demonstrate that the pigment cells in animal skin have the potential and plasticity to establish various patterns and that the reaction-diffusion principle can predict skin patterns of animals. © 2012 John Wiley & Sons A/S.

  4. Is thinking computable?

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1990-01-01

    Strong artificial intelligence claims that conscious thought can arise in computers containing the right algorithms even though none of the programs or components of those computers understands what is going on. As proof, it asserts that brains are finite webs of neurons, each with a definite function governed by the laws of physics; this web has a set of equations that can be solved (or simulated) by a sufficiently powerful computer. Strong AI claims the Turing test as a criterion of success. A recent debate in Scientific American concludes that the Turing test is not sufficient, but leaves intact the underlying premise that thought is a computable process. The recent book by Roger Penrose, however, offers a sharp challenge, arguing that the laws of quantum physics may govern mental processes and that these laws may not be computable. In every area of mathematics and physics, Penrose finds evidence of nonalgorithmic human activity and concludes that mental processes are inherently more powerful than computational processes.

  5. Turbulent patterns in wall-bounded flows: A Turing instability?

    NASA Astrophysics Data System (ADS)

    Manneville, Paul

    2012-06-01

    In their way to/from turbulence, plane wall-bounded flows display an interesting transitional regime where laminar and turbulent oblique bands alternate, the origin of which is still mysterious. In line with Barkley's recent work about the pipe flow transition involving reaction-diffusion concepts, we consider plane Couette flow in the same perspective and transform Waleffe's classical four-variable model of self-sustaining process into a reaction-diffusion model. We show that, upon fulfillment of a condition on the relative diffusivities of its variables, the featureless turbulent regime becomes unstable against patterning as the result of a Turing instability. A reduced two-variable model helps us to delineate the appropriate region of parameter space. An intrinsic status is therefore given to the pattern's wavelength for the first time. Virtues and limitations of the model are discussed, calling for a microscopic support of the phenomenological approach.
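For a generic two-variable reaction-diffusion system, u_t = f(u,v) + D_u ∇²u and v_t = g(u,v) + D_v ∇²v, the textbook conditions for a Turing (diffusion-driven) instability are worth recalling, since the condition on relative diffusivities mentioned in this abstract is an instance of them (stated here for the general case, not for Waleffe's model specifically):

```latex
% Linearization about the steady state (u^*, v^*), Jacobian entries f_u, f_v, g_u, g_v.
% Stability of the homogeneous state in the absence of diffusion:
f_u + g_v < 0, \qquad f_u g_v - f_v g_u > 0.
% Diffusion-driven (Turing) instability additionally requires
D_v f_u + D_u g_v > 2\sqrt{D_u D_v\,(f_u g_v - f_v g_u)} > 0,
% which can only be met when the two diffusivities differ sufficiently.
```

The last inequality is why the abstract speaks of "a condition on the relative diffusivities": with equal diffusivities it can never be satisfied alongside the stability conditions.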

  6. On Undecidability Aspects of Resilient Computations and Implications to Exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S

    2014-01-01

    Future Exascale computing systems with a large number of processors, memory elements and interconnection links, are expected to experience multiple, complex faults, which affect both applications and operating-runtime systems. A variety of algorithms, frameworks and tools are being proposed to realize and/or verify the resilience properties of computations that guarantee correct results on failure-prone computing systems. We analytically show that certain resilient computation problems in the presence of general classes of faults are undecidable, that is, no algorithms exist for solving them. We first show that the membership verification in a generic set of resilient computations is undecidable. We describe classes of faults that can create infinite loops or non-halting computations, whose detection in general is undecidable. We then show certain resilient computation problems to be undecidable by using reductions from the loop detection and halting problems under two formulations, namely, an abstract programming language and Turing machines, respectively. These two reductions highlight different failure effects: the former represents program and data corruption, and the latter illustrates incorrect program execution. These results call for broad-based, well-characterized resilience approaches that complement purely computational solutions using methods such as hardware monitors, co-designs, and system- and application-specific diagnosis codes.

  7. Quantum Computing in Fock Space Systems

    NASA Astrophysics Data System (ADS)

    Berezin, Alexander A.

    1997-04-01

    A Fock space system (FSS) has an unfixed number (N) of particles and/or degrees of freedom. In quantum computing (QC) the main requirement is the sustainability of coherent Q-superpositions, which is normally favoured by a low-noise environment. The high-excitation/high-temperature (T) limit is hence discarded as unfeasible for QC. Conversely, if N is itself a quantized variable, the dimensionality of the Hilbert basis for qubits may increase faster (say, N-exponentially) than thermal noise (likely, in powers of N and T). Hence coherency may win over T-randomization. For this type of QC the speed (S) of factorization of long integers (with D digits) may increase with D (for 'ordinary' QC the speed polynomially decreases with D). This (apparent) paradox rests on non-monotonic bijectivity (cf. Georg Cantor's diagonal counting of rational numbers). This brings the entire aleph-null structurality ("Babylonian Library" of the infinite informational content of the integer field) into the superposition determining the state of the quantum analogue of a Turing machine head. The structure of the integer infinitude (e.g. the distribution of primes) results in a direct "Platonic pressure" resembling the semi-virtual Casimir effect (pressure of cut-off vibrational modes). This "effect", the embodiment of the Pythagorean "Number is everything", renders the Gödelian barrier arbitrarily thin, and hence FSS-based QC can in principle be unlimitedly efficient (e.g. D/S may tend to zero when D tends to infinity).

  8. Undecidability of the spectral gap.

    PubMed

    Cubitt, Toby S; Perez-Garcia, David; Wolf, Michael M

    2015-12-10

    The spectral gap--the energy difference between the ground state and first excited state of a system--is central to quantum many-body physics. Many challenging open problems, such as the Haldane conjecture, the question of the existence of gapped topological spin liquid phases, and the Yang-Mills gap conjecture, concern spectral gaps. These and other problems are particular cases of the general spectral gap problem: given the Hamiltonian of a quantum many-body system, is it gapped or gapless? Here we prove that this is an undecidable problem. Specifically, we construct families of quantum spin systems on a two-dimensional lattice with translationally invariant, nearest-neighbour interactions, for which the spectral gap problem is undecidable. This result extends to undecidability of other low-energy properties, such as the existence of algebraically decaying ground-state correlations. The proof combines Hamiltonian complexity techniques with aperiodic tilings, to construct a Hamiltonian whose ground state encodes the evolution of a quantum phase-estimation algorithm followed by a universal Turing machine. The spectral gap depends on the outcome of the corresponding 'halting problem'. Our result implies that there exists no algorithm to determine whether an arbitrary model is gapped or gapless, and that there exist models for which the presence or absence of a spectral gap is independent of the axioms of mathematics.

  9. Rosen's (M,R) system in Unified Modelling Language.

    PubMed

    Zhang, Ling; Williams, Richard A; Gatherer, Derek

    2016-01-01

    Robert Rosen's (M,R) system is an abstract biological network architecture that is allegedly non-computable on a Turing machine. If (M,R) is truly non-computable, there are serious implications for the modelling of large biological networks in computer software. A body of work has now accumulated addressing Rosen's claim concerning (M,R) by attempting to instantiate it in various software systems. However, a conclusive refutation has remained elusive, principally since none of the attempts to date have unambiguously avoided the critique that they have altered the properties of (M,R) in the coding process, producing merely approximate simulations of (M,R) rather than true computational models. In this paper, we use the Unified Modelling Language (UML), a diagrammatic notation standard, to express (M,R) as a system of objects having attributes, functions and relations. We believe that this instantiates (M,R) in such a way that none of the original properties of the system are corrupted in the process. Crucially, we demonstrate that (M,R) as classically represented in the relational biology literature is implicitly a UML communication diagram. Furthermore, since UML is formally compatible with object-oriented computing languages, instantiation of (M,R) in UML strongly implies its computability in object-oriented coding languages. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. Upstream: How Theory Shapes the Selection of Ways in Strategy

    DTIC Science & Technology

    2016-08-01

    application of theory cannot be simplified as mass casualty, nighttime-area bombing as evidenced by the daring Dambusters Raid in 1943. Nor did the...Americans only use “Industrial Web” precision bombing as evidenced throughout 1945 in Japan. In the case of Operation Desert Storm, a paradigm shift...machine. Morale-Effects theorists in the same war drew upon human philosophy and psychology to justify bombing civilian populations to break German

  11. Spatio-temporal dynamics induced by competing instabilities in two asymmetrically coupled nonlinear evolution equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schüler, D.; Alonso, S.; Bär, M.

    2014-12-15

    Pattern formation often occurs in spatially extended physical, biological, and chemical systems due to an instability of the homogeneous steady state. The type of the instability usually prescribes the resulting spatio-temporal patterns and their characteristic length scales. However, patterns resulting from the simultaneous occurrence of instabilities cannot be expected to be simple superpositions of the patterns associated with the considered instabilities. To address this issue, we design two simple models composed of two asymmetrically coupled equations of non-conserved (Swift-Hohenberg equations) or conserved (Cahn-Hilliard equations) order parameters with different characteristic wavelengths. The patterns arising in these systems range from coexisting static patterns of different wavelengths to traveling waves. A linear stability analysis allows us to derive a two-parameter phase diagram for the studied models, in particular revealing, for the Swift-Hohenberg equations, a codimension-two bifurcation point of Turing and wave instability and a region of coexistence of stationary and traveling patterns. The nonlinear dynamics of the coupled evolution equations is investigated by performing accurate numerical simulations. These reveal more complex patterns, ranging from traveling waves with embedded Turing pattern domains to spatio-temporal chaos, and a wide hysteretic region, where waves or Turing patterns coexist. For the coupled Cahn-Hilliard equations the presence of a weak coupling is sufficient to arrest the coarsening process and to lead to the emergence of purely periodic patterns. The final states are characterized by domains with a characteristic length, which diverges logarithmically with the coupling amplitude.

  12. Off-patent drugs at brand-name prices: a puzzle for policymakers

    PubMed Central

    Tallapragada, Naren P.

    2016-01-01

    In August 2015, Turing Pharmaceuticals acquired the marketing rights to Daraprim (pyrimethamine), a drug used to treat parasitic infections like malaria and toxoplasmosis. Soon after, Turing caused an uproar when it announced that it would raise the price per tablet of Daraprim from $13.50 to $750, a 5500% price hike for a drug that has been on the market for over 60 years and off patent since the 1970s. Old, off-patent drugs are becoming increasingly expensive; Daraprim is the archetypal example. Turing had the power to set a high price for Daraprim because the drug's limited patient population, the absence of competing manufacturers, and a lack of therapeutic alternatives all created an effective monopoly. Similar forces have driven up the prices of other off-patent drugs that treat diseases as diverse as heart failure and multi-drug-resistant tuberculosis. Thus, policymakers will have to consider how the high cost of off-patent drugs impacts public health as well as public spending. In this Note I outline the extent of the high-cost off-patent drug problem, drawing special attention to the problem's negative effects on both health outcomes and government budgets. After discussing some of the problem's underlying causes, I present several solutions to the problem that policymakers could consider, with a focus on proposals like reference pricing and expanded compounding that have received relatively little media attention. PMID:27774247

  13. Long-time behavior and Turing instability induced by cross-diffusion in a three species food chain model with a Holling type-II functional response.

    PubMed

    Haile, Dawit; Xie, Zhifu

    2015-09-01

    In this paper, we study a strongly coupled reaction-diffusion system describing three interacting species in a food chain model, where the third species preys on the second one and simultaneously the second species preys on the first one. An intra-species competition b2 among the second predator is introduced to the food chain model. This parameter produces some very interesting results in linear stability and Turing instability. We first show that the unique positive equilibrium solution is locally asymptotically stable for the corresponding ODE system when the intra-species competition exists among the second predator. The positive equilibrium solution remains linearly stable for the reaction-diffusion system without cross-diffusion, hence it does not belong to the classical Turing instability scheme. It becomes linearly unstable only when cross-diffusion also plays a role in the reaction-diffusion system, hence the instability is driven solely by the effect of cross-diffusion. Our results also exhibit some interesting combined effects of cross-diffusion, intra-species competition and inter-species interactions. Numerically, we conduct a one-parameter analysis which illustrates how the interactions change the existence of stable equilibria, limit cycles, and chaos. Some interesting dynamical phenomena occur when we perform an analysis of interactions in terms of self-production of prey and intra-species competition of the middle predator. Numerical simulations illustrate the existence of nonuniform steady solutions and new patterns such as spot patterns, stripe patterns and fluctuations due to diffusion and cross-diffusion in two dimensions. Published by Elsevier Inc.
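The ODE skeleton of such a model is easy to write down. The following is an illustrative sketch only: a Hastings-Powell-style three-species chain with Holling type-II responses plus an added intra-species competition term for the middle predator; all parameter values are placeholders, not those of the paper:

```python
import numpy as np

# Illustrative parameters (not the paper's): a1, b1 and a2, c2 set the two
# Holling type-II responses, d1, d2 are death rates, b2 is the middle
# predator's intra-species competition.
a1, b1, a2, c2, d1, d2, b2 = 5.0, 3.0, 0.1, 2.0, 0.4, 0.01, 0.05

def rhs(state):
    """u: prey, v: middle predator, w: top predator."""
    u, v, w = state
    f1 = a1 * u / (1 + b1 * u)            # prey -> middle predator response
    f2 = a2 * v / (1 + c2 * v)            # middle -> top predator response
    du = u * (1 - u) - f1 * v
    dv = f1 * v - d1 * v - b2 * v * v - f2 * w   # -b2*v^2: competition term
    dw = f2 * w - d2 * w
    return np.array([du, dv, dw])

# Forward-Euler integration from an interior initial condition.
state = np.array([0.8, 0.2, 8.0])
dt, steps = 0.01, 50000
traj = np.empty((steps + 1, 3))
traj[0] = state
for i in range(steps):
    state = state + dt * rhs(state)
    traj[i + 1] = state
```

Sweeping b2 (or the prey's self-production rate) in such a sketch is the kind of one-parameter analysis the abstract describes; the spatial (cross-diffusion) part of the model is deliberately omitted here.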

  14. Applications for Gradient Metal Alloys Fabricated Using Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Hofmann, Douglas C.; Borgonia, John Paul C.; Dillon, Robert P.; Suh, Eric J.; Mulder, Jerry L.; Gardner, Paul B.

    2013-01-01

    Recently, additive manufacturing (AM) techniques have been developed that may shift the paradigm of traditional metal production by allowing complex net-shaped hardware to be built up layer-by-layer, rather than being machined from a billet. The AM process is ubiquitous with polymers due to their low melting temperatures, fast curing, and controllable viscosity, and 3D printers are widely available as commercial or consumer products. 3D printing with metals is inherently more complicated than with polymers due to their higher melting temperatures and reactivity with air, particularly when heated or molten. The process generally requires a high-power laser or other focused heat source, like an electron beam, for precise melting and deposition. Several promising metal AM techniques have been developed, including laser deposition (also called laser engineered net shaping or LENS® and laser deposition technology (LDT)), direct metal laser sintering (DMLS), and electron beam free-form (EBF). These machines typically use powders or wire feedstock that are melted and deposited using a laser or electron beam. Complex net-shape parts have been widely demonstrated using these (and other) AM techniques and the process appears to be a promising alternative to machining in some cases. Rather than simply competing with traditional machining for cost and time savings, the true advantage of AM involves the fabrication of hardware that cannot be produced using other techniques. This could include parts with "blind" features (like foams or trusses), parts that are difficult to machine conventionally, or parts made from materials that do not exist in bulk forms. In this work, the inventors identify that several AM techniques can be used to develop metal parts that change composition from one location in the part to another, allowing for complete control over the mechanical or physical properties. 
This changes the paradigm for conventional metal fabrication, which relies on an assortment of "post-processing" methods to locally alter properties (such as coating, heat treating, work hardening, shot peening, etching, anodizing, among others). Building the final part in an additive process allows for the development of an entirely new class of metals, so-called "functionally graded metals" or "gradient alloys." By carefully blending feedstock materials with different properties in an AM process, hardware can be developed with properties that cannot be obtained using other techniques but with the added benefit of the net-shaped fabrication that AM allows.

  15. Cryptanalysis in World War II--and Mathematics Education.

    ERIC Educational Resources Information Center

    Hilton, Peter

    1984-01-01

    Hilton describes the team of cryptanalysts who tried to decipher German and Japanese codes during the Second World War. The work of Turing, essentially developing the computer, is reported, as well as inferences about pure and applied mathematics. (MNS)

  16. JPRS Report China

    DTIC Science & Technology

    1988-10-27

    such as statistics law, measurement law, accounting law, law on Chinese-Foreign joint ventures, law on foreign-owned enterprises, income tax law concerning...Chinese-Foreign joint ventures, income tax law concerning foreign enterprises, law of economic contract with foreigners, and so forth

  17. Accommodation in Untextured Stimulus Fields.

    DTIC Science & Technology

    1979-05-01

    that accommodation is notably inaccurate with reduced illumination, textural cue removal, or small aperture viewing. These situational ametropias are...dark focus. Although, for any individual, large correlations exist among these ametropias, statistically reliable differences occur among them as well

  18. Beyond the Turing Test: Performance Metrics for Evaluating a Computer Simulation of the Human Mind

    DTIC Science & Technology

    2002-08-01

    Tomasello, 2001. Perceiving intentions and learning words in the second year of life. In M. Bowerman and S. Levinson (Eds.), Language Acquisition and Conceptual Development. Cambridge University Press, New York, NY.

  19. Deciphering the enigma of undetected species, phylogenetic, and functional diversity based on Good-Turing theory.

    PubMed

    Chao, Anne; Chiu, Chun-Huo; Colwell, Robert K; Magnago, Luiz Fernando S; Chazdon, Robin L; Gotelli, Nicholas J

    2017-11-01

    Estimating the species, phylogenetic, and functional diversity of a community is challenging because rare species are often undetected, even with intensive sampling. The Good-Turing frequency formula, originally developed for cryptography, estimates in an ecological context the true frequencies of rare species in a single assemblage based on an incomplete sample of individuals. Until now, this formula has never been used to estimate undetected species, phylogenetic, and functional diversity. Here, we first generalize the Good-Turing formula to incomplete sampling of two assemblages. The original formula and its two-assemblage generalization provide a novel and unified approach to notation, terminology, and estimation of undetected biological diversity. For species richness, the Good-Turing framework offers an intuitive way to derive the non-parametric estimators of the undetected species richness in a single assemblage, and of the undetected species shared between two assemblages. For phylogenetic diversity, the unified approach leads to an estimator of the undetected Faith's phylogenetic diversity (PD, the total length of undetected branches of a phylogenetic tree connecting all species), as well as a new estimator of undetected PD shared between two phylogenetic trees. For functional diversity based on species traits, the unified approach yields a new estimator of undetected Walker et al.'s functional attribute diversity (FAD, the total species-pairwise functional distance) in a single assemblage, as well as a new estimator of undetected FAD shared between two assemblages. Although some of the resulting estimators have been previously published (but derived with traditional mathematical inequalities), all taxonomic, phylogenetic, and functional diversity estimators are now derived under the same framework. 
All the derived estimators are theoretically lower bounds of the corresponding undetected diversities; our approach reveals the sufficient conditions under which the estimators are nearly unbiased, thus offering new insights. Simulation results are reported to numerically verify the performance of the derived estimators. We illustrate all estimators and assess their sampling uncertainty with an empirical dataset for Brazilian rain forest trees. These estimators should be widely applicable to many current problems in ecology, such as the effects of climate change on spatial and temporal beta diversity and the contribution of trait diversity to ecosystem multi-functionality. © 2017 by the Ecological Society of America.
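For the single-assemblage species-richness case, the Good-Turing idea reduces to familiar closed forms driven by the singleton and doubleton counts: the sample-coverage estimate 1 − f1/n, and the classic Chao1 lower bound S_obs + f1²/(2·f2). A small self-contained sketch (these are the standard estimators, not the new shared-diversity estimators the paper derives):

```python
from collections import Counter

def chao1(sample):
    """Chao1 lower bound on species richness: observed richness plus an
    undetected component estimated from singletons (f1) and doubletons (f2)."""
    abundance = Counter(sample)
    s_obs = len(abundance)
    freq_counts = Counter(abundance.values())
    f1, f2 = freq_counts.get(1, 0), freq_counts.get(2, 0)
    if f2 > 0:
        return s_obs + f1 * f1 / (2 * f2)
    return s_obs + f1 * (f1 - 1) / 2      # variant used when no doubletons exist

def coverage(sample):
    """Good-Turing sample coverage: estimated share of the total probability
    mass belonging to the species already detected."""
    abundance = Counter(sample)
    f1 = sum(1 for c in abundance.values() if c == 1)
    return 1 - f1 / len(sample)

# 5 observed species; 'd' and 'e' are singletons, 'c' is the only doubleton.
sample = ['a'] * 5 + ['b'] * 3 + ['c'] * 2 + ['d', 'e']
```

For this sample, Chao1 estimates 5 + 2²/(2·1) = 7 species in total, i.e. two undetected, and the coverage estimate is 1 − 2/12.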

  20. Integrated pillar scatterers for speeding up classification of cell holograms.

    PubMed

    Lugnan, Alessio; Dambre, Joni; Bienstman, Peter

    2017-11-27

    The computational power required to classify cell holograms is a major limit to the throughput of label-free cell sorting based on digital holographic microscopy. In this work, a simple integrated photonic stage comprising a collection of silica pillar scatterers is proposed as an effective nonlinear mixing interface between the light scattered by a cell and an image sensor. The light processing provided by the photonic stage allows for the use of a simple linear classifier implemented in the electric domain and applied on a limited number of pixels. A proof-of-concept of the presented machine learning technique, which is based on the extreme learning machine (ELM) paradigm, is provided by the classification results on samples generated by 2D FDTD simulations of cells in a microfluidic channel.
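The ELM paradigm referenced here trains only a linear readout on top of a fixed, random nonlinear layer, which is what makes a hard-wired photonic scattering stage a natural substitute for that layer. A generic numerical sketch on toy 2-D data (not the hologram pixels or the pillar-scatterer stage of the paper):

```python
import numpy as np

def elm_train(X, y, n_hidden=300, seed=0):
    """ELM: a fixed random nonlinear hidden layer; only the linear readout
    is trained, by ordinary least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden-layer responses
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # least-squares readout
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy nonlinear problem: points inside a disc vs. outside.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(500, 2))
y = (np.hypot(X[:, 0], X[:, 1]) < 0.6).astype(float)
W, b, beta = elm_train(X, y)
acc = ((elm_predict(X, W, b, beta) > 0.5) == y).mean()
```

In the paper's setting the role of `W` and `tanh` is played by the physical light scattering, so the only computation left in the electric domain is the cheap linear readout `beta`.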

  1. Quantum simulations with noisy quantum computers

    NASA Astrophysics Data System (ADS)

    Gambetta, Jay

    Quantum computing is a new computational paradigm that is expected to lie beyond the standard model of computation. This implies a quantum computer can solve problems that can't be solved by a conventional computer with tractable overhead. To fully harness this power we need a universal fault-tolerant quantum computer. However the overhead in building such a machine is high and a full solution appears to be many years away. Nevertheless, we believe that we can build machines in the near term that cannot be emulated by a conventional computer. It is then interesting to ask what these can be used for. In this talk we will present our advances in simulating complex quantum systems with noisy quantum computers. We will show experimental implementations of this on some small quantum computers.

  2. A novel channel selection method for optimal classification in different motor imagery BCI paradigms.

    PubMed

    Shan, Haijun; Xu, Haojie; Zhu, Shanan; He, Bin

    2015-10-21

    For sensorimotor rhythm-based brain-computer interface (BCI) systems, classification of different motor imageries (MIs) remains a crucial problem. An important aspect is how many scalp electrodes (channels) should be used in order to reach optimal performance in classifying motor imaginations. While previous research on channel selection has mainly focused on MI task paradigms without feedback, the present work investigates optimal channel selection in MI task paradigms with real-time feedback (two-class control and four-class control paradigms). In the present study, three datasets, recorded from an MI task experiment, a two-class control experiment, and a four-class control experiment, respectively, were analyzed offline. Multiple frequency-spatial synthesized features were comprehensively extracted from every channel, and a new enhanced method, IterRelCen, was proposed to perform channel selection. IterRelCen was constructed based on the Relief algorithm, but was enhanced in two respects: the target sample selection strategy was changed and the idea of iterative computation was adopted, making it more robust in feature selection. Finally, a multiclass support vector machine was applied as the classifier. The smallest number of channels that yielded the best classification accuracy was considered the set of optimal channels. One-way ANOVA was employed to test the significance of performance improvement among using the optimal channels, all the channels, and three typical MI channels (C3, C4, Cz). The results show that the proposed method outperformed other channel selection methods by achieving average classification accuracies of 85.2, 94.1, and 83.2 % for the three datasets, respectively. Moreover, the channel selection results reveal that the average numbers of optimal channels were significantly different among the three MI paradigms. It is demonstrated that IterRelCen has a strong ability for feature selection. 
In addition, the results show that the numbers of optimal channels in the three different motor imagery BCI paradigms are distinct. From an MI task paradigm, to a two-class control paradigm, and to a four-class control paradigm, the number of channels required to optimize classification accuracy increased. These findings may provide useful information for optimizing EEG-based BCI systems, and may further improve the performance of noninvasive BCIs.
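IterRelCen is described as an enhancement of the Relief algorithm. The plain Relief weighting it builds on can be sketched as follows (toy two-feature data, not the EEG features, and without the paper's target-sample and iterative modifications):

```python
import numpy as np

def relief(X, y, n_iter=200, seed=0):
    """Basic Relief feature weighting: a feature gains weight when it separates
    a sample from its nearest miss (other class) more than from its nearest
    hit (same class)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        i = rng.integers(n)
        dist = np.linalg.norm(X - X[i], axis=1)
        dist[i] = np.inf                                  # exclude the sample itself
        hit = np.argmin(np.where(y == y[i], dist, np.inf))
        miss = np.argmin(np.where(y != y[i], dist, np.inf))
        w += np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    return w / n_iter

# Toy data: feature 0 carries the class, feature 1 is pure noise.
rng = np.random.default_rng(1)
n = 200
y = rng.integers(0, 2, size=n)
X = np.column_stack([y + 0.1 * rng.normal(size=n),   # informative feature
                     rng.normal(size=n)])            # irrelevant feature
w = relief(X, y)
```

Ranking channels by such weights, then keeping the smallest prefix that preserves classification accuracy, is the general recipe the channel-selection study follows.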

  3. Changing computing paradigms towards power efficiency.

    PubMed

    Klavík, Pavel; Malossi, A Cristiano I; Bekas, Costas; Curioni, Alessandro

    2014-06-28

    Power awareness is fast becoming immensely important in computing, ranging from traditional high-performance computing applications to the new generation of data-centric workloads. In this work, we describe our efforts towards a power-efficient computing paradigm that combines low- and high-precision arithmetic. We showcase our ideas for the widely used kernel of solving systems of linear equations that finds numerous applications in scientific and engineering disciplines as well as in large-scale data analytics, statistics and machine learning. Towards this goal, we developed tools for the seamless power profiling of applications at a fine-grain level. In addition, we verify here previous work on post-FLOPS/W metrics and show that these can shed much more light on the power/energy profile of important applications. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
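Combining low- and high-precision arithmetic for linear solves is classically realized as mixed-precision iterative refinement: do the expensive solve in low precision, then cheaply correct the residual in high precision. A minimal dense sketch (a real implementation would factorize the matrix once and reuse the factors; here each inner solve refactorizes for brevity):

```python
import numpy as np

def refine_solve(A, b, iters=5):
    """Solve Ax = b with the heavy arithmetic in float32 and float64
    residual correction (mixed-precision iterative refinement)."""
    A32 = A.astype(np.float32)
    x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        r = b - A @ x                                    # residual in float64
        dx = np.linalg.solve(A32, r.astype(np.float32))  # low-precision correction
        x = x + dx.astype(np.float64)
    return x

rng = np.random.default_rng(0)
n = 100
A = rng.normal(size=(n, n)) + n * np.eye(n)    # diagonally dominant, well conditioned
x_true = rng.normal(size=n)
b = A @ x_true

# Plain single-precision solve, for comparison with the refined solution.
x32 = np.linalg.solve(A.astype(np.float32), b.astype(np.float32)).astype(np.float64)
x = refine_solve(A, b)
```

For well-conditioned systems the refined solution reaches near-double-precision accuracy while most of the floating-point work, and hence most of the energy, is spent at single precision.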

  4. A stochastic multi-agent optimization model for energy infrastructure planning under uncertainty and competition.

    DOT National Transportation Integrated Search

    2017-07-04

    This paper presents a stochastic multi-agent optimization model that supports energy infrastructure planning under uncertainty. The interdependence between different decision entities in the system is captured in an energy supply chain network, w...

  5. Bot, Cyborg and Automated Turing Test

    NASA Astrophysics Data System (ADS)

    Yan, Jeff

    Ross Anderson: Bot tending might be an attractive activity for children, because children could receive the challenges on their mobile phones, to which they are almost physiologically attached these days, and they’re perhaps used to relatively smaller amounts of pocket money.

  6. Synthetic Molecular Machines for Active Self-Assembly: Prototype Algorithms, Designs, and Experimental Study

    NASA Astrophysics Data System (ADS)

    Dabby, Nadine L.

    Computer science and electrical engineering have been the great success story of the twentieth century. The neat modularity and mapping of a language onto circuits has led to robots on Mars, desktop computers and smartphones. But these devices are not yet able to do some of the things that life takes for granted: repair a scratch, reproduce, regenerate, or grow exponentially fast--all while remaining functional. This thesis explores and develops algorithms, molecular implementations, and theoretical proofs in the context of "active self-assembly" of molecular systems. The long-term vision of active self-assembly is the theoretical and physical implementation of materials that are composed of reconfigurable units with the programmability and adaptability of biology's numerous molecular machines. En route to this goal, we must first find a way to overcome the memory limitations of molecular systems, and to discover the limits of complexity that can be achieved with individual molecules. One of the main thrusts in molecular programming is to use computer science as a tool for figuring out what can be achieved. While molecular systems that are Turing-complete have been demonstrated [Winfree, 1996], these systems still cannot achieve some of the feats biology has achieved. One might think that because a system is Turing-complete, capable of computing "anything," that it can do any arbitrary task. But while it can simulate any digital computational problem, there are many behaviors that are not "computations" in a classical sense, and cannot be directly implemented. Examples include exponential growth and molecular motion relative to a surface. Passive self-assembly systems cannot implement these behaviors because (a) molecular motion relative to a surface requires a source of fuel that is external to the system, and (b) passive systems are too slow to assemble exponentially-fast-growing structures. 
We call these behaviors "energetically incomplete" programmable behaviors. This class of behaviors includes any behavior where a passive physical system simply does not have enough physical energy to perform the specified tasks in the requisite amount of time. As we will demonstrate and prove, a sufficiently expressive implementation of an "active" molecular self-assembly approach can achieve these behaviors. Using an external source of fuel solves part of the problem, so the system is not "energetically incomplete." But the programmable system also needs to have sufficient expressive power to achieve the specified behaviors. Perhaps surprisingly, some of these systems do not even require Turing completeness to be sufficiently expressive. Building on a large variety of work by other scientists in the fields of DNA nanotechnology, chemistry and reconfigurable robotics, this thesis introduces several research contributions in the context of active self-assembly. We show that simple primitives such as insertion and deletion are able to generate complex and interesting results such as the growth of a linear polymer in logarithmic time and the ability of a linear polymer to treadmill. To this end we developed a formal model for active-self assembly that is directly implementable with DNA molecules. We show that this model is computationally equivalent to a machine capable of producing strings that are stronger than regular languages and, at most, as strong as context-free grammars. This is a great advance in the theory of active self-assembly as prior models were either entirely theoretical or only implementable in the context of macro-scale robotics. We developed a chain reaction method for the autonomous exponential growth of a linear DNA polymer. Our method is based on the insertion of molecules into the assembly, which generates two new insertion sites for every initial one employed. 
The building of a line in logarithmic time is a first step toward building a shape in logarithmic time. We demonstrate the first construction of a synthetic linear polymer that grows exponentially fast via insertion. We show that monomer molecules are converted into the polymer in logarithmic time via spectrofluorimetry and gel electrophoresis experiments. We also demonstrate the division of these polymers via the addition of a single DNA complex that competes with the insertion mechanism. This shows the growth of a population of polymers in logarithmic time. We characterize the DNA insertion mechanism that we utilize in Chapter 4. We experimentally demonstrate that we can control the kinetics of this reaction over at least seven orders of magnitude, by programming the sequences of DNA that initiate the reaction. In addition, we review co-authored work on programming molecular robots using prescriptive landscapes of DNA origami; this was the first microscopic demonstration of programming a molecular robot to walk on a 2-dimensional surface. We developed a snapshot method for imaging these random walking molecular robots and a CAPTCHA-like analysis method for difficult-to-interpret imaging data.
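The exponential-growth claim above can be illustrated with a toy calculation (a sketch only, not the thesis's DNA chemistry): if every insertion creates two new insertion sites, the number of monomers added per synchronous round doubles, so a polymer reaches any target length in logarithmically many rounds.

```python
def rounds_to_length(target_length):
    """Count synchronous insertion rounds until a polymer reaches
    target_length monomers, starting from a 2-monomer seed with a
    single insertion site between them."""
    length, sites, rounds = 2, 1, 0
    while length < target_length:
        length += sites  # every active site accepts one monomer
        sites *= 2       # each insertion exposes two new sites
        rounds += 1
    return rounds

print(rounds_to_length(1_000_000))  # → 20 rounds to exceed one million monomers
```

After r rounds the toy polymer has 2^r + 1 monomers, so growing to length n takes about log2(n) rounds, which is the logarithmic-time behavior the thesis targets.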

  7. Bridging paradigms: hybrid mechanistic-discriminative predictive models.

    PubMed

    Doyle, Orla M; Tsaneva-Atansaova, Krasimira; Harte, James; Tiffin, Paul A; Tino, Peter; Díaz-Zuccarini, Vanessa

    2013-03-01

    Many disease processes are extremely complex and characterized by multiple stochastic processes interacting simultaneously. Current analytical approaches have included mechanistic models and machine learning (ML), which are often treated as orthogonal viewpoints. However, to facilitate truly personalized medicine, new perspectives may be required. This paper reviews the use of both mechanistic models and ML in healthcare as well as emerging hybrid methods, which are an exciting and promising approach for biologically based, yet data-driven advanced intelligent systems.

  8. How do robots take two parts apart

    NASA Technical Reports Server (NTRS)

    Bajcsy, Ruzena K.; Tsikos, Constantine J.

    1989-01-01

This research is a natural progression of efforts that began with the introduction of a new research paradigm in machine perception, called Active Perception. There it was stated that Active Perception is a problem of intelligent control strategies applied to data-acquisition processes that depend on the current state of data interpretation, including recognition. The disassembly/assembly problem is treated as an Active Perception problem, and a method for autonomous disassembly based on this framework is presented.

  9. The complexity of proving chaoticity and the Church-Turing thesis

    NASA Astrophysics Data System (ADS)

    Calude, Cristian S.; Calude, Elena; Svozil, Karl

    2010-09-01

    Proving the chaoticity of some dynamical systems is equivalent to solving the hardest problems in mathematics. Conversely, classical physical systems may "compute the hard or even the incomputable" by measuring observables which correspond to computationally hard or even incomputable problems.

  10. JPRS Report. Science & Technology: Europe.

    DTIC Science & Technology

    1991-03-29

    systems (wind power engines, thermal collectors, etc.). Minister of Research and Technology Hubert Curien and Minister of Public Works, Housing...similar to those currently being manufac- tured in the USSR is being hypothesized, together with studies on the development of the new San Marco Scout

  11. Stratigraphic Sedimentary Environmental Change of the Mount Bruce Supergroup, Beasley River Area, Southern Pilbara, Western Australia

    NASA Astrophysics Data System (ADS)

    Komure, M.; Kiyokawa, S.; Ikehara, M.; Tsutsumi, Y.; Horie, K.

    2005-12-01

The Mount Bruce Supergroup was deposited from the Late Archaean to the Early Proterozoic in the Pilbara craton, Western Australia. It records the transition from the Late Archaean to the Early Proterozoic and, because of its low metamorphic grade, is a key sequence for reconstructing the sedimentary environment of that period. Evidence of an Early Proterozoic global ice age, in the form of glacial sediment, is reported in its uppermost group (Martin 1999). In this study, we focus on the lithological changes of the Mount Bruce Supergroup in the Beasley River - Rocklea Dome area of the southern Pilbara. Along the Beasley River, the supergroup is more than 10,000 m thick and is divided into three groups: the Fortescue Group, ranging from flood basalt to shallow-marine or non-marine sediment; the middle Hamersley Group, rich in banded iron formation and acidic volcanic rock; and the upper Turee Creek Group, consisting mainly of shallow-marine sediment. We focused on the origin of the sandstone in each group, especially in the Meteorite Bore Member of the Turee Creek Formation, which is identified with the early snowball-earth events. We determined the origin of the diamictite matrix of the Meteorite Bore Member by U-Pb detrital zircon geochronology using CHIME and SHRIMP II. The zircon ages fall between 2.7 Ga and 2.4 Ga. In addition, the matrix yields TOC values of 0.1-0.05%, and delta 13C values of -30 to -20 per mil. This evidence suggests that organic activity may have taken place during the ice age.

  12. Autonomous unobtrusive detection of mild cognitive impairment in older adults.

    PubMed

    Akl, Ahmad; Taati, Babak; Mihailidis, Alex

    2015-05-01

The current diagnostic process for dementia results in a high percentage of cases with delayed detection. To address this problem, in this paper, we explore the feasibility of autonomously detecting mild cognitive impairment (MCI) in the older adult population. We implement a signal processing approach equipped with a machine learning paradigm to process and analyze real-world data acquired using home-based unobtrusive sensing technologies. Using the sensor and clinical data pertaining to 97 subjects, acquired over an average period of three years, a number of measures associated with the subjects' walking speed and general activity in the home were calculated. Different time spans of these measures were used to generate feature vectors to train and test two machine learning algorithms, namely support vector machines and random forests. We were able to autonomously detect MCI in older adults with an area under the ROC curve of 0.97 and an area under the precision-recall curve of 0.93 using a time window of 24 weeks. This study is of great significance since it can potentially assist in the early detection of cognitive impairment in older adults.
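The area under the ROC curve reported above has a simple probabilistic reading: it is the chance that a randomly chosen positive case outscores a randomly chosen negative case. A minimal sketch (the risk scores below are invented for illustration, not the study's data):

```python
def roc_auc(scores_pos, scores_neg):
    """AUC = probability that a random positive (e.g. MCI) case
    outscores a random negative (control) case, ties counting 0.5."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores (higher = e.g. slower in-home walking speed).
mci      = [0.9, 0.8, 0.75, 0.6]
controls = [0.7, 0.4, 0.3, 0.2, 0.1]
print(roc_auc(mci, controls))  # → 0.95
```

An AUC of 0.97, as in the study, means the classifier ranks an MCI subject above a control in 97% of such pairings.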

  13. Machines first, humans second: on the importance of algorithmic interpretation of open chemistry data.

    PubMed

    Clark, Alex M; Williams, Antony J; Ekins, Sean

    2015-01-01

The current rise in the use of open lab notebook techniques means that there are an increasing number of scientists who make chemical information freely and openly available to the entire community as a series of micropublications that are released shortly after the conclusion of each experiment. We propose that this trend be accompanied by a thorough examination of data sharing priorities. We argue that the most significant immediate beneficiary of open data is in fact chemical algorithms, which are capable of absorbing vast quantities of data, and using it to present concise insights to working chemists, on a scale that could not be achieved by traditional publication methods. Making this goal practically achievable will require a paradigm shift in the way individual scientists translate their data into digital form, since most contemporary methods of data entry are designed for presentation to humans rather than consumption by machine learning algorithms. We discuss some of the complex issues involved in fixing current methods, as well as some of the immediate benefits that can be gained when open data is published correctly using unambiguous machine readable formats. Graphical Abstract: Lab notebook entries must target both visualisation by scientists and use by machine learning algorithms.

  14. Tool Wear Prediction in Ti-6Al-4V Machining through Multiple Sensor Monitoring and PCA Features Pattern Recognition.

    PubMed

    Caggiano, Alessandra

    2018-03-09

Machining of titanium alloys is characterised by extremely rapid tool wear due to the high cutting temperature and the strong adhesion at the tool-chip and tool-workpiece interface, caused by the low thermal conductivity and high chemical reactivity of Ti alloys. With the aim to monitor the tool conditions during dry turning of Ti-6Al-4V alloy, a machine learning procedure based on the acquisition and processing of cutting force, acoustic emission and vibration sensor signals during turning is implemented. A number of sensorial features are extracted from the acquired sensor signals in order to feed machine learning paradigms based on artificial neural networks. To reduce the large dimensionality of the sensorial features, an advanced feature extraction methodology based on Principal Component Analysis (PCA) is proposed. PCA allowed the identification of a smaller number of features ( k = 2 features), the principal component scores, obtained through linear projection of the original d features into a new space with reduced dimensionality k = 2, sufficient to describe the variance of the data. By feeding artificial neural networks with the PCA features, an accurate diagnosis of tool flank wear ( VB max ) was achieved, with predicted values very close to the measured tool wear values.

  15. Tool Wear Prediction in Ti-6Al-4V Machining through Multiple Sensor Monitoring and PCA Features Pattern Recognition

    PubMed Central

    2018-01-01

Machining of titanium alloys is characterised by extremely rapid tool wear due to the high cutting temperature and the strong adhesion at the tool-chip and tool-workpiece interface, caused by the low thermal conductivity and high chemical reactivity of Ti alloys. With the aim to monitor the tool conditions during dry turning of Ti-6Al-4V alloy, a machine learning procedure based on the acquisition and processing of cutting force, acoustic emission and vibration sensor signals during turning is implemented. A number of sensorial features are extracted from the acquired sensor signals in order to feed machine learning paradigms based on artificial neural networks. To reduce the large dimensionality of the sensorial features, an advanced feature extraction methodology based on Principal Component Analysis (PCA) is proposed. PCA allowed the identification of a smaller number of features (k = 2 features), the principal component scores, obtained through linear projection of the original d features into a new space with reduced dimensionality k = 2, sufficient to describe the variance of the data. By feeding artificial neural networks with the PCA features, an accurate diagnosis of tool flank wear (VBmax) was achieved, with predicted values very close to the measured tool wear values. PMID:29522443
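The PCA projection described in the two records above can be sketched as follows. The feature matrix here is synthetic (the paper's d sensorial features come from force, acoustic emission and vibration signals); the mechanics of projecting d features onto the k = 2 leading principal components are the same:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, k = 10, 200, 2

# Synthetic "sensorial features": most variance lies in 2 latent directions.
latent = rng.normal(size=(n, k)) * [5.0, 2.0]
X = latent @ rng.normal(size=(k, d)) + 0.1 * rng.normal(size=(n, d))

Xc = X - X.mean(axis=0)                 # center the features
cov = (Xc.T @ Xc) / (n - 1)             # d x d covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1][:k]   # indices of the top-k components
scores = Xc @ eigvecs[:, order]         # n x 2 principal component scores

explained = eigvals[order].sum() / eigvals.sum()
print(scores.shape, round(float(explained), 3))
```

The two score columns would then feed the artificial neural network in place of the original d features; `explained` reports the fraction of total variance the k = 2 scores retain.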

  16. The sixth generation robot in space

    NASA Technical Reports Server (NTRS)

    Butcher, A.; Das, A.; Reddy, Y. V.; Singh, H.

    1990-01-01

The knowledge-based simulator developed in the artificial intelligence laboratory has become a working test bed for experimenting with intelligent reasoning architectures. With this simulator, small experiments have recently been performed with the aim of simulating robot behavior that avoids colliding paths. An automatic extension of such experiments to intelligently planning robots in space demands advanced reasoning architectures. One such architecture for general-purpose problem solving is explored. The robot, seen as a knowledge-base machine, proceeds via a predesigned abstraction mechanism for problem understanding and response generation. The three phases in one such abstraction scheme are: abstraction for representation, abstraction for evaluation, and abstraction for resolution. Such abstractions require multimodality, which in turn requires the use of intensional variables to deal with beliefs in the system. Abstraction mechanisms help in synthesizing possible propagating lattices for such beliefs. The machine controller thus enters a sixth-generation paradigm.

  17. A scheme for solving the plane-plane challenge in force measurements at the nanoscale.

    PubMed

    Siria, Alessandro; Huant, Serge; Auvert, Geoffroy; Comin, Fabio; Chevrier, Joel

    2010-05-19

    Non-contact interaction between two parallel flat surfaces is a central paradigm in sciences. This situation is the starting point for a wealth of different models: the capacitor description in electrostatics, hydrodynamic flow, thermal exchange, the Casimir force, direct contact study, third body confinement such as liquids or films of soft condensed matter. The control of parallelism is so demanding that no versatile single force machine in this geometry has been proposed so far. Using a combination of nanopositioning based on inertial motors, of microcrystal shaping with a focused-ion beam (FIB) and of accurate in situ and real-time control of surface parallelism with X-ray diffraction, we propose here a "gedanken" surface-force machine that should enable one to measure interactions between movable surfaces separated by gaps in the micrometer and nanometer ranges.

  18. The Law of Self-Acting Machines and Irreversible Processes with Reversible Replicas

    NASA Astrophysics Data System (ADS)

    Valev, Pentcho

    2002-11-01

Clausius and Kelvin saved Carnot's theorem and developed the second law by assuming that Carnot machines can work in the absence of an operator and that all irreversible processes have reversible replicas. The former assumption restored Carnot's theorem as an experience of mankind, whereas the latter generated "the law of ever increasing entropy". Both assumptions are wrong, so it makes sense to return to Carnot's theorem (or some equivalent) and test it experimentally. Two testable paradigms - the system performing two types of reversible work and the system in dynamical equilibrium - suggest that perpetuum mobile of the second kind in the presence of an operator is possible. The deviation from the second law prediction, expressed as a difference between partial derivatives in a Maxwell relation, measures the degree of structural-functional evolution of the respective system.

  19. Big Data, Internet of Things and Cloud Convergence--An Architecture for Secure E-Health Applications.

    PubMed

    Suciu, George; Suciu, Victor; Martian, Alexandru; Craciunescu, Razvan; Vulpe, Alexandru; Marcu, Ioana; Halunga, Simona; Fratu, Octavian

    2015-11-01

    Big data storage and processing are considered as one of the main applications for cloud computing systems. Furthermore, the development of the Internet of Things (IoT) paradigm has advanced the research on Machine to Machine (M2M) communications and enabled novel tele-monitoring architectures for E-Health applications. However, there is a need for converging current decentralized cloud systems, general software for processing big data and IoT systems. The purpose of this paper is to analyze existing components and methods of securely integrating big data processing with cloud M2M systems based on Remote Telemetry Units (RTUs) and to propose a converged E-Health architecture built on Exalead CloudView, a search based application. Finally, we discuss the main findings of the proposed implementation and future directions.

  20. Active Learning Using Hint Information.

    PubMed

    Li, Chun-Liang; Ferng, Chun-Sung; Lin, Hsuan-Tien

    2015-08-01

    The abundance of real-world data and limited labeling budget calls for active learning, an important learning paradigm for reducing human labeling efforts. Many recently developed active learning algorithms consider both uncertainty and representativeness when making querying decisions. However, exploiting representativeness with uncertainty concurrently usually requires tackling sophisticated and challenging learning tasks, such as clustering. In this letter, we propose a new active learning framework, called hinted sampling, which takes both uncertainty and representativeness into account in a simpler way. We design a novel active learning algorithm within the hinted sampling framework with an extended support vector machine. Experimental results validate that the novel active learning algorithm can result in a better and more stable performance than that achieved by state-of-the-art algorithms. We also show that the hinted sampling framework allows improving another active learning algorithm designed from the transductive support vector machine.
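The uncertainty half of the active learning loop described above has a simple core: query the unlabeled example the current model is least certain about. A toy sketch of that query step (hedged: hinted sampling itself additionally injects "hint" points; the pool and linear model below are invented for illustration):

```python
def query_most_uncertain(weights, bias, unlabeled):
    """Return the index of the unlabeled point whose decision value
    |w.x + b| is smallest, i.e. the point closest to the current
    linear decision boundary."""
    def margin(x):
        return abs(sum(w * xi for w, xi in zip(weights, x)) + bias)
    return min(range(len(unlabeled)), key=lambda i: margin(unlabeled[i]))

# Hypothetical 2-D unlabeled pool and a fixed linear model w.x + b.
pool = [(3.0, 1.0), (0.2, -0.1), (-2.0, -2.0), (0.5, 0.6)]
print(query_most_uncertain([1.0, 1.0], 0.0, pool))  # → 1 (smallest |margin|)
```

In a full active learning loop, the chosen point would be sent for human labeling, the model retrained, and the query repeated, with representativeness criteria (such as the letter's hints) guarding against querying only outliers near the boundary.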

  1. The Experiment Factory: Standardizing Behavioral Experiments.

    PubMed

    Sochat, Vanessa V; Eisenberg, Ian W; Enkavi, A Zeynep; Li, Jamie; Bissett, Patrick G; Poldrack, Russell A

    2016-01-01

    The administration of behavioral and experimental paradigms for psychology research is hindered by lack of a coordinated effort to develop and deploy standardized paradigms. While several frameworks (Mason and Suri, 2011; McDonnell et al., 2012; de Leeuw, 2015; Lange et al., 2015) have provided infrastructure and methods for individual research groups to develop paradigms, missing is a coordinated effort to develop paradigms linked with a system to easily deploy them. This disorganization leads to redundancy in development, divergent implementations of conceptually identical tasks, disorganized and error-prone code lacking documentation, and difficulty in replication. The ongoing reproducibility crisis in psychology and neuroscience research (Baker, 2015; Open Science Collaboration, 2015) highlights the urgency of this challenge: reproducible research in behavioral psychology is conditional on deployment of equivalent experiments. A large, accessible repository of experiments for researchers to develop collaboratively is most efficiently accomplished through an open source framework. Here we present the Experiment Factory, an open source framework for the development and deployment of web-based experiments. The modular infrastructure includes experiments, virtual machines for local or cloud deployment, and an application to drive these components and provide developers with functions and tools for further extension. We release this infrastructure with a deployment (http://www.expfactory.org) that researchers are currently using to run a set of over 80 standardized web-based experiments on Amazon Mechanical Turk. By providing open source tools for both deployment and development, this novel infrastructure holds promise to bring reproducibility to the administration of experiments, and accelerate scientific progress by providing a shared community resource of psychological paradigms.

  2. The Experiment Factory: Standardizing Behavioral Experiments

    PubMed Central

    Sochat, Vanessa V.; Eisenberg, Ian W.; Enkavi, A. Zeynep; Li, Jamie; Bissett, Patrick G.; Poldrack, Russell A.

    2016-01-01

    The administration of behavioral and experimental paradigms for psychology research is hindered by lack of a coordinated effort to develop and deploy standardized paradigms. While several frameworks (Mason and Suri, 2011; McDonnell et al., 2012; de Leeuw, 2015; Lange et al., 2015) have provided infrastructure and methods for individual research groups to develop paradigms, missing is a coordinated effort to develop paradigms linked with a system to easily deploy them. This disorganization leads to redundancy in development, divergent implementations of conceptually identical tasks, disorganized and error-prone code lacking documentation, and difficulty in replication. The ongoing reproducibility crisis in psychology and neuroscience research (Baker, 2015; Open Science Collaboration, 2015) highlights the urgency of this challenge: reproducible research in behavioral psychology is conditional on deployment of equivalent experiments. A large, accessible repository of experiments for researchers to develop collaboratively is most efficiently accomplished through an open source framework. Here we present the Experiment Factory, an open source framework for the development and deployment of web-based experiments. The modular infrastructure includes experiments, virtual machines for local or cloud deployment, and an application to drive these components and provide developers with functions and tools for further extension. We release this infrastructure with a deployment (http://www.expfactory.org) that researchers are currently using to run a set of over 80 standardized web-based experiments on Amazon Mechanical Turk. By providing open source tools for both deployment and development, this novel infrastructure holds promise to bring reproducibility to the administration of experiments, and accelerate scientific progress by providing a shared community resource of psychological paradigms. PMID:27199843

  3. Robot Training Through Incremental Learning

    DTIC Science & Technology

    2011-04-18

    Turing Associates, Ann Arbor, MI 48103 ABSTRACT The real world is too complex and variable to directly program an autonomous ground robot’s...11 th Conf. Uncertainty in Artificial Intelligence, 338-45 (1995). [6] J. Cleary and L. Trigg, “K*: An Instance-based learner using an entropic

  4. Active transportation measurement and benchmarking development : New Orleans state of active transportation report 2010.

    DOT National Transportation Integrated Search

    2012-01-01

    Over the last decade, there has been a surge in bicycle and pedestrian use in communities that have invested in active transportation infrastruc-ture and programming. While these increases show potentially promising trends, many of the cities that ha...

  5. Database in Artificial Intelligence.

    ERIC Educational Resources Information Center

    Wilkinson, Julia

    1986-01-01

    Describes a specialist bibliographic database of literature in the field of artificial intelligence created by the Turing Institute (Glasgow, Scotland) using the BRS/Search information retrieval software. The subscription method for end-users--i.e., annual fee entitles user to unlimited access to database, document provision, and printed awareness…

  6. Space-Bounded Church-Turing Thesis and Computational Tractability of Closed Systems.

    PubMed

    Braverman, Mark; Schneider, Jonathan; Rojas, Cristóbal

    2015-08-28

We report a new limitation on the ability of physical systems to perform computation--one that is based on generalizing the notion of memory, or storage space, available to the system to perform the computation. Roughly, we define memory as the maximal amount of information that the evolving system can carry from one instant to the next. We show that memory is a limiting factor in computation even in the absence of any time limitations on the evolving system--such as when considering its equilibrium regime. We call this limitation the space-bounded Church-Turing thesis (SBCT). The SBCT is supported by a simulation assertion (SA), which states that predicting the long-term behavior of bounded-memory systems is computationally tractable. In particular, one corollary of SA is an explicit bound on the computational hardness of the long-term behavior of a discrete-time finite-dimensional dynamical system that is affected by noise. We prove such a bound explicitly.

  7. Spongiosa Primary Development: A Biochemical Hypothesis by Turing Patterns Formations

    PubMed Central

    López-Vaca, Oscar Rodrigo; Garzón-Alvarado, Diego Alexander

    2012-01-01

    We propose a biochemical model describing the formation of primary spongiosa architecture through a bioregulatory model by metalloproteinase 13 (MMP13) and vascular endothelial growth factor (VEGF). It is assumed that MMP13 regulates cartilage degradation and the VEGF allows vascularization and advances in the ossification front through the presence of osteoblasts. The coupling of this set of molecules is represented by reaction-diffusion equations with parameters in the Turing space, creating a stable spatiotemporal pattern that leads to the formation of the trabeculae present in the spongy tissue. Experimental evidence has shown that the MMP13 regulates VEGF formation, and it is assumed that VEGF negatively regulates MMP13 formation. Thus, the patterns obtained by ossification may represent the primary spongiosa formation during endochondral ossification. Moreover, for the numerical solution, we used the finite element method with the Newton-Raphson method to approximate partial differential nonlinear equations. Ossification patterns obtained may represent the primary spongiosa formation during endochondral ossification. PMID:23193429

  8. Space-Bounded Church-Turing Thesis and Computational Tractability of Closed Systems

    NASA Astrophysics Data System (ADS)

    Braverman, Mark; Schneider, Jonathan; Rojas, Cristóbal

    2015-08-01

We report a new limitation on the ability of physical systems to perform computation—one that is based on generalizing the notion of memory, or storage space, available to the system to perform the computation. Roughly, we define memory as the maximal amount of information that the evolving system can carry from one instant to the next. We show that memory is a limiting factor in computation even in the absence of any time limitations on the evolving system—such as when considering its equilibrium regime. We call this limitation the space-bounded Church-Turing thesis (SBCT). The SBCT is supported by a simulation assertion (SA), which states that predicting the long-term behavior of bounded-memory systems is computationally tractable. In particular, one corollary of SA is an explicit bound on the computational hardness of the long-term behavior of a discrete-time finite-dimensional dynamical system that is affected by noise. We prove such a bound explicitly.

  9. Intermolecular Structural Change for Thermoswitchable Polymeric Photosensitizer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Wooram; Park, Sin-Jung; Cho, Soojeong

    2016-08-17

A switchable photosensitizer (PS), which can be activated at a specific condition besides light, has tremendous advantages for photodynamic therapy (PDT). Herein, we developed a thermo-switchable polymeric photosensitizer (T-PPS) by conjugating PS (Pheophorbide-a, PPb-a) to a temperature-responsive polymer backbone of biocompatible hydroxypropyl cellulose (HPC). Self-quenched PS molecules linked in close proximity by pi-pi stacking in T-PPS were easily transited to an active monomeric state by the temperature-induced phase transition of polymer backbones. The temperature-responsive intermolecular interaction changes of PS molecules in T-PPS were demonstrated by synchrotron small-angle X-ray scattering (SAXS) and UV-Vis spectrophotometry. The T-PPS allowed switchable activation and a synergistically enhanced cancer cell killing effect at the hyperthermia temperature (45 °C). Our T-PPS has considerable potential not only as a new class of photomedicine in clinics but also as a biosensor based on temperature responsiveness.

  10. Numerical approaches to model perturbation fire in turing pattern formations

    NASA Astrophysics Data System (ADS)

    Campagna, R.; Brancaccio, M.; Cuomo, S.; Mazzoleni, S.; Russo, L.; Siettos, K.; Giannino, F.

    2017-11-01

Turing patterns have been observed in chemical, physical and biological systems described by coupled reaction-diffusion equations. Several models have proposed water as the causal mechanism of vegetation pattern formation, but this hypothesis is not exhaustive in some natural environments. An alternative explanation involves plant-soil negative feedback. In Marasco et al. [1] the authors explored the hypothesis that both mechanisms contribute to the formation of regular and irregular vegetation patterns. The mathematical model consists of three partial differential equations (PDEs) that account for a dynamic balance between biomass, water and toxic compounds. A numerical approach is also needed to investigate the predictions of this kind of model. In this paper we start from the mathematical model described in [1], set the model parameters such that the biomass reaches a stable spatial pattern (spots), and present preliminary studies on the occurrence of perturbing events, such as wildfire, that can affect the regularity of the biomass configuration.
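The "parameters in the Turing space" mentioned in these records refers to kinetic and diffusion parameters for which the homogeneous steady state is stable without diffusion but unstable to some spatial wavenumber with it. A hedged sketch of that linear test (not the three-equation biomass model of [1], but the standard two-species Schnakenberg kinetics with rate constant gamma = 1):

```python
import math

a, b, d = 0.1, 0.9, 10.0       # kinetic parameters; d = inhibitor/activator diffusion ratio
u0 = a + b                      # homogeneous steady state of the Schnakenberg system
v0 = b / u0**2
fu, fv = -1 + 2*u0*v0, u0**2    # Jacobian of the reaction terms at (u0, v0)
gu, gv = -2*u0*v0, -u0**2

def growth_rate(q):
    """Real part of the fastest-growing eigenvalue of the linearized
    reaction-diffusion system at spatial wavenumber q."""
    tr = (fu - q*q) + (gv - d*q*q)
    det = (fu - q*q) * (gv - d*q*q) - fv * gu
    disc = tr*tr - 4*det
    return (tr + math.sqrt(disc)) / 2 if disc >= 0 else tr / 2

qs = [i * 0.01 for i in range(1, 200)]
best_q = max(qs, key=growth_rate)
# Stable at q = 0 (no diffusion), unstable at some q > 0: a Turing instability.
print(growth_rate(0.0) < 0, growth_rate(best_q) > 0)  # → True True
```

The fastest-growing wavenumber `best_q` sets the characteristic spacing of the emerging pattern, which is the spot wavelength such models produce at steady state.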

  11. To Merge Or Not to Merge: A Survey Of Arab Movements Toward Socio-Political Union

    DTIC Science & Technology

    1974-01-01

Correlation of this paradigm with the events of Islamic history makes it possible to isolate those points at which myth becomes fact. Second, the study... wars, the Persians--became islamicized and, to a certain extent, arabized. This was accomplished with the gradual integration of the conquered...were powerless against the impingements of the formidable Ottoman war machine which, by the sixteenth century, had engulfed the Arab world from its

  12. Foundations and Emerging Paradigms for Computing in Living Cells.

    PubMed

    Ma, Kevin C; Perli, Samuel D; Lu, Timothy K

    2016-02-27

    Genetic circuits, composed of complex networks of interacting molecular machines, enable living systems to sense their dynamic environments, perform computation on the inputs, and formulate appropriate outputs. By rewiring and expanding these circuits with novel parts and modules, synthetic biologists have adapted living systems into vibrant substrates for engineering. Diverse paradigms have emerged for designing, modeling, constructing, and characterizing such artificial genetic systems. In this paper, we first provide an overview of recent advances in the development of genetic parts and highlight key engineering approaches. We then review the assembly of these parts into synthetic circuits from the perspectives of digital and analog logic, systems biology, and metabolic engineering, three areas of particular theoretical and practical interest. Finally, we discuss notable challenges that the field of synthetic biology still faces in achieving reliable and predictable forward-engineering of artificial biological circuits. Copyright © 2016. Published by Elsevier Ltd.

  13. Model and system learners, optimal process constructors and kinetic theory-based goal-oriented design: A new paradigm in materials and processes informatics

    NASA Astrophysics Data System (ADS)

    Abisset-Chavanne, Emmanuelle; Duval, Jean Louis; Cueto, Elias; Chinesta, Francisco

    2018-05-01

Traditionally, Simulation-Based Engineering Sciences (SBES) has relied on the use of static data inputs (model parameters, initial or boundary conditions, … obtained from adequate experiments) to perform simulations. A new paradigm in the field of Applied Sciences and Engineering has emerged in the last decade. Dynamic Data-Driven Application Systems [9, 10, 11, 12, 22] allow the linkage of simulation tools with measurement devices for real-time control of simulations and applications, entailing the ability to dynamically incorporate additional data into an executing application, and in reverse, the ability of an application to dynamically steer the measurement process. It is in that context that traditional "digital-twins" are giving rise to a new generation of goal-oriented data-driven application systems, also known as "hybrid-twins", embracing models based on physics and models exclusively based on data adequately collected and assimilated for filling the gap between usual model predictions and measurements. Within this framework new methodologies based on model learners, machine learning and kinetic goal-oriented design are defining a new paradigm in materials, processes and systems engineering.

  14. Sequenced subjective accents for brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Vlek, R. J.; Schaefer, R. S.; Gielen, C. C. A. M.; Farquhar, J. D. R.; Desain, P.

    2011-06-01

    Subjective accenting is a cognitive process in which identical auditory pulses at an isochronous rate turn into the percept of an accenting pattern. This process can be voluntarily controlled, making it a candidate for communication from human user to machine in a brain-computer interface (BCI) system. In this study we investigated whether subjective accenting is a feasible paradigm for BCI and how its time-structured nature can be exploited for optimal decoding from non-invasive EEG data. Ten subjects perceived and imagined different metric patterns (two-, three- and four-beat) superimposed on a steady metronome. With an offline classification paradigm, we classified imagined accented from non-accented beats on a single trial (0.5 s) level with an average accuracy of 60.4% over all subjects. We show that decoding of imagined accents is also possible with a classifier trained on perception data. Cyclic patterns of accents and non-accents were successfully decoded with a sequence classification algorithm. Classification performances were compared by means of bit rate. Performance in the best scenario translates into an average bit rate of 4.4 bits min−1 over subjects, which makes subjective accenting a promising paradigm for an online auditory BCI.
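    The bit-rate figures quoted in this abstract can be reproduced in outline. The abstract does not state which bit-rate definition was used; a common choice in the BCI literature is the Wolpaw information-transfer rate, sketched below for a binary accent/non-accent decision at the reported 60.4% single-trial accuracy. (The 4.4 bits min−1 best-scenario figure additionally relied on sequence decoding, so a naive per-trial estimate is expectedly lower.)

```python
import math

def wolpaw_bits_per_selection(n_classes: int, accuracy: float) -> float:
    """Bits conveyed per selection under the Wolpaw ITR formula."""
    if accuracy <= 1.0 / n_classes:
        return 0.0          # at or below chance level, no information
    if accuracy >= 1.0:
        return math.log2(n_classes)
    p = accuracy
    return (math.log2(n_classes)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n_classes - 1)))

def bit_rate_per_minute(n_classes: int, accuracy: float, trial_seconds: float) -> float:
    """Convert per-selection information into bits per minute."""
    return (60.0 / trial_seconds) * wolpaw_bits_per_selection(n_classes, accuracy)

# Binary accented/non-accented decisions, 60.4% accuracy, 0.5 s trials:
print(bit_rate_per_minute(2, 0.604, 0.5))
```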

  15. Learning With Mixed Hard/Soft Pointwise Constraints.

    PubMed

    Gnecco, Giorgio; Gori, Marco; Melacci, Stefano; Sanguineti, Marcello

    2015-09-01

    A learning paradigm is proposed and investigated, in which the classical framework of learning from examples is enhanced by the introduction of hard pointwise constraints, i.e., constraints imposed on a finite set of examples that cannot be violated. Such constraints arise, e.g., when requiring coherent decisions of classifiers acting on different views of the same pattern. The classical examples of supervised learning, which can be violated at the cost of some penalization (quantified by the choice of a suitable loss function) play the role of soft pointwise constraints. Constrained variational calculus is exploited to derive a representer theorem that provides a description of the functional structure of the optimal solution to the proposed learning paradigm. It is shown that such an optimal solution can be represented in terms of a set of support constraints, which generalize the concept of support vectors and open the doors to a novel learning paradigm, called support constraint machines. The general theory is applied to derive the representation of the optimal solution to the problem of learning from hard linear pointwise constraints combined with soft pointwise constraints induced by supervised examples. In some cases, closed-form optimal solutions are obtained.
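    The mix of hard and soft pointwise constraints described above can be illustrated, for the special case of a linear model with squared soft losses and linear hard constraints, by solving the KKT system of a constrained least-squares problem. The data and function names below are invented for illustration; the paper works in a far more general variational setting. Hard examples are enforced exactly, while soft examples only enter a penalized loss:

```python
import numpy as np

def fit_soft_hard(X_soft, y_soft, X_hard, y_hard, lam=1e-3):
    """Linear model f(x) = w.x + b fitted from soft examples (squared loss,
    may be violated) plus hard pointwise constraints f(x_i) = y_i that are
    enforced exactly through the KKT conditions."""
    A = np.hstack([X_soft, np.ones((len(X_soft), 1))])   # soft design matrix
    C = np.hstack([X_hard, np.ones((len(X_hard), 1))])   # hard constraint rows
    n = A.shape[1]
    # KKT system: stationarity of the penalized loss plus exact constraints.
    K = np.block([[2.0 * (A.T @ A + lam * np.eye(n)), C.T],
                  [C, np.zeros((len(C), len(C)))]])
    rhs = np.concatenate([2.0 * A.T @ y_soft, y_hard])
    sol = np.linalg.solve(K, rhs)
    return sol[:n - 1], sol[n - 1]                       # weights, bias

# Noisy soft examples around y = 2x, plus one hard constraint f(0) = 1.
w, b = fit_soft_hard(np.array([[1.0], [2.0], [3.0]]),
                     np.array([2.0, 4.0, 6.0]),
                     np.array([[0.0]]),
                     np.array([1.0]))
print(w, b)   # the hard constraint pins b to exactly 1.0
```

The Lagrange multipliers in the KKT solve loosely mirror the role of the paper's "support constraints": a hard example shapes the solution even when the soft examples alone would never satisfy it.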

  16. File System Virtual Appliances: Portable File System Implementations

    DTIC Science & Technology

    2009-05-01

    Mobile Computing Systems and Applications, Santa Cruz, CA, 1994. IEEE. [10] Michael Eisler, Peter Corbett, Michael Kazar, Daniel S. Nydick, and...Gingell, Joseph P. Moran, and William A. Shannon. Virtual Memory Architecture in SunOS. In USENIX Summer Conference, pages 81–94, Berkeley, CA, 1987

  17. Modeling Interfacial Thermal Boundary Conductance of Engineered Interfaces

    DTIC Science & Technology

    2014-08-31

    melting/recrystallization of the subsurface Ag/Cu interface. Observed the formation of a novel, lattice-mismatched interfacial microstructure...calculations were converged within 1 × 10−4 Ryd with respect to wave function cutoff energy, energy density cutoff, and k-point sampling. The A-EAM

  18. User-centered design in brain-computer interfaces-a case study.

    PubMed

    Schreuder, Martijn; Riccio, Angela; Risetti, Monica; Dähne, Sven; Ramsay, Andrew; Williamson, John; Mattia, Donatella; Tangermann, Michael

    2013-10-01

    The array of available brain-computer interface (BCI) paradigms has continued to grow, and so has the corresponding set of machine learning methods which are at the core of BCI systems. The latter have evolved to provide more robust data analysis solutions, and as a consequence the proportion of healthy BCI users who can use a BCI successfully is growing. With this development the chances have increased that the needs and abilities of specific patients, the end-users, can be covered by an existing BCI approach. However, most end-users who have experienced the use of a BCI system at all have encountered a single paradigm only. This paradigm is typically the one that is being tested in the study that the end-user happens to be enrolled in, along with other end-users. Though this corresponds to the preferred study arrangement for basic research, it does not ensure that the end-user experiences a working BCI. In this study, a different approach was taken: that of user-centered design, the prevailing process in traditional assistive technology. Given an individual user with a particular clinical profile, several available BCI approaches are tested and - if necessary - adapted to him/her until a suitable BCI system is found. Described is the case of a 48-year-old woman who suffered from an ischemic brain stem stroke, leading to a severe motor and communication deficit. She was enrolled in studies with two different BCI systems before a suitable system was found. The first was an auditory event-related potential (ERP) paradigm and the second a visual ERP paradigm, both of which are established in the literature. The auditory paradigm did not work successfully, despite favorable preconditions. The visual paradigm worked flawlessly, as found over several sessions. This discrepancy in performance can possibly be explained by the user's clinical deficit in several key neuropsychological indicators, such as attention and working memory. While the auditory paradigm relies on both categories, the visual paradigm could be used with lower cognitive workload. Besides attention and working memory, several other neurophysiological and neuropsychological indicators - and the role they play in the BCIs at hand - are discussed. The user's performance on the first BCI paradigm would typically have excluded her from further ERP-based BCI studies. However, this study clearly shows that, with the numerous paradigms now at our disposal, the pursuit of a functioning BCI system should not be stopped after an initial failed attempt. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.

  19. Iterative free-energy optimization for recurrent neural networks (INFERNO).

    PubMed

    Pitti, Alexandre; Gaussier, Philippe; Quoy, Mathias

    2017-01-01

    The intra-parietal lobe coupled with the Basal Ganglia forms a working memory that demonstrates strong planning capabilities for generating robust yet flexible neuronal sequences. Neurocomputational models, however, often fail to control long-range neural synchrony in recurrent spiking networks due to spontaneous activity. In a novel framework based on the free-energy principle, we propose to see the problem of spike synchrony as an optimization problem over the neurons' sub-threshold activity for the generation of long neuronal chains. Using a stochastic gradient descent, a reinforcement signal (presumably dopaminergic) evaluates the quality of one input vector to move the recurrent neural network to a desired activity; depending on the error made, this input vector is strengthened to hill-climb the gradient or elicited to search for another solution. This vector can then be learned by an associative memory as a model of the basal ganglia to control the recurrent neural network. Experiments on habit learning and on sequence retrieval demonstrate the capabilities of the dual system to generate very long and precise spatio-temporal sequences, above two hundred iterations. Its features are then applied to the sequential planning of arm movements. In line with neurobiological theories, we discuss its relevance for modeling the cortico-basal working memory to initiate flexible goal-directed neuronal chains of causation and its relation to novel architectures such as Deep Networks, Neural Turing Machines and the Free-Energy Principle.
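    The reward-guided search over input vectors that the abstract describes can be caricatured in a few lines: perturb a candidate input, roll out the recurrent network, and keep the perturbation only when it moves the activity closer to the target. Everything below (a rate-based tanh network, Gaussian perturbations, a Euclidean error as a stand-in for the reinforcement signal) is an illustrative simplification, not the paper's spiking model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins: a fixed random rate-based network and a target
# activity vector (the paper uses spiking networks and dopaminergic reward).
W = rng.normal(scale=0.5, size=(20, 20))
target = rng.uniform(size=20)

def rollout(u, steps=10):
    """Drive the recurrent network with input vector u; return final activity."""
    x = np.tanh(u)
    for _ in range(steps):
        x = np.tanh(W @ x + u)
    return x

def error(u):
    return np.linalg.norm(rollout(u) - target)

# Reward-guided hill-climbing over the input vector: keep a perturbation
# only when it moves the network closer to the desired activity.
u = np.zeros(20)
best = error(u)
for _ in range(2000):
    trial = u + rng.normal(scale=0.05, size=20)
    e = error(trial)
    if e < best:
        u, best = trial, e
print(best)
```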

  20. The effects of beta-endorphin: state change modification.

    PubMed

    Veening, Jan G; Barendregt, Henk P

    2015-01-29

    Beta-endorphin (β-END) is an opioid neuropeptide which has an important role in the development of hypotheses concerning the non-synaptic or paracrine communication of brain messages. This kind of communication between neurons has been designated volume transmission (VT) to differentiate it clearly from synaptic communication. VT occurs over short as well as long distances via the extracellular space in the brain, as well as via the cerebrospinal fluid (CSF) flowing through the ventricular spaces inside the brain and the arachnoid space surrounding the central nervous system (CNS). To understand how β-END can have specific behavioral effects, we use the notion of a behavioral state, inspired by the concept of a machine state coming from Turing (Proc London Math Soc, Series 2, 42:230-265, 1937). In section 1.4 the sequential organization of male rat behavior is explained, showing that an animal is not free to switch into another state at any given moment. Funneling constraints restrict the number of possible behavioral transitions in specific phases, while at other moments in the sequence the transition to other behavioral states is almost completely open. The effects of β-END on behaviors like food intake and sexual behavior, and the mechanisms involved in reward, meditation and pain control are discussed in detail. The effects on the sequential organization of behavior and on state transitions dominate the description of these effects.
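    The machine-state analogy can be made concrete as a transition table in which "funneling" phases admit very few successor states while open phases admit many. The states and transitions below are invented for illustration and are not taken from the paper:

```python
# Behavioral states as a finite-state machine: "funneling" phases allow few
# successors, open phases allow many. States/transitions are illustrative
# inventions, not the paper's ethogram.
TRANSITIONS = {
    "explore":      {"groom", "feed", "approach", "rest"},  # open phase
    "approach":     {"mount"},                              # funneling constraint
    "mount":        {"intromission"},                       # funneling constraint
    "intromission": {"explore", "rest"},
    "groom":        {"explore", "feed", "rest"},
    "feed":         {"explore", "rest"},
    "rest":         {"explore"},
}

def can_switch(current: str, proposed: str) -> bool:
    """An animal is not free to switch into any state at any given moment."""
    return proposed in TRANSITIONS.get(current, set())

print(can_switch("explore", "feed"), can_switch("approach", "rest"))  # True False
```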

  1. Spiking Neural P Systems with Communication on Request.

    PubMed

    Pan, Linqiang; Păun, Gheorghe; Zhang, Gexiang; Neri, Ferrante

    2017-12-01

    Spiking Neural P Systems are neural system models characterized by the fact that each neuron mimics a biological cell and the communication between neurons is based on spikes. In the Spiking Neural P systems investigated so far, the application of evolution rules depends on the contents of a neuron (checked by means of a regular expression). In these P systems, a specified number of spikes are consumed and a specified number of spikes are produced, and then sent to each of the neurons linked by a synapse to the evolving neuron. In the present work, a novel communication strategy among neurons of Spiking Neural P Systems is proposed. In the resulting models, called Spiking Neural P Systems with Communication on Request, the spikes are requested from neighboring neurons, depending on the contents of the neuron (still checked by means of a regular expression). Unlike the traditional Spiking Neural P systems, no spikes are consumed or created: the spikes are only moved along synapses and replicated (when two or more neurons request the contents of the same neuron). The Spiking Neural P Systems with Communication on Request are proved to be computationally universal, that is, equivalent to Turing machines as long as two types of spikes are used. Following this work, further research questions are listed as open problems.

  2. European Scientific Notes. Volume 38, Number 8.

    DTIC Science & Technology

    1984-08-01

    is done mechanics, environmentally assisted using a Dugdale-Bilby strip yielding fracture, and oxidation in CO2. model (see Dowling and Townley, 1975...larger than the load ture, ASTM-STP668 (1979), 581. required to initiate cracking (this is Dowling, A.R., and C.H.A. Townley, why most of the failure

  3. Theory-Guided Technology in Computer Science.

    ERIC Educational Resources Information Center

    Ben-Ari, Mordechai

    2001-01-01

    Examines the history of major achievements in computer science as portrayed by winners of the prestigious Turing award and identifies a possibly unique activity called Theory-Guided Technology (TGT). Researchers develop TGT by using theoretical results to create practical technology. Discusses reasons why TGT is practical in computer science and…

  4. Annual Research Progress Report. 1 October 1977-30 September 1978.

    DTIC Science & Technology

    1978-09-30

    requiring craniotomy, one open skull fracture, one cervical spine fracture, two quadriplegic patients and seven patients with an acute brain...attempt of aspiration is made. Although the blood is available by gravity drainage, this is not ideal for short collection periods. Further work on the

  5. Hydraulic Diagnostic Monitoring System.

    DTIC Science & Technology

    1981-03-02

    devices were utilized. In one pneumatic circuit, a temperature-compensated pressure switch performed as predicted over a broad temperature range. In...installation (p. 41). NADC 81073-60, ILLUSTRATIONS (Cont.): Fig. 28, Temperature-compensated pressure switch (p. 42); Fig. 29, Plot of pressure vs temperature for nitrogen (p. 43); Fig. 30, Temperature-compensated pressure switch: diagrammatic circuit

  6. Antiviral Activity of a Small-Molecule Inhibitor of Filovirus Infection

    DTIC Science & Technology

    2010-05-01

    one or two heterocyclic aromatic structures (i.e., indole, benzofuran, benzimidazole, or benzothiophene) connected via an aliphatic linker or...tion of compound hits by high-throughput analysis. Using a ZEBOV-GFP assay, 2-(2-(5-(amino(imino)methyl)-1-benzofuran-2-yl)vinyl)-1H-benzimidazole-5

  7. Mammalian Toxicological Evaluation of TNT Wastewaters. Volume I. Chemistry Studies

    DTIC Science & Technology

    1978-03-01

    possessing the structure of II have been reported to be effective algicides, so N-morpholinoacetonitrile may arise from the decomposition of such...Mixtures as Algicides, Bactericides, and Fungicides. Chem. Abstr. 85, 100855K (1975). 6. D. Graetz, G. Chesters, T. C. Daniels, L. W. Newland, and G

  8. Mass Casualty Response of a Modern Deployed Head and Neck Surgical Team

    DTIC Science & Technology

    2010-07-01

    fractures (maxilla, mandible, frontal sinus), and miscellaneous injuries such as a parotid duct injury. Based on review of the operative log, 6 patients...trained to consider subtle head and neck injuries such as facial nerve or parotid duct transection. The flexibility to operate alongside other trauma

  9. History of the Pacific Ocean Division Corps of Engineers 1957-1967

    DTIC Science & Technology

    1972-01-01

    mound barrier; designed by HED civil engineer Robert Q. Palmer, these concrete three-bar structures provided a sturdy substitute for scarce rock...that metal buildings would require high maintenance costs, while the termite problem eliminates construction in wood. Not only for these reasons

  10. Radiation/Catalytic Augmented Combustion.

    DTIC Science & Technology

    1982-05-01

    enhanced combustion processes, utilizing pulsed and continuous VUV light-sources. Similarly, the catalytic technique has provided efficient combustion...tures we had a pl/cx LiF lens with a focal length of 200 mm, and a MgF2 window 2 mm in thickness. Although these materials are considered to be among

  11. A View of the Combat CAS: Unifying Net-Enabled Teams

    DTIC Science & Technology

    2008-01-01

    Centric Warfare: Its Origin and Future. Proceedings. Volume 124/1/1, 139. Annapolis, MD: U.S. Naval Institute. Chomsky, Noam. 1962. Syntactic Structures...representations and discourse models (For linguistic models, see Chomsky 1962). Discourse models make explicit the structure not of sentences but of

  12. The Impact of Collaboration, Empowerment, and Choice: An Empirical Examination of the Collaborative Course Development Method

    ERIC Educational Resources Information Center

    Aiken, K. Damon; Heinze, Timothy C.; Meuter, Matthew L.; Chapman, Kenneth J.

    2017-01-01

    This research empirically tests collaborative course development (CCD)-a pedagogy presented in the 2016 "Marketing Education Review Special Issue on Teaching Innovations". A team of researchers taught experimental courses using CCD methods (employing various techniques including syllabus building, "flex-tures," free-choice…

  13. Consequences of Recent Southern Hemisphere Winter Variability on Polar Mesospheric Clouds

    DTIC Science & Technology

    2011-01-01

    summer latitudes. Recent observations of a link between the QBO and inter-hemispheric coupling (Espy et al., 2011) are also consistent with these...The role of the QBO in the inter-hemispheric coupling of summer mesospheric temperatures. Atmospheric Chemistry and Physics. 11, 495–502. Fiedler, J

  14. Quantum-chemical insights from deep tensor neural networks

    PubMed Central

    Schütt, Kristof T.; Arbabzadah, Farhad; Chmiela, Stefan; Müller, Klaus R.; Tkatchenko, Alexandre

    2017-01-01

    Learning from data has led to paradigm shifts in a multitude of disciplines, including web, text and image search, speech recognition, as well as bioinformatics. Can machine learning enable similar breakthroughs in understanding quantum many-body systems? Here we develop an efficient deep learning approach that enables spatially and chemically resolved insights into quantum-mechanical observables of molecular systems. We unify concepts from many-body Hamiltonians with purpose-designed deep tensor neural networks, which leads to size-extensive and uniformly accurate (1 kcal mol−1) predictions in compositional and configurational chemical space for molecules of intermediate size. As an example of chemical relevance, the model reveals a classification of aromatic rings with respect to their stability. Further applications of our model for predicting atomic energies and local chemical potentials in molecules, reliable isomer energies, and molecules with peculiar electronic structure demonstrate the potential of machine learning for revealing insights into complex quantum-chemical systems. PMID:28067221

  15. Embedded System for Prosthetic Control Using Implanted Neuromuscular Interfaces Accessed Via an Osseointegrated Implant.

    PubMed

    Mastinu, Enzo; Doguet, Pascal; Botquin, Yohan; Hakansson, Bo; Ortiz-Catalan, Max

    2017-08-01

    Despite the technological progress in robotics achieved in the last decades, prosthetic limbs still lack functionality, reliability, and comfort. Recently, an implanted neuromusculoskeletal interface built upon osseointegration was developed and tested in humans, namely the Osseointegrated Human-Machine Gateway. Here, we present an embedded system to exploit the advantages of this technology. Our artificial limb controller allows for bioelectric signals acquisition, processing, decoding of motor intent, prosthetic control, and sensory feedback. It includes a neurostimulator to provide direct neural feedback based on sensory information. The system was validated using real-time tasks characterization, power consumption evaluation, and myoelectric pattern recognition performance. Functionality was proven in a first pilot patient from whom results of daily usage were obtained. The system was designed to be reliably used in activities of daily living, as well as a research platform to monitor prosthesis usage and training, machine-learning-based control algorithms, and neural stimulation paradigms.

  16. Vision Systems with the Human in the Loop

    NASA Astrophysics Data System (ADS)

    Bauckhage, Christian; Hanheide, Marc; Wrede, Sebastian; Käster, Thomas; Pfeiffer, Michael; Sagerer, Gerhard

    2005-12-01

    The emerging cognitive vision paradigm deals with vision systems that apply machine learning and automatic reasoning in order to learn from what they perceive. Cognitive vision systems can rate the relevance and consistency of newly acquired knowledge, they can adapt to their environment and thus will exhibit high robustness. This contribution presents vision systems that aim at flexibility and robustness. One is tailored for content-based image retrieval; the others are cognitive vision systems that constitute prototypes of visual active memories which evaluate, gather, and integrate contextual knowledge for visual analysis. All three systems are designed to interact with human users. After discussing adaptive content-based image retrieval and object and action recognition in an office environment, we raise the issue of assessing cognitive systems. Experiences from psychologically evaluated human-machine interactions are reported and the promising potential of psychologically-based usability experiments is stressed.

  17. On the convergence of nanotechnology and Big Data analysis for computer-aided diagnosis.

    PubMed

    Rodrigues, Jose F; Paulovich, Fernando V; de Oliveira, Maria Cf; de Oliveira, Osvaldo N

    2016-04-01

    An overview is provided of the challenges involved in building computer-aided diagnosis systems capable of precise medical diagnostics based on integration and interpretation of data from different sources and formats. The availability of massive amounts of data and computational methods associated with the Big Data paradigm has brought hope that such systems may soon be available in routine clinical practices, which is not the case today. We focus on visual and machine learning analysis of medical data acquired with varied nanotech-based techniques and on methods for Big Data infrastructure. Because diagnosis is essentially a classification task, we address the machine learning techniques with supervised and unsupervised classification, making a critical assessment of the progress already made in the medical field and the prospects for the near future. We also advocate that successful computer-aided diagnosis requires a merge of methods and concepts from nanotechnology and Big Data analysis.

  18. A wearable computing platform for developing cloud-based machine learning models for health monitoring applications.

    PubMed

    Patel, Shyamal; McGinnis, Ryan S; Silva, Ikaro; DiCristofaro, Steve; Mahadevan, Nikhil; Jortberg, Elise; Franco, Jaime; Martin, Albert; Lust, Joseph; Raj, Milan; McGrane, Bryan; DePetrillo, Paolo; Aranyosi, A J; Ceruolo, Melissa; Pindado, Jesus; Ghaffari, Roozbeh

    2016-08-01

    Wearable sensors have the potential to enable clinical-grade ambulatory health monitoring outside the clinic. Technological advances have enabled development of devices that can measure vital signs with great precision and significant progress has been made towards extracting clinically meaningful information from these devices in research studies. However, translating measurement accuracies achieved in the controlled settings such as the lab and clinic to unconstrained environments such as the home remains a challenge. In this paper, we present a novel wearable computing platform for unobtrusive collection of labeled datasets and a new paradigm for continuous development, deployment and evaluation of machine learning models to ensure robust model performance as we transition from the lab to home. Using this system, we train activity classification models across two studies and track changes in model performance as we go from constrained to unconstrained settings.

  19. Sensor fusion III: 3-D perception and recognition; Proceedings of the Meeting, Boston, MA, Nov. 5-8, 1990

    NASA Technical Reports Server (NTRS)

    Schenker, Paul S. (Editor)

    1991-01-01

    The volume on data fusion from multiple sources discusses fusing multiple views, temporal analysis and 3D motion interpretation, sensor fusion and eye-to-hand coordination, and integration in human shape perception. Attention is given to surface reconstruction, statistical methods in sensor fusion, fusing sensor data with environmental knowledge, computational models for sensor fusion, and evaluation and selection of sensor fusion techniques. Topics addressed include the structure of a scene from two and three projections, optical flow techniques for moving target detection, tactical sensor-based exploration in a robotic environment, and the fusion of human and machine skills for remote robotic operations. Also discussed are K-nearest-neighbor concepts for sensor fusion, surface reconstruction with discontinuities, a sensor-knowledge-command fusion paradigm for man-machine systems, coordinating sensing and local navigation, and terrain map matching using multisensing techniques for applications to autonomous vehicle navigation.

  20. Quantum-chemical insights from deep tensor neural networks.

    PubMed

    Schütt, Kristof T; Arbabzadah, Farhad; Chmiela, Stefan; Müller, Klaus R; Tkatchenko, Alexandre

    2017-01-09

    Learning from data has led to paradigm shifts in a multitude of disciplines, including web, text and image search, speech recognition, as well as bioinformatics. Can machine learning enable similar breakthroughs in understanding quantum many-body systems? Here we develop an efficient deep learning approach that enables spatially and chemically resolved insights into quantum-mechanical observables of molecular systems. We unify concepts from many-body Hamiltonians with purpose-designed deep tensor neural networks, which leads to size-extensive and uniformly accurate (1 kcal mol−1) predictions in compositional and configurational chemical space for molecules of intermediate size. As an example of chemical relevance, the model reveals a classification of aromatic rings with respect to their stability. Further applications of our model for predicting atomic energies and local chemical potentials in molecules, reliable isomer energies, and molecules with peculiar electronic structure demonstrate the potential of machine learning for revealing insights into complex quantum-chemical systems.

  1. Artificial intelligence in healthcare: past, present and future.

    PubMed

    Jiang, Fei; Jiang, Yong; Zhi, Hui; Dong, Yi; Li, Hao; Ma, Sufeng; Wang, Yilong; Dong, Qiang; Shen, Haipeng; Wang, Yongjun

    2017-12-01

    Artificial intelligence (AI) aims to mimic human cognitive functions. It is bringing a paradigm shift to healthcare, powered by increasing availability of healthcare data and rapid progress of analytics techniques. We survey the current status of AI applications in healthcare and discuss its future. AI can be applied to various types of healthcare data (structured and unstructured). Popular AI techniques include machine learning methods for structured data, such as the classical support vector machine and neural network, and the modern deep learning, as well as natural language processing for unstructured data. Major disease areas that use AI tools include cancer, neurology and cardiology. We then review in more detail the AI applications in stroke, in the three major areas of early detection and diagnosis, treatment, as well as outcome prediction and prognosis evaluation. We conclude with discussion about pioneer AI systems, such as IBM Watson, and hurdles for real-life deployment of AI.

  2. Uncertainty Management for Diagnostics and Prognostics of Batteries using Bayesian Techniques

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar; Goebel, Kai

    2007-01-01

    Uncertainty management has always been the key hurdle faced by diagnostics and prognostics algorithms. A Bayesian treatment of this problem provides an elegant and theoretically sound approach to the modern Condition-Based Maintenance (CBM)/Prognostic Health Management (PHM) paradigm. The application of Bayesian techniques to regression and classification in the form of the Relevance Vector Machine (RVM), and to state estimation as in Particle Filters (PF), provides a powerful tool to integrate the diagnosis and prognosis of battery health. The RVM, which is a Bayesian treatment of the Support Vector Machine (SVM), is used for model identification, while the PF framework uses the learnt model, statistical estimates of noise and anticipated operational conditions to provide estimates of remaining useful life (RUL) in the form of a probability density function (PDF). This type of prognostics generates a significant value addition to the management of any operation involving electrical systems.
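    The particle-filter half of this pipeline can be sketched with a deliberately simple degradation model: capacity decays exponentially at an unknown rate, particles represent hypotheses about that rate, and the surviving particles convert directly into samples of the remaining-useful-life distribution. All constants, the decay law and the noise level below are illustrative assumptions, not the paper's battery model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative degradation model (not the paper's): capacity decays as
# C_k = C0 * exp(-r * k) with unknown rate r; failure below the threshold.
TRUE_R, C0, FAIL, SIGMA = 0.02, 1.0, 0.5, 0.01
obs = C0 * np.exp(-TRUE_R * np.arange(20)) + rng.normal(0.0, SIGMA, 20)

n = 5000
rates = rng.uniform(0.0, 0.1, n)            # particles: hypotheses about r
weights = np.ones(n) / n

for k, z in enumerate(obs):
    pred = C0 * np.exp(-rates * k)          # predicted capacity per particle
    weights *= np.exp(-0.5 * ((z - pred) / SIGMA) ** 2)
    weights /= weights.sum()
    if 1.0 / np.sum(weights ** 2) < n / 2:  # resample on low effective size
        idx = rng.choice(n, size=n, p=weights)
        rates = rates[idx] + rng.normal(0.0, 1e-4, n)
        weights = np.ones(n) / n

# Each particle's time-to-threshold minus elapsed cycles gives one sample
# from the remaining-useful-life (RUL) distribution.
rul = np.log(C0 / FAIL) / np.maximum(rates, 1e-6) - len(obs)
print(np.percentile(rul, [5, 50, 95]))
```

The percentiles summarize the RUL PDF the abstract mentions; reporting an interval rather than a point estimate is the practical payoff of the Bayesian treatment.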

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garzoglio, Gabriele

    The Fermilab Grid and Cloud Computing Department and the KISTI Global Science experimental Data hub Center are working on a multi-year Collaborative Research and Development Agreement. Building on the knowledge developed in the first year on how to provision and manage a federation of virtual machines through Cloud management systems, in this second year we expanded the work on provisioning and federation, increasing both the scale and the diversity of solutions, and we started to build on-demand services on the established fabric, introducing the paradigm of Platform as a Service to assist with the execution of scientific workflows. We have enabled scientific workflows of stakeholders to run on multiple cloud resources at the scale of 1,000 concurrent machines. The demonstrations have been in the areas of (a) Virtual Infrastructure Automation and Provisioning, (b) Interoperability and Federation of Cloud Resources, and (c) On-demand Services for Scientific Workflows.

  4. Quantum-chemical insights from deep tensor neural networks

    NASA Astrophysics Data System (ADS)

    Schütt, Kristof T.; Arbabzadah, Farhad; Chmiela, Stefan; Müller, Klaus R.; Tkatchenko, Alexandre

    2017-01-01

    Learning from data has led to paradigm shifts in a multitude of disciplines, including web, text and image search, speech recognition, as well as bioinformatics. Can machine learning enable similar breakthroughs in understanding quantum many-body systems? Here we develop an efficient deep learning approach that enables spatially and chemically resolved insights into quantum-mechanical observables of molecular systems. We unify concepts from many-body Hamiltonians with purpose-designed deep tensor neural networks, which leads to size-extensive and uniformly accurate (1 kcal mol−1) predictions in compositional and configurational chemical space for molecules of intermediate size. As an example of chemical relevance, the model reveals a classification of aromatic rings with respect to their stability. Further applications of our model for predicting atomic energies and local chemical potentials in molecules, reliable isomer energies, and molecules with peculiar electronic structure demonstrate the potential of machine learning for revealing insights into complex quantum-chemical systems.

  5. Machine learning in motion control

    NASA Technical Reports Server (NTRS)

    Su, Renjeng; Kermiche, Noureddine

    1989-01-01

    The existing methodologies for robot programming originate primarily from robotic applications to manufacturing, where uncertainties of the robots and their task environment may be minimized by repeated off-line modeling and identification. In space applications of robots, however, a higher degree of automation is required for robot programming because of the desire to minimize human intervention. We discuss a new paradigm of robot programming which is based on the concept of machine learning. The goal is to let robots practice tasks by themselves, with the operational data used to automatically improve their motion performance. The underlying mathematical problem is to solve the dynamical inverse problem by iterative methods. One of the key questions is how to ensure the convergence of the iterative process. There have been a few small steps taken toward this important approach to robot programming. We give a representative result on the convergence problem.
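    The iterative "practice makes perfect" scheme described here is essentially iterative learning control: the error observed on one trial corrects the input applied on the next, and convergence hinges on the contraction of the error recursion. A minimal scalar sketch, in which the plant and gain are invented stand-ins for unknown robot dynamics:

```python
# Iterative learning control: the robot repeats the same task, and the error
# recorded on each trial corrects the input for the next trial. The scalar
# "plant" and the gain are invented stand-ins for unknown robot dynamics.
def plant(u):
    return 0.8 * u + 0.1        # unknown to the learning loop

desired = 2.0
gamma = 0.5                      # learning gain; needs |1 - gamma * 0.8| < 1
u = 0.0
for trial in range(50):
    error = desired - plant(u)   # operational data from this practice run
    u += gamma * error           # use it to improve the next run

print(abs(desired - plant(u)))   # residual error after 50 practice runs
```

Here the error contracts by the factor |1 − γ·0.8| = 0.6 per trial, so fifty trials drive it to numerical zero; choosing γ too large breaks the contraction, which is exactly the convergence question the abstract raises.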

  6. Artificial intelligence in healthcare: past, present and future

    PubMed Central

    Jiang, Fei; Jiang, Yong; Zhi, Hui; Dong, Yi; Li, Hao; Ma, Sufeng; Wang, Yilong; Dong, Qiang; Shen, Haipeng; Wang, Yongjun

    2017-01-01

Artificial intelligence (AI) aims to mimic human cognitive functions. It is bringing a paradigm shift to healthcare, powered by the increasing availability of healthcare data and rapid progress of analytics techniques. We survey the current status of AI applications in healthcare and discuss its future. AI can be applied to various types of healthcare data (structured and unstructured). Popular AI techniques include machine learning methods for structured data, such as the classical support vector machine and neural network, and the modern deep learning, as well as natural language processing for unstructured data. Major disease areas that use AI tools include cancer, neurology and cardiology. We then review in more detail the AI applications in stroke, in the three major areas of early detection and diagnosis, treatment, and outcome prediction and prognosis evaluation. We conclude with a discussion of pioneering AI systems, such as IBM Watson, and hurdles for real-life deployment of AI. PMID:29507784

  7. A Machine-to-Machine protocol benchmark for eHealth applications - Use case: Respiratory rehabilitation.

    PubMed

    Talaminos-Barroso, Alejandro; Estudillo-Valderrama, Miguel A; Roa, Laura M; Reina-Tosina, Javier; Ortega-Ruiz, Francisco

    2016-06-01

M2M (Machine-to-Machine) communications represent one of the main pillars of the new paradigm of the Internet of Things (IoT) and are opening new opportunities for the eHealth business. Nevertheless, the large number of M2M protocols currently available hinders the selection of a suitable solution that satisfies the requirements that eHealth applications can demand. Our first aim is to develop a tool that provides a benchmarking analysis in order to objectively select among the most relevant M2M protocols for eHealth solutions; our second, to validate the tool with a particular use case: respiratory rehabilitation. A software tool, called Distributed Computing Framework (DFC), has been designed and developed to execute the benchmarking tests and facilitate deployment in environments with a large number of machines, independently of the protocol and performance metrics selected. DDS, MQTT, CoAP, JMS, AMQP and XMPP protocols were evaluated considering different specific performance metrics, including CPU usage, memory usage, bandwidth consumption, latency and jitter. The results obtained allowed us to validate a use case: respiratory rehabilitation of chronic obstructive pulmonary disease (COPD) patients in two scenarios with different types of requirements: home-based and ambulatory. The results of the benchmark comparison can guide eHealth developers in the choice of M2M technologies. In this regard, the framework presented is a simple and powerful tool for the deployment of benchmark tests under specific environments and conditions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
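Two of the metrics listed, latency and jitter, can be computed from matched send/receive timestamps. The sketch below uses hypothetical data and takes jitter as the mean absolute difference of consecutive one-way delays (one common definition, not necessarily the paper's); it only indicates the kind of per-protocol statistics such a benchmark harness would aggregate:

```python
import statistics

def latency_stats(send_times, recv_times):
    """Mean one-way latency and jitter from matched timestamp pairs."""
    latencies = [r - s for s, r in zip(send_times, recv_times)]
    mean_latency = statistics.mean(latencies)
    # Jitter here: mean absolute difference of consecutive delays.
    jitter = statistics.mean(
        abs(b - a) for a, b in zip(latencies, latencies[1:])
    )
    return mean_latency, jitter

# Hypothetical timestamps (seconds) for four messages of one protocol.
send = [0.0, 1.0, 2.0, 3.0]
recv = [0.05, 1.07, 2.04, 3.06]
mean_latency, jitter = latency_stats(send, recv)
```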

  8. ProDiGe: Prioritization Of Disease Genes with multitask machine learning from positive and unlabeled examples

    PubMed Central

    2011-01-01

Background Elucidating the genetic basis of human diseases is a central goal of genetics and molecular biology. While traditional linkage analysis and modern high-throughput techniques often provide long lists of tens or hundreds of disease gene candidates, the identification of disease genes among the candidates remains time-consuming and expensive. Efficient computational methods are therefore needed to prioritize genes within the list of candidates, by exploiting the wealth of information available about the genes in various databases. Results We propose ProDiGe, a novel algorithm for Prioritization of Disease Genes. ProDiGe implements a novel machine learning strategy based on learning from positive and unlabeled examples, which makes it possible to integrate various sources of information about the genes, to share information about known disease genes across diseases, and to perform genome-wide searches for new disease genes. Experiments on real data show that ProDiGe outperforms state-of-the-art methods for the prioritization of genes in human diseases. Conclusions ProDiGe implements a new machine learning paradigm for gene prioritization, which could help the identification of new disease genes. It is freely available at http://cbio.ensmp.fr/prodige. PMID:21977986
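The positive-unlabeled idea ProDiGe builds on can be sketched in miniature: train an ordinary classifier on known positives versus unlabeled examples, then rank the unlabeled set by its scores (under the common "selected completely at random" assumption, this ranking matches the one from the true posterior). The synthetic "gene" features and the plain logistic model below are illustrative, not ProDiGe itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic features: true disease genes cluster away from the rest.
pos = rng.normal(loc=3.0, size=(20, 2))      # known (labeled) disease genes
hidden = rng.normal(loc=3.0, size=(5, 2))    # unlabeled true disease genes
noise = rng.normal(loc=-3.0, size=(75, 2))   # unlabeled non-disease genes
unlabeled = np.vstack([hidden, noise])

# PU step: fit a logistic model with positives vs. unlabeled-as-negatives.
X = np.vstack([pos, unlabeled])
y = np.concatenate([np.ones(len(pos)), np.zeros(len(unlabeled))])
Xb = np.hstack([X, np.ones((len(X), 1))])    # add a bias column
w = np.zeros(Xb.shape[1])
for _ in range(2000):                        # batch gradient descent
    p = 1.0 / (1.0 + np.exp(-np.clip(Xb @ w, -30, 30)))
    w -= 0.1 * Xb.T @ (p - y) / len(y)

# Prioritize candidates: rank the unlabeled genes by classifier score.
Ub = np.hstack([unlabeled, np.ones((len(unlabeled), 1))])
scores = 1.0 / (1.0 + np.exp(-np.clip(Ub @ w, -30, 30)))
ranking = np.argsort(-scores)   # hidden positives should rank first
```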

  9. Laser Direct Metal Deposition of 2024 Al Alloy: Trace Geometry Prediction via Machine Learning.

    PubMed

    Caiazzo, Fabrizia; Caggiano, Alessandra

    2018-03-19

    Laser direct metal deposition is an advanced additive manufacturing technology suitably applicable in maintenance, repair, and overhaul of high-cost products, allowing for minimal distortion of the workpiece, reduced heat affected zones, and superior surface quality. Special interest is growing for the repair and coating of 2024 aluminum alloy parts, extensively utilized for a wide range of applications in the automotive, military, and aerospace sectors due to its excellent plasticity, corrosion resistance, electric conductivity, and strength-to-weight ratio. A critical issue in the laser direct metal deposition process is related to the geometrical parameters of the cross-section of the deposited metal trace that should be controlled to meet the part specifications. In this research, a machine learning approach based on artificial neural networks is developed to find the correlation between the laser metal deposition process parameters and the output geometrical parameters of the deposited metal trace produced by laser direct metal deposition on 5-mm-thick 2024 aluminum alloy plates. The results show that the neural network-based machine learning paradigm is able to accurately estimate the appropriate process parameters required to obtain a specified geometry for the deposited metal trace.
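The kind of mapping described, from process parameters to deposited-trace geometry via an artificial neural network, can be sketched with a one-hidden-layer network trained by backpropagation. The inputs, the synthetic response, and all hyperparameters below are illustrative stand-ins, not the paper's network:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical normalized process inputs (e.g. laser power, scan speed,
# powder feed rate) and a synthetic "trace width" response to learn.
X = rng.uniform(-1, 1, size=(200, 3))
y = 0.5 * X[:, 0] - 0.3 * X[:, 1] + 0.2 * X[:, 0] * X[:, 2]

# One-hidden-layer tanh network trained by batch gradient descent on MSE.
W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=8);      b2 = 0.0

losses = []
for _ in range(1000):
    h = np.tanh(X @ W1 + b1)                 # hidden activations
    pred = h @ W2 + b2
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagation through the two layers.
    gW2 = h.T @ err / len(y); gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)
    gW1 = X.T @ dh / len(y);  gb1 = dh.mean(axis=0)
    W2 -= 0.3 * gW2; b2 -= 0.3 * gb2
    W1 -= 0.3 * gW1; b1 -= 0.3 * gb1
```

The training loss should fall well below the variance of the target, showing the network capturing the parameter-to-geometry correlation the abstract describes.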

  10. Laser Direct Metal Deposition of 2024 Al Alloy: Trace Geometry Prediction via Machine Learning

    PubMed Central

    2018-01-01

    Laser direct metal deposition is an advanced additive manufacturing technology suitably applicable in maintenance, repair, and overhaul of high-cost products, allowing for minimal distortion of the workpiece, reduced heat affected zones, and superior surface quality. Special interest is growing for the repair and coating of 2024 aluminum alloy parts, extensively utilized for a wide range of applications in the automotive, military, and aerospace sectors due to its excellent plasticity, corrosion resistance, electric conductivity, and strength-to-weight ratio. A critical issue in the laser direct metal deposition process is related to the geometrical parameters of the cross-section of the deposited metal trace that should be controlled to meet the part specifications. In this research, a machine learning approach based on artificial neural networks is developed to find the correlation between the laser metal deposition process parameters and the output geometrical parameters of the deposited metal trace produced by laser direct metal deposition on 5-mm-thick 2024 aluminum alloy plates. The results show that the neural network-based machine learning paradigm is able to accurately estimate the appropriate process parameters required to obtain a specified geometry for the deposited metal trace. PMID:29562682

  11. Hidden physics models: Machine learning of nonlinear partial differential equations

    NASA Astrophysics Data System (ADS)

    Raissi, Maziar; Karniadakis, George Em

    2018-03-01

    While there is currently a lot of enthusiasm about "big data", useful data is usually "small" and expensive to acquire. In this paper, we present a new paradigm of learning partial differential equations from small data. In particular, we introduce hidden physics models, which are essentially data-efficient learning machines capable of leveraging the underlying laws of physics, expressed by time dependent and nonlinear partial differential equations, to extract patterns from high-dimensional data generated from experiments. The proposed methodology may be applied to the problem of learning, system identification, or data-driven discovery of partial differential equations. Our framework relies on Gaussian processes, a powerful tool for probabilistic inference over functions, that enables us to strike a balance between model complexity and data fitting. The effectiveness of the proposed approach is demonstrated through a variety of canonical problems, spanning a number of scientific domains, including the Navier-Stokes, Schrödinger, Kuramoto-Sivashinsky, and time dependent linear fractional equations. The methodology provides a promising new direction for harnessing the long-standing developments of classical methods in applied mathematics and mathematical physics to design learning machines with the ability to operate in complex domains without requiring large quantities of data.
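The Gaussian-process machinery the abstract relies on reduces, for noisy regression, to a closed-form posterior. A minimal sketch with a squared-exponential kernel and illustrative data standing in for scarce experimental measurements:

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    """Gaussian-process posterior mean and variance at test inputs."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_test, X_train)
    Kss = rbf_kernel(X_test, X_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# "Small data": 8 noisy-free observations of sin(x) stand in for scarce
# experimental measurements.
X_train = np.linspace(0.0, 2.0 * np.pi, 8)
y_train = np.sin(X_train)
X_test = np.array([np.pi / 2, np.pi])
mean, var = gp_posterior(X_train, y_train, X_test)
```

The posterior variance is what lets such models balance fit against complexity and remain honest about uncertainty away from the data.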

  12. Vienna FORTRAN: A FORTRAN language extension for distributed memory multiprocessors

    NASA Technical Reports Server (NTRS)

    Chapman, Barbara; Mehrotra, Piyush; Zima, Hans

    1991-01-01

    Exploiting the performance potential of distributed memory machines requires a careful distribution of data across the processors. Vienna FORTRAN is a language extension of FORTRAN which provides the user with a wide range of facilities for such mapping of data structures. However, programs in Vienna FORTRAN are written using global data references. Thus, the user has the advantage of a shared memory programming paradigm while explicitly controlling the placement of data. The basic features of Vienna FORTRAN are presented along with a set of examples illustrating the use of these features.
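The core of such data-distribution facilities is a mapping from global indices to (processor, local index) pairs. A block distribution, the simplest case, can be sketched as follows (the document's language is a Fortran dialect; this Python fragment is only a language-neutral illustration of the mapping, not Vienna FORTRAN syntax):

```python
def block_size(n, p):
    """Ceiling block size for an n-element array over p processors."""
    return -(-n // p)

def block_owner(i, n, p):
    """Processor owning global index i under a block distribution."""
    return i // block_size(n, p)

def to_local(i, n, p):
    """Translate a global index into a (processor, local index) pair."""
    b = block_size(n, p)
    return i // b, i % b

# A 10-element array over 4 processors: blocks of 3, last block short.
owners = [block_owner(i, 10, 4) for i in range(10)]
```

The compiler's job, hidden from the user behind global references, is to apply exactly this kind of translation and insert communication when an access falls on another processor's block.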

  13. Closed-loop brain training: the science of neurofeedback.

    PubMed

    Sitaram, Ranganatha; Ros, Tomas; Stoeckel, Luke; Haller, Sven; Scharnowski, Frank; Lewis-Peacock, Jarrod; Weiskopf, Nikolaus; Blefari, Maria Laura; Rana, Mohit; Oblak, Ethan; Birbaumer, Niels; Sulzer, James

    2017-02-01

    Neurofeedback is a psychophysiological procedure in which online feedback of neural activation is provided to the participant for the purpose of self-regulation. Learning control over specific neural substrates has been shown to change specific behaviours. As a progenitor of brain-machine interfaces, neurofeedback has provided a novel way to investigate brain function and neuroplasticity. In this Review, we examine the mechanisms underlying neurofeedback, which have started to be uncovered. We also discuss how neurofeedback is being used in novel experimental and clinical paradigms from a multidisciplinary perspective, encompassing neuroscientific, neuroengineering and learning-science viewpoints.

  14. Dynamic data distributions in Vienna Fortran

    NASA Technical Reports Server (NTRS)

    Chapman, Barbara; Mehrotra, Piyush; Moritsch, Hans; Zima, Hans

    1993-01-01

Vienna Fortran is a machine-independent language extension of Fortran, which is based upon the Single-Program-Multiple-Data (SPMD) paradigm and allows the user to write programs for distributed-memory systems using global addresses. The language features focus mainly on the issue of distributing data across virtual processor structures. Those features of Vienna Fortran that allow the data distributions of arrays to change dynamically, depending on runtime conditions, are discussed. The relevant language features are presented, their implementation is outlined, and their use in applications is described.

  15. Programming in Vienna Fortran

    NASA Technical Reports Server (NTRS)

    Chapman, Barbara; Mehrotra, Piyush; Zima, Hans

    1992-01-01

    Exploiting the full performance potential of distributed memory machines requires a careful distribution of data across the processors. Vienna Fortran is a language extension of Fortran which provides the user with a wide range of facilities for such mapping of data structures. In contrast to current programming practice, programs in Vienna Fortran are written using global data references. Thus, the user has the advantages of a shared memory programming paradigm while explicitly controlling the data distribution. In this paper, we present the language features of Vienna Fortran for FORTRAN 77, together with examples illustrating the use of these features.

  16. Merging the Machines of Modern Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolf, Laura; Collins, Jim

    Two recent projects have harnessed supercomputing resources at the US Department of Energy’s Argonne National Laboratory in a novel way to support major fusion science and particle collider experiments. Using leadership computing resources, one team ran fine-grid analysis of real-time data to make near-real-time adjustments to an ongoing experiment, while a second team is working to integrate Argonne’s supercomputers into the Large Hadron Collider/ATLAS workflow. Together these efforts represent a new paradigm of the high-performance computing center as a partner in experimental science.

  17. The Cognitive Architecture for Chaining of Two Mental Operations

    ERIC Educational Resources Information Center

    Sackur, Jerome; Dehaene, Stanislas

    2009-01-01

A simple view, which dates back to Turing, proposes that complex cognitive operations are composed of serially arranged elementary operations, each passing intermediate results to the next. However, whether and how such serial processing is achieved with a brain composed of massively parallel processors remains an open question. Here, we study…

  18. CALUTRON STRUCTURE

    DOEpatents

    Price, D.

    1958-09-01

An improved means is described for removably installing and supporting a collector pocket in a calutron. The salient feature of the invention is the support of the collector pocket by means of suspension bolts engaging the pocket at a point intermediate the top and bottom of the pocket, and having nuts so arranged that, by turning them, the pocket can be set at the desired predetermined position.

  19. iArchi[tech]ture: Developing a Mobile Social Media Framework for Pedagogical Transformation

    ERIC Educational Resources Information Center

    Cochrane, Thomas; Rhodes, David

    2013-01-01

    This paper critiques the journey of pedagogical change over three mobile learning (mlearning) project iterations (2009 to 2011) within the context of a Bachelor of Architecture degree. The three projects were supported by an intentional community of practice model involving a partnership of an educational researcher/technologist, course lecturers,…

  20. Semi-Automatic Methods of Knowledge Enhancement

    DTIC Science & Technology

    1988-12-05

Response was patchy. Apparently awed by the complexity of the problem, only 3 GMs responded and all asked for no public use to be made of their...by the SERC. Thanks are due to the Turing Institute and Edinburgh University AI department for resources and facilities. We would also like to thank

  1. Kinetics of Some Metal Atom and Metal Fluoride Oxidation Reactions Relevant to Air Force Technology Development

    DTIC Science & Technology

    1981-03-01

Products and Chemicals, Inc., supplied a complete analysis with each cylinder. Initial measurements with the original batch (cylinder 1) were considered...NF3] in this limited temperature range. The NF3 used in these experiments was made available by Kelly Air Force Base. The manufacturer, Air

  2. Propagation and Attenuation of Lg Waves in South America

    DTIC Science & Technology

    1989-09-01

La Paz, Bolivia. 34 Ayala, R., 1989, Estudio de las ondas Lg registradas en la estación de LPB, a través del Escudo, Tesis de Grado, Universidad Mayor...Sur, Conselho Nacional de Pesquisas, Brasil. Couch, R., R. Whitsett, B. Huehn, L. Briceno-Guarupe, 1981, Structures of the continental margin of the

  3. Defense Spending and Regional Growth: An Examination of an Export-Base Model and an Econometric Model.

    DTIC Science & Technology

    1987-06-01

consumer preferences provide influences that can stimulate the rate of growth of the endogenous and/or exogenous income industries. B. EXPORT INDUSTRIES...location quotient was selected to alleviate some of the problems created by consumer preferences and expenditure patterns. This value was compared

  4. Beyond Robotic Wastelands of Time: Abandoned Pedagogical Agents and "New" Pedalled Pedagogies

    ERIC Educational Resources Information Center

    Savin-Baden, Maggi; Tombs, Gemma; Bhakta, Roy

    2015-01-01

    Chatbots, known as pedagogical agents in educational settings, have a long history of use, beginning with Alan Turing's work. Since then online chatbots have become embedded into the fabric of technology. Yet understandings of these technologies are inchoate and often untheorised. Integration of chatbots into educational settings over the past…

  5. One Hundred Ninety-five Cases of High-voltage Electric Injury

    DTIC Science & Technology

    2005-08-01

that level; and T4 to T5 paraplegia, secondary to fractures of T4 to T7. In 3 cases, fractures were not present: one case of a T11 to T12 sensory ...problems, including fractures, neurological injuries, ocular injuries, and complex reconstructive and rehabilitative needs, underscores the

  6. Implementation of a Compiler for the Functional Programming Language PHI.

    DTIC Science & Technology

    1987-06-01

Chapter Three. In his acceptance speech for the 1977 ACM Turing Award, Backus criticized traditional programming languages and programming styles. He went...ptr->type = type; if (fhead != NULL) { /* list already exists */ tracer = fhead; while (tracer->link != NULL) tracer = tracer->link;

  7. The Shock and Vibration Digest. Volume 15, Number 4

    DTIC Science & Technology

    1983-04-01

Akust. Zh., 23, pp 716-723. 132. Mozhaev, V.G., "Shear-Wave Convolution in a Layered Piezoelectric-Semiconductor Structure," Sov. Phys. Acoust., 27...Piezoelectric Halfspace," Proc. Royal Soc. London, Ser. A 364, pp 161-179 (1978). 138. Mozhaev, V.G. and Solodov, I. Yu, "Second-Harmonic Generation of

  8. China Report, Red Flag, Number 8, 16 April 1986.

    DTIC Science & Technology

    1986-06-02

principles of Marxism and to the destiny of our socialist literature and art. Over the past few years, Comrade Liu Zaifu has published a series of theses...teristics of this figure are the ideological embryo of the modern theory of human nature and humanism, and the contradiction between his democratic

  9. Analysis of nonlocal neural fields for both general and gamma-distributed connectivities

    NASA Astrophysics Data System (ADS)

    Hutt, Axel; Atay, Fatihcan M.

    2005-04-01

    This work studies the stability of equilibria in spatially extended neuronal ensembles. We first derive the model equation from statistical properties of the neuron population. The obtained integro-differential equation includes synaptic and space-dependent transmission delay for both general and gamma-distributed synaptic connectivities. The latter connectivity type reveals infinite, finite, and vanishing self-connectivities. The work derives conditions for stationary and nonstationary instabilities for both kernel types. In addition, a nonlinear analysis for general kernels yields the order parameter equation of the Turing instability. To compare the results to findings for partial differential equations (PDEs), two typical PDE-types are derived from the examined model equation, namely the general reaction-diffusion equation and the Swift-Hohenberg equation. Hence, the discussed integro-differential equation generalizes these PDEs. In the case of the gamma-distributed kernels, the stability conditions are formulated in terms of the mean excitatory and inhibitory interaction ranges. As a novel finding, we obtain Turing instabilities in fields with local inhibition-lateral excitation, while wave instabilities occur in fields with local excitation and lateral inhibition. Numerical simulations support the analytical results.
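The Turing conditions discussed above are, in the simpler reaction-diffusion caricature that the abstract compares against, read off a dispersion relation: the homogeneous state is stable (negative growth rate at wavenumber k = 0) while some band of k > 0 grows. A sketch with an illustrative two-species Jacobian and short-range-activation/long-range-inhibition diffusivities (all numbers are hypothetical, not the paper's neural-field parameters):

```python
import numpy as np

# Linearization of u_t = f(u,v) + Du u_xx, v_t = g(u,v) + Dv v_xx
# about an equilibrium: perturbations ~ exp(ikx) grow at the rate given
# by the eigenvalues of J - k^2 diag(Du, Dv).
J = np.array([[1.0, -1.0],      # illustrative kinetics Jacobian
              [2.0, -1.5]])
Du, Dv = 0.05, 1.0              # fast activator, slowly diffusing? No:
                                # small Du = short-range activation,
                                # large Dv = long-range inhibition.

def growth_rate(k):
    """Largest real part of the eigenvalues at wavenumber k."""
    A = J - k ** 2 * np.diag([Du, Dv])
    return float(np.max(np.linalg.eigvals(A).real))

ks = np.linspace(0.0, 10.0, 400)
sigma = np.array([growth_rate(k) for k in ks])
```

A stationary (Turing) instability shows up as sigma peaking above zero at a finite wavenumber while staying negative at k = 0; a wave instability would instead appear through a complex pair crossing the axis.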

  10. The Social Embedding of Intelligence

    NASA Astrophysics Data System (ADS)

    Edmonds, Bruce

    I claim that to pass the Turing Test over any period of extended time, it will be necessary to embed the entity into society. This chapter discusses why this is, and how it might be brought about. I start by arguing that intelligence is better characterized by tests of social interaction, especially in open-ended and extended situations. I then argue that learning is an essential component of intelligence and hence that a universal intelligence is impossible. These two arguments support the relevance of the Turing Test as a particular, but appropriate test of interactive intelligence. I look to the human case to argue that individual intelligence uses society to a considerable extent for its development. Taking a lead from the human case, I outline how a socially embedded Artificial Intelligence might be brought about in terms of four aspects: free will, emotion, empathy, and self-modeling. In each case, I try to specify what social 'hooks' might be required for the full ability to develop during a considerable period of in situ acculturation. The chapter ends by speculating what it might be like to live with the result.

  11. How My Program Passed the Turing Test

    NASA Astrophysics Data System (ADS)

    Humphrys, Mark

In 1989, the author put an ELIZA-like chatbot on the Internet. The conversations this program had can be seen - depending on how one defines the rules (and how seriously one takes the idea of the test itself) - as a passing of the Turing Test. This is the first time this event has been properly written up. This chatbot succeeded due to profanity, relentless aggression, prurient queries about the user, and implying that they were a liar when they responded. The element of surprise was also crucial. Most chatbots exist in an environment where people expect to find some bots among the humans. Not this one. What was also novel was the online element. This was certainly one of the first AI programs online. It seems to have been the first (a) AI real-time chat program, which (b) had the element of surprise, and (c) was on the Internet. We conclude with some speculation that the future of all of AI is on the Internet, and a description of the "World-Wide-Mind" project that aims to bring this about.

  12. Turing-like structures in a functional model of cortical spreading depression

    NASA Astrophysics Data System (ADS)

    Verisokin, A. Yu.; Verveyko, D. V.; Postnov, D. E.

    2017-12-01

Cortical spreading depression (CSD) along with migraine waves and spreading depolarization events with stroke or injuries are the front-line examples of extreme physiological behaviors of the brain cortex which manifest themselves via the onset and spreading of localized areas of neuronal hyperactivity followed by their depression. While much is known about the physiological pathways involved, the dynamical mechanisms of the formation and evolution of complex spatiotemporal patterns during CSD are still poorly understood, in spite of the number of modeling studies that have already been performed. Recently we proposed a relatively simple mathematical model of cortical spreading depression which accounts for the effects of neurovascular coupling and cerebral blood flow redistribution during CSD. In the present study, we address the main dynamical consequences of the newly included pathways, namely, the changes in the formation and propagation speed of the CSD front and the pattern formation features in two dimensions. Our most notable finding is that the combination of vascular-mediated spatial coupling with local regulatory mechanisms results in the formation of stationary Turing-like patterns during a CSD event.

  13. Decoding of top-down cognitive processing for SSVEP-controlled BMI

    PubMed Central

    Min, Byoung-Kyong; Dähne, Sven; Ahn, Min-Hee; Noh, Yung-Kyun; Müller, Klaus-Robert

    2016-01-01

    We present a fast and accurate non-invasive brain-machine interface (BMI) based on demodulating steady-state visual evoked potentials (SSVEPs) in electroencephalography (EEG). Our study reports an SSVEP-BMI that, for the first time, decodes primarily based on top-down and not bottom-up visual information processing. The experimental setup presents a grid-shaped flickering line array that the participants observe while intentionally attending to a subset of flickering lines representing the shape of a letter. While the flickering pixels stimulate the participant’s visual cortex uniformly with equal probability, the participant’s intention groups the strokes and thus perceives a ‘letter Gestalt’. We observed decoding accuracy of 35.81% (up to 65.83%) with a regularized linear discriminant analysis; on average 2.05-fold, and up to 3.77-fold greater than chance levels in multi-class classification. Compared to the EEG signals, an electrooculogram (EOG) did not significantly contribute to decoding accuracies. Further analysis reveals that the top-down SSVEP paradigm shows the most focalised activation pattern around occipital visual areas; Granger causality analysis consistently revealed prefrontal top-down control over early visual processing. Taken together, the present paradigm provides the first neurophysiological evidence for the top-down SSVEP BMI paradigm, which potentially enables multi-class intentional control of EEG-BMIs without using gaze-shifting. PMID:27808125

  14. Decoding of top-down cognitive processing for SSVEP-controlled BMI

    NASA Astrophysics Data System (ADS)

    Min, Byoung-Kyong; Dähne, Sven; Ahn, Min-Hee; Noh, Yung-Kyun; Müller, Klaus-Robert

    2016-11-01

    We present a fast and accurate non-invasive brain-machine interface (BMI) based on demodulating steady-state visual evoked potentials (SSVEPs) in electroencephalography (EEG). Our study reports an SSVEP-BMI that, for the first time, decodes primarily based on top-down and not bottom-up visual information processing. The experimental setup presents a grid-shaped flickering line array that the participants observe while intentionally attending to a subset of flickering lines representing the shape of a letter. While the flickering pixels stimulate the participant’s visual cortex uniformly with equal probability, the participant’s intention groups the strokes and thus perceives a ‘letter Gestalt’. We observed decoding accuracy of 35.81% (up to 65.83%) with a regularized linear discriminant analysis; on average 2.05-fold, and up to 3.77-fold greater than chance levels in multi-class classification. Compared to the EEG signals, an electrooculogram (EOG) did not significantly contribute to decoding accuracies. Further analysis reveals that the top-down SSVEP paradigm shows the most focalised activation pattern around occipital visual areas; Granger causality analysis consistently revealed prefrontal top-down control over early visual processing. Taken together, the present paradigm provides the first neurophysiological evidence for the top-down SSVEP BMI paradigm, which potentially enables multi-class intentional control of EEG-BMIs without using gaze-shifting.

  15. Visual modifications on the P300 speller BCI paradigm

    NASA Astrophysics Data System (ADS)

    Salvaris, M.; Sepulveda, F.

    2009-08-01

The best known P300 speller brain-computer interface (BCI) paradigm is the Farwell and Donchin paradigm. In this paper, various changes to the visual aspects of this protocol are explored, as well as their effects on classification. Changes to the dimensions of the symbols, the distance between the symbols and the colours used were tested. The purpose of the present work was not to achieve the highest possible accuracy results, but to ascertain whether these simple modifications to the visual protocol produce classification differences, and what those differences are. Eight subjects were used, with each subject carrying out a total of six different experiments. In each experiment, the user spelt a total of 39 characters. Two types of classifiers were trained and tested to determine whether the results were classifier dependent: a support vector machine (SVM) with a radial basis function (RBF) kernel and Fisher's linear discriminant (FLD). The single-trial and multiple-trial classification results were recorded and compared. Although no visual protocol was the best for all subjects, the best performances, across both classifiers, were obtained with the white background (WB) visual protocol. The worst performance was obtained with the small symbol size (SSS) visual protocol.
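Fisher's linear discriminant, one of the two classifiers compared, has a closed-form weight vector. A sketch on synthetic single-trial data (the added template is a hypothetical stand-in for the P300 deflection; this is not the paper's pipeline or data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic single-trial feature vectors: "target" epochs carry a small
# added deflection over shared Gaussian background noise.
n_trials, n_features = 200, 10
template = np.linspace(0.0, 1.0, n_features)
nontarget = rng.normal(size=(n_trials, n_features))
target = rng.normal(size=(n_trials, n_features)) + template

def fld_weights(A, B):
    """Fisher's linear discriminant: w = Sw^-1 (mu_A - mu_B)."""
    Sw = np.cov(A, rowvar=False) + np.cov(B, rowvar=False)
    return np.linalg.solve(Sw, A.mean(axis=0) - B.mean(axis=0))

w = fld_weights(target, nontarget)
# Threshold halfway between the projected class means.
thresh = 0.5 * ((target @ w).mean() + (nontarget @ w).mean())
acc = 0.5 * ((target @ w > thresh).mean() + (nontarget @ w <= thresh).mean())
```

Single-trial accuracy on data this noisy stays well below ceiling, which is why P300 spellers average evidence over multiple stimulus repetitions before deciding on a character.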

  16. Geochemistry of pyrite from diamictites of the Hamersley Basin, Western Australia with implications for the GOE and Paleoproterozoic ice ages.

    NASA Astrophysics Data System (ADS)

    Swanner, Elizabeth; Cates, Nicole; Pecoits, Ernesto; Bekker, Andrey; Konhauser, Kurt O.; Mojzsis, Stephen J.

    2013-04-01

    Sediments of the ca. 2400 Ma Turee Creek Group of Western Australia span the oxygenation of Earth's surface resulting from the 'Great Oxidation Event' (GOE). Diamictite within the Boolgeeda Iron Formation from the Boundary Ridge section at Duck Creek Syncline have been correlated to the glaciogenic Meteorite Bore Member of the Turee Creek Group at Hardey Syncline (Martin, 1999). The Meteorite Bore Member is thought to be correlative and time-equivalent with the Paleoproterozoic glacial diamictites of North America. If diamictite units at Boundary Ridge represent worldwide Paleoproterozoic glaciations, they should record the disappearance of mass independently fractionated (MIF) sulfur. Triple S-isotope compositions for pyrites from the Boundary Ridge sections measured by in situ multi-collector ion microprobe yielded both mass-dependent and mass-independently fractionated (MIF) S isotope values (Δ33S values from -0.65 to 6.27). Trace element heterogeneities were found by measurements at multiple spatial scales within rounded pyrites in the Boundary Ridge section, signifying multiple generations of pyrite from sulfur processed in an anoxic atmosphere. S-isotope data from pyrite in the Boundary Ridge diamictites analyzed in this study and previous work (Williford et al., 2011) define multiple δ34S vs. δ33S arrays, linked to a source of detrital pyrite from the overlying Hamersley and Fortescue groups. Authigenic pyrite in an overlying shale unit from Boundary Ridge plot along the terrestrial fractionation line but retain positive MIF-S and detrital pyrite, results that are incompatible with a correlation to North American Paleoproterozoic glacially-influenced successions where the MIF-S signal permanently disappears. 
The diamictites at the Duck Creek Syncline are older than the Meteorite Bore Member because of their stratigraphic position within the Boolgeeda Iron Formation underlying the Turee Creek Group, which is separated from the Meteorite Bore Member by nearly 1000 m of Kungarra shale at Hardey Syncline.

  17. Interacting Turing-Hopf Instabilities Drive Symmetry-Breaking Transitions in a Mean-Field Model of the Cortex: A Mechanism for the Slow Oscillation

    NASA Astrophysics Data System (ADS)

    Steyn-Ross, Moira L.; Steyn-Ross, D. A.; Sleigh, J. W.

    2013-04-01

    Electrical recordings of brain activity during the transition from wake to anesthetic coma show temporal and spectral alterations that are correlated with gross changes in the underlying brain state. Entry into anesthetic unconsciousness is signposted by the emergence of large, slow oscillations of electrical activity (≲1Hz) similar to the slow waves observed in natural sleep. Here we present a two-dimensional mean-field model of the cortex in which slow spatiotemporal oscillations arise spontaneously through a Turing (spatial) symmetry-breaking bifurcation that is modulated by a Hopf (temporal) instability. In our model, populations of neurons are densely interlinked by chemical synapses, and by interneuronal gap junctions represented as an inhibitory diffusive coupling. To demonstrate cortical behavior over a wide range of distinct brain states, we explore model dynamics in the vicinity of a general-anesthetic-induced transition from “wake” to “coma.” In this region, the system is poised at a codimension-2 point where competing Turing and Hopf instabilities coexist. We model anesthesia as a moderate reduction in inhibitory diffusion, paired with an increase in inhibitory postsynaptic response, producing a coma state that is characterized by emergent low-frequency oscillations whose dynamics is chaotic in time and space. The effect of long-range axonal white-matter connectivity is probed with the inclusion of a single idealized point-to-point connection. We find that the additional excitation from the long-range connection can provoke seizurelike bursts of cortical activity when inhibitory diffusion is weak, but has little impact on an active cortex. Our proposed dynamic mechanism for the origin of anesthetic slow waves complements—and contrasts with—conventional explanations that require cyclic modulation of ion-channel conductances. 
We postulate that a similar bifurcation mechanism might underpin the slow waves of natural sleep and comment on the possible consequences of chaotic dynamics for memory processing and learning.
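The Turing-instability mechanism invoked above can be illustrated with a linear stability calculation for a generic two-component reaction-diffusion system: the homogeneous state is stable without diffusion, but a band of finite-wavenumber modes grows once the inhibitor diffuses much faster than the activator. A minimal sketch with illustrative activator-inhibitor parameters (not the cortical mean-field model):

```python
import numpy as np

def growth_rate(A, D, k):
    """Largest real part among eigenvalues of (A - k^2 D): the linear
    growth rate of a spatial perturbation with wavenumber k."""
    return np.linalg.eigvals(A - k ** 2 * D).real.max()

# Illustrative Jacobian and diffusion matrix (assumed values):
A = np.array([[1.0, -2.0],
              [3.0, -4.0]])   # trace < 0 and det > 0: stable at k = 0
D = np.diag([1.0, 20.0])      # inhibitor diffuses 20x faster than activator

ks = np.linspace(0.0, 2.0, 201)
lams = np.array([growth_rate(A, D, k) for k in ks])
k_fastest = ks[lams.argmax()]  # wavenumber of the fastest-growing pattern
```

The homogeneous state has negative growth rate at k = 0, yet lams turns positive over a finite band of k: the signature of a Turing (spatial) symmetry-breaking bifurcation.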

  18. Thermodynamic Paradigm for Solution Demixing Inspired by Nuclear Transport in Living Cells

    NASA Astrophysics Data System (ADS)

    Wang, Ching-Hao; Mehta, Pankaj; Elbaum, Michael

    2017-04-01

    Living cells display a remarkable capacity to compartmentalize their functional biochemistry. A particularly fascinating example is the cell nucleus. Exchange of macromolecules between the nucleus and the surrounding cytoplasm does not involve traversing a lipid bilayer membrane. Instead, large protein channels known as nuclear pores cross the nuclear envelope and regulate the passage of other proteins and RNA molecules. Beyond simply gating diffusion, the system of nuclear pores and associated transport receptors is able to generate substantial concentration gradients, at the energetic expense of guanosine triphosphate hydrolysis. In contrast to conventional approaches to demixing such as reverse osmosis and dialysis, the biological system operates continuously, without application of cyclic changes in pressure or solvent exchange. Abstracting the biological paradigm, we examine this transport system as a thermodynamic machine of solution demixing. Building on the construct of free energy transduction and biochemical kinetics, we find conditions for the stable operation and optimization of the concentration gradients as a function of dissipation in the form of entropy production.

  19. Chronic, Wireless Recordings of Large Scale Brain Activity in Freely Moving Rhesus Monkeys

    PubMed Central

    Schwarz, David A.; Lebedev, Mikhail A.; Hanson, Timothy L.; Dimitrov, Dragan F.; Lehew, Gary; Meloy, Jim; Rajangam, Sankaranarayani; Subramanian, Vivek; Ifft, Peter J.; Li, Zheng; Ramakrishnan, Arjun; Tate, Andrew; Zhuang, Katie; Nicolelis, Miguel A.L.

    2014-01-01

    Advances in techniques for recording large-scale brain activity contribute to both the elucidation of neurophysiological principles and the development of brain-machine interfaces (BMIs). Here we describe a neurophysiological paradigm for performing tethered and wireless large-scale recordings based on movable volumetric three-dimensional (3D) multielectrode implants. This approach allowed us to isolate up to 1,800 units per animal and simultaneously record the extracellular activity of close to 500 cortical neurons, distributed across multiple cortical areas, in freely behaving rhesus monkeys. The method is expandable, in principle, to thousands of simultaneously recorded channels. It also allows increased recording longevity (5 consecutive years), and recording of a broad range of behaviors, e.g. social interactions, and BMI paradigms in freely moving primates. We propose that wireless large-scale recordings could have a profound impact on basic primate neurophysiology research, while providing a framework for the development and testing of clinically relevant neuroprostheses. PMID:24776634

  20. Bayesian decoding using unsorted spikes in the rat hippocampus

    PubMed Central

    Layton, Stuart P.; Chen, Zhe; Wilson, Matthew A.

    2013-01-01

    A fundamental task in neuroscience is to understand how neural ensembles represent information. Population decoding is a useful tool to extract information from neuronal populations based on the ensemble spiking activity. We propose a novel Bayesian decoding paradigm to decode unsorted spikes in the rat hippocampus. Our approach uses a direct mapping between spike waveform features and covariates of interest and avoids accumulation of spike sorting errors. Our decoding paradigm is nonparametric, encoding model-free for representing stimuli, and extracts information from all available spikes and their waveform features. We apply the proposed Bayesian decoding algorithm to a position reconstruction task for freely behaving rats based on tetrode recordings of rat hippocampal neuronal activity. Our detailed decoding analyses demonstrate that our approach is efficient and better utilizes the available information in the nonsortable hash than the standard sorting-based decoding algorithm. Our approach can be adapted to an online encoding/decoding framework for applications that require real-time decoding, such as brain-machine interfaces. PMID:24089403
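The population-decoding step can be sketched in simplified form. The code below does plain Bayesian (MAP) position decoding from sorted-unit spike counts under independent Poisson likelihoods and a flat prior; the paper's actual method is clusterless and nonparametric, working from spike waveform features rather than sorted units, so the tuning curves and shapes here are illustrative assumptions only:

```python
import numpy as np

def decode_position(counts, tuning, positions, dt=0.25):
    """MAP position estimate from spike counts in one time bin, assuming
    independent Poisson units and a flat prior (a simplified, sorted-unit
    stand-in for the paper's clusterless decoder)."""
    rates = tuning * dt  # expected counts per position, shape (n_units, n_positions)
    # log P(counts | x) = sum_n [ k_n * log(rate_n(x)) - rate_n(x) ] + const
    log_like = counts @ np.log(rates) - rates.sum(axis=0)
    return positions[np.argmax(log_like)]

# Toy example: 3 units with Gaussian place fields on a 1-D track (assumed).
positions = np.linspace(0.0, 1.0, 101)
centers = np.array([0.2, 0.5, 0.8])
tuning = 20.0 * np.exp(-((positions[None, :] - centers[:, None]) ** 2)
                       / (2 * 0.05 ** 2)) + 0.5   # Hz, with 0.5 Hz baseline

counts = np.array([0, 6, 0])  # the unit with its field at 0.5 fires most
est = decode_position(counts, tuning, positions)
```

The estimate lands near the firing unit's place-field center; the clusterless variant replaces the per-unit tuning curves with a joint density over waveform features and position.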

  1. A rodent brain-machine interface paradigm to study the impact of paraplegia on BMI performance.

    PubMed

    Bridges, Nathaniel R; Meyers, Michael; Garcia, Jonathan; Shewokis, Patricia A; Moxon, Karen A

    2018-05-31

Most brain-machine interfaces (BMI) focus on upper-body function in non-injured animals, not addressing the lower-limb functional needs of those with paraplegia. A need exists for a novel BMI task that engages the lower body and takes advantage of well-established rodent spinal cord injury (SCI) models to study methods to improve BMI performance. A tilt BMI task was designed that randomly applies different types of tilts to a platform, decodes the tilt type applied, and rights the platform if the decoder correctly classifies the tilt type. The task was tested on female rats and is relatively natural, such that it does not require the animal to learn a new skill. It is self-rewarding, so there is no need for additional rewards, eliminating food or water restriction, which can be especially hard on spinalized rats. Finally, task difficulty can be adjusted via the tilt parameters. This novel BMI task bilaterally engages the cortex without visual feedback regarding limb position in space, and animals learn to improve their performance both pre- and post-SCI. Comparison with existing methods: most BMI tasks primarily engage one hemisphere, focus on the upper body, rely heavily on visual feedback, are not investigated in animal models of SCI, and require non-naturalistic extrinsic motivation such as water reward for performance improvement. Our task addresses these gaps. The BMI paradigm presented here will enable researchers to investigate the interaction of plasticity after SCI and plasticity during BMI training on performance. Copyright © 2018. Published by Elsevier B.V.

  2. Using the Statecharts paradigm for simulation of patient flow in surgical care.

    PubMed

    Sobolev, Boris; Harel, David; Vasilakis, Christos; Levy, Adrian

    2008-03-01

    Computer simulation of patient flow has been used extensively to assess the impacts of changes in the management of surgical care. However, little research is available on the utility of existing modeling techniques. The purpose of this paper is to examine the capacity of Statecharts, a system of graphical specification, for constructing a discrete-event simulation model of the perioperative process. The Statecharts specification paradigm was originally developed for representing reactive systems by extending the formalism of finite-state machines through notions of hierarchy, parallelism, and event broadcasting. Hierarchy permits subordination between states so that one state may contain other states. Parallelism permits more than one state to be active at any given time. Broadcasting of events allows one state to detect changes in another state. In the context of the peri-operative process, hierarchy provides the means to describe steps within activities and to cluster related activities, parallelism provides the means to specify concurrent activities, and event broadcasting provides the means to trigger a series of actions in one activity according to transitions that occur in another activity. Combined with hierarchy and parallelism, event broadcasting offers a convenient way to describe the interaction of concurrent activities. We applied the Statecharts formalism to describe the progress of individual patients through surgical care as a series of asynchronous updates in patient records generated in reaction to events produced by parallel finite-state machines representing concurrent clinical and managerial activities. We conclude that Statecharts capture successfully the behavioral aspects of surgical care delivery by specifying permissible chronology of events, conditions, and actions.
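Two of the Statecharts notions described above, parallel (orthogonal) regions and event broadcasting, can be sketched in a few lines. The state names and transitions below are invented for illustration and are not taken from the authors' perioperative model:

```python
# Minimal sketch of parallel regions plus event broadcasting.
class Region:
    def __init__(self, name, transitions, initial):
        self.name, self.state = name, initial
        # transitions: {(state, event): (next_state, emitted_event_or_None)}
        self.transitions = transitions

    def handle(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            return None
        self.state, emitted = self.transitions[key]
        return emitted  # event broadcast to the other regions


class Statechart:
    def __init__(self, regions):
        self.regions = regions  # all regions are active in parallel

    def dispatch(self, event):
        queue = [event]
        while queue:                      # broadcast until quiescent
            ev = queue.pop(0)
            for r in self.regions:
                emitted = r.handle(ev)
                if emitted:
                    queue.append(emitted)


# A clinical region finishing surgery broadcasts an event that a
# concurrent managerial region reacts to (hypothetical states).
clinical = Region("clinical",
                  {("in_surgery", "close"): ("recovery", "surgery_done")},
                  "in_surgery")
managerial = Region("managerial",
                    {("or_booked", "surgery_done"): ("or_free", None)},
                    "or_booked")
chart = Statechart([clinical, managerial])
chart.dispatch("close")
```

After `dispatch("close")`, the clinical region is in `recovery` and, via the broadcast `surgery_done` event, the managerial region has moved to `or_free`: one region's transition triggers an action in a concurrent region, as in the surgical-care model. Hierarchy (nested states) would be a straightforward extension of `Region`.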

  3. Ghost-in-the-Machine reveals human social signals for human-robot interaction.

    PubMed

    Loth, Sebastian; Jettka, Katharina; Giuliani, Manuel; de Ruiter, Jan P

    2015-01-01

    We used a new method called "Ghost-in-the-Machine" (GiM) to investigate social interactions with a robotic bartender taking orders for drinks and serving them. Using the GiM paradigm allowed us to identify how human participants recognize the intentions of customers on the basis of the output of the robotic recognizers. Specifically, we measured which recognizer modalities (e.g., speech, the distance to the bar) were relevant at different stages of the interaction. This provided insights into human social behavior necessary for the development of socially competent robots. When initiating the drink-order interaction, the most important recognizers were those based on computer vision. When drink orders were being placed, however, the most important information source was the speech recognition. Interestingly, the participants used only a subset of the available information, focussing only on a few relevant recognizers while ignoring others. This reduced the risk of acting on erroneous sensor data and enabled them to complete service interactions more swiftly than a robot using all available sensor data. We also investigated socially appropriate response strategies. In their responses, the participants preferred to use the same modality as the customer's requests, e.g., they tended to respond verbally to verbal requests. Also, they added redundancy to their responses, for instance by using echo questions. We argue that incorporating the social strategies discovered with the GiM paradigm in multimodal grammars of human-robot interactions improves the robustness and the ease-of-use of these interactions, and therefore provides a smoother user experience.

  4. Anticholinesterase-Responsive Weakness in the Canine Similar to Myasthenia Gravis of Man.

    DTIC Science & Technology

    1976-01-01

...eyelids, ears, and facial features... Difficult prehension, dysphagia, choking... subcutaneous injection of 2 mg of atropine... evident within... ...acological testing is very diagnostic but not without hazard... weakness was noticed at the same time... Anticholinesterase given to a... esophageal

  5. Species Profiles. Life Histories and Environmental Requirements of Coastal Fishes and Invertebrates (Pacific Northwest). Dungeness Crab.

    DTIC Science & Technology

    1986-08-01

...variety of factors including depth, latitude, temperature, salinity and... crab eggs has been linked to increased egg mortality because of mechanical... time. Crabs seem less dependent on epibenthic cover and can be found over more exposed substrates. Most crabs remain... Contents: Salinity; Temperature-Salinity Interactions

  6. The M198 Howitzer as a Direct Support Weapon during Amphibious Operations.

    DTIC Science & Technology

    1980-06-06

...critical to the success of future amphibious operations. Purpose of the Study: The purpose of this study is to determine the impact of the M198's... principal amphibious ships' lift capabilities and physical characteristics indicates their flexibility and speed, or lack thereof, in debarking large...

  7. Cyberwarfare and Operational Art

    DTIC Science & Technology

    2017-05-25

...Electronic Attack; EMS Electro-Magnetic Spectrum; FM Field Manual; FSB Federal Security Service (Russian Federation); GAO General Accounting Office; GRU... Warfare (Cambridge, MA: O'Reilly Media Inc., 2012), 74. "The Bombe developed in Bletchley by Turing and Welchman and Babbage - all luminaries of... cyberspace domain's fundamental characteristics. First, cyberspace requires the Electro-Magnetic Spectrum (EMS) to propagate efficiently. Second...

  8. Prediction of Environmental Impact of High-Energy Materials with Atomistic Computer Simulations

    DTIC Science & Technology

    2010-11-01

...from a training set of compounds. Other methods include Quantitative Structure-Activity Relationship (QSAR) and Quantitative Structure-Property... the development of QSPR/QSAR models, in contrast to boiling points and critical parameters derived from empirical correlations, to improve... Quadratic Configuration Interaction Singles Doubles; QSAR Quantitative Structure-Activity Relationship; QSPR Quantitative Structure-Property

  9. Test and Evaluation Report of the IMED Volumetric Infusion Pump Model 960A

    DTIC Science & Technology

    1992-02-01

...tested. Ambient temperature was out of test limits. Windshield anti-ice X; Pitot heat X; Vent blower X; Windshield wiper X; Heater X; APU X; Generator #1 X... ...Patterson Air Force Base, OH 45433; John A. Dellinger, Southwest Research Institute, P.O. Box 28510, San Antonio, TX 78284; Henry L. Taylor, Director

  10. Gela, Italy, Revised Uniform Summary of Surface Weather Observations (RUSSWO). Parts A-F.

    DTIC Science & Technology

    1983-11-03

...ADDRESS; REPORT DATE: USAFETAC/CBD, 3 Nov 83; Air Weather Service (MAC), Scott AFB IL 62225; MONITORING AGENCY NAME & ADDRESS... temperature combined; and again for dry-bulb, wet-bulb, and dew-point temperatures separately. Total observations for these four items is also...

  11. Autonomous Inter-Task Transfer in Reinforcement Learning Domains

    DTIC Science & Technology

    2008-08-01

...Twentieth International Joint Conference on Artificial Intelligence, 2007. Fumihide Tanaka and Masayuki Yamamura. Multitask reinforcement learning... Functions; 2.2.3 Artificial Neural Networks; 2.2.4 Instance-based... ...tures [Laird et al., 1986; Choi et al., 2007]. However, TL for RL tasks has only recently been gaining attention in the artificial intelligence...

  12. Dimensions of Intelligent Systems

    DTIC Science & Technology

    2002-08-01

Keywords: IS, Intelligent Systems, Turing Test, Cognitive Model, situated cognition, BDI, Deep Blue, constructionism. 1: Introduction. Investigation of... Our social experience provides an implicit observer bias to assign mentality and intentions to the system in a test, and many would argue that... extended the intentional notions of Belief, Desire, and Intention (BDI) to include social "properties" of Value...

  13. Neyman-Pearson classification algorithms and NP receiver operating characteristics

    PubMed Central

    Tong, Xin; Feng, Yang; Li, Jingyi Jessica

    2018-01-01

    In many binary classification applications, such as disease diagnosis and spam detection, practitioners commonly face the need to limit type I error (that is, the conditional probability of misclassifying a class 0 observation as class 1) so that it remains below a desired threshold. To address this need, the Neyman-Pearson (NP) classification paradigm is a natural choice; it minimizes type II error (that is, the conditional probability of misclassifying a class 1 observation as class 0) while enforcing an upper bound, α, on the type I error. Despite its century-long history in hypothesis testing, the NP paradigm has not been well recognized and implemented in classification schemes. Common practices that directly limit the empirical type I error to no more than α do not satisfy the type I error control objective because the resulting classifiers are likely to have type I errors much larger than α, and the NP paradigm has not been properly implemented in practice. We develop the first umbrella algorithm that implements the NP paradigm for all scoring-type classification methods, such as logistic regression, support vector machines, and random forests. Powered by this algorithm, we propose a novel graphical tool for NP classification methods: NP receiver operating characteristic (NP-ROC) bands motivated by the popular ROC curves. NP-ROC bands will help choose α in a data-adaptive way and compare different NP classifiers. We demonstrate the use and properties of the NP umbrella algorithm and NP-ROC bands, available in the R package nproc, through simulation and real data studies. PMID:29423442
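The heart of the umbrella algorithm is a distribution-free choice of decision threshold: take an order statistic of held-out class-0 scores whose binomial tail bound guarantees that the type I error exceeds α with probability at most δ. A minimal sketch of that thresholding step (not the nproc implementation; the score model, δ, and sample sizes are assumptions):

```python
import math
import numpy as np

def np_threshold(class0_scores, alpha=0.05, delta=0.05):
    """Pick a decision threshold from held-out class-0 scores so that
    P(type I error > alpha) <= delta, via the order-statistic tail bound
    underlying the NP umbrella algorithm (sketch only)."""
    s = np.sort(class0_scores)  # ascending order statistics
    n = len(s)
    for k in range(1, n + 1):
        # P(type I error of threshold s[k-1] exceeds alpha)
        viol = sum(math.comb(n, j) * (1 - alpha) ** j * alpha ** (n - j)
                   for j in range(k, n + 1))
        if viol <= delta:
            return s[k - 1]  # classify as class 1 when score > threshold
    raise ValueError("not enough class-0 points for this (alpha, delta)")

rng = np.random.default_rng(0)
scores0 = rng.normal(0.0, 1.0, 500)  # held-out class-0 scores (toy data)
thr = np_threshold(scores0, alpha=0.05, delta=0.05)
type1_hat = (scores0 > thr).mean()   # empirical type I error on this split
```

Because the bound is distribution-free, the same thresholding wraps around any scoring classifier (logistic regression, SVM, random forest), which is what makes the algorithm an "umbrella."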

  14. Towards a symbiotic brain-computer interface: exploring the application-decoder interaction

    NASA Astrophysics Data System (ADS)

Verhoeven, T.; Buteneers, P.; Wiersema, J. R.; Dambre, J.; Kindermans, P. J.

    2015-12-01

Objective. State-of-the-art brain-computer interface (BCI) research focuses on improving individual components such as the application or the decoder that converts the user's brain activity to control signals. In this study, we investigate the interaction between these components in the P300 speller, a BCI for communication. We introduce a synergistic approach in which the stimulus presentation sequence is modified to enhance the machine learning decoding. In this way we aim for an improved overall BCI performance. Approach. First, a new stimulus presentation paradigm is introduced which gives us flexibility in tuning the sequence of visual stimuli presented to the user. Next, an experimental setup in which this paradigm is compared to other paradigms uncovers the underlying mechanism of the interdependence between the application and the performance of the decoder. Main results. Extensive analysis of the experimental results reveals the changing requirements of the decoder concerning the data recorded during the spelling session. When little data has been recorded, the balance in the number of target and non-target stimuli shown to the user is more important than the signal-to-noise ratio (SNR) of the recorded response signals. Only once more data has been collected does the SNR become the dominant factor. Significance. For BCIs in general, knowing the dominant factor that affects the decoder performance and being able to respond to it is of utmost importance to improve system performance. For the P300 speller, the proposed tunable paradigm offers the possibility to tune the application to the decoder's needs at any time and, as such, fully exploit this application-decoder interaction.

  15. Neyman-Pearson classification algorithms and NP receiver operating characteristics.

    PubMed

    Tong, Xin; Feng, Yang; Li, Jingyi Jessica

    2018-02-01

    In many binary classification applications, such as disease diagnosis and spam detection, practitioners commonly face the need to limit type I error (that is, the conditional probability of misclassifying a class 0 observation as class 1) so that it remains below a desired threshold. To address this need, the Neyman-Pearson (NP) classification paradigm is a natural choice; it minimizes type II error (that is, the conditional probability of misclassifying a class 1 observation as class 0) while enforcing an upper bound, α, on the type I error. Despite its century-long history in hypothesis testing, the NP paradigm has not been well recognized and implemented in classification schemes. Common practices that directly limit the empirical type I error to no more than α do not satisfy the type I error control objective because the resulting classifiers are likely to have type I errors much larger than α, and the NP paradigm has not been properly implemented in practice. We develop the first umbrella algorithm that implements the NP paradigm for all scoring-type classification methods, such as logistic regression, support vector machines, and random forests. Powered by this algorithm, we propose a novel graphical tool for NP classification methods: NP receiver operating characteristic (NP-ROC) bands motivated by the popular ROC curves. NP-ROC bands will help choose α in a data-adaptive way and compare different NP classifiers. We demonstrate the use and properties of the NP umbrella algorithm and NP-ROC bands, available in the R package nproc, through simulation and real data studies.

  16. Automatic classification of artifactual ICA-components for artifact removal in EEG signals.

    PubMed

    Winkler, Irene; Haufe, Stefan; Tangermann, Michael

    2011-08-02

Artifacts contained in EEG recordings hamper both the visual interpretation by experts and the algorithmic processing and analysis (e.g. for brain-computer interfaces (BCI) or for mental state monitoring). While hand-optimized selection of source components derived from Independent Component Analysis (ICA) to clean EEG data is widespread, the field could greatly profit from automated solutions based on machine learning methods. Existing ICA-based removal strategies depend on explicit recordings of an individual's artifacts or have not been shown to reliably identify muscle artifacts. We propose an automatic method for the classification of general artifactual source components. They are estimated by TDSEP, an ICA method that takes temporal correlations into account. The linear classifier is based on an optimized feature subset determined by a Linear Programming Machine (LPM). The subset is composed of features from the frequency, spatial, and temporal domains. A subject-independent classifier was trained on 640 TDSEP components (reaction time (RT) study, n = 12) that were hand-labeled by experts as artifactual or brain sources, and tested on 1080 new components of RT data of the same study. Generalization was tested on new data from two studies (auditory event-related potential (ERP) paradigm, n = 18; motor imagery BCI paradigm, n = 80) that used data with different channel setups and from new subjects. Based on six features only, the optimized linear classifier performed on a par with the inter-expert disagreement (<10% mean squared error (MSE)) on the RT data. On data of the auditory ERP study, the same pre-calculated classifier generalized well and achieved 15% MSE. On data of the motor imagery paradigm, we demonstrate that the discriminant information used for BCI is preserved when removing up to 60% of the most artifactual source components.
We propose a universal and efficient classifier of ICA components for the subject independent removal of artifacts from EEG data. Based on linear methods, it is applicable for different electrode placements and supports the introspection of results. Trained on expert ratings of large data sets, it is not restricted to the detection of eye- and muscle artifacts. Its performance and generalization ability is demonstrated on data of different EEG studies.

  17. Model and experiments to optimize co-adaptation in a simplified myoelectric control system.

    PubMed

    Couraud, M; Cattaert, D; Paclet, F; Oudeyer, P Y; de Rugy, A

    2018-04-01

To compensate for a limb lost through amputation, myoelectric prostheses use surface electromyography (EMG) from the remaining muscles to control the prosthesis. Despite considerable progress, myoelectric controls remain markedly different from the way we normally control movements, and require intense user adaptation. To overcome this, our goal is to explore concurrent machine co-adaptation techniques that are developed in the field of brain-machine interfaces, and that are beginning to be used in myoelectric controls. We combined a simplified myoelectric control with a perturbation for which human adaptation is well characterized and modeled, in order to explore co-adaptation settings in a principled manner. First, we reproduced results obtained in a classical visuomotor rotation paradigm in our simplified myoelectric context, where we rotate the muscle pulling vectors used to reconstruct wrist force from EMG. Then, a model of human adaptation in response to directional error was used to simulate various co-adaptation settings, where perturbations and machine co-adaptation are both applied to muscle pulling vectors. These simulations established that a relatively low gain of machine co-adaptation that minimizes final errors generates slow and incomplete adaptation, while higher gains increase the adaptation rate but also the errors, by amplifying noise. After experimental verification on real subjects, we tested a variable gain that combines the advantages of both, and implemented it with directionally tuned neurons similar to those used to model human adaptation. This enables machine co-adaptation to locally improve myoelectric control, and to absorb more challenging perturbations. The simplified context used here enabled us to explore co-adaptation settings in both simulations and experiments, and to raise important considerations such as the need for a variable gain encoded locally.
The benefits and limits of extending this approach to more complex and functional myoelectric contexts are discussed.
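The gain trade-off reported above can be reproduced in a scalar toy model in which a human-like learner and the machine both update on the same signed directional error: zero machine gain adapts slowly, a moderate gain speeds adaptation, and a too-high gain amplifies noise. All parameters below are illustrative assumptions, and the model omits the paper's directionally tuned neurons:

```python
import numpy as np

def simulate(gain, trials=200, perturb=30.0, lr_human=0.1, noise_sd=2.0, seed=1):
    """Scalar toy model of human/machine co-adaptation to a visuomotor
    rotation; both learners correct the same residual directional error."""
    rng = np.random.default_rng(seed)
    human = machine = 0.0
    errors = []
    for _ in range(trials):
        err = perturb - human - machine + rng.normal(0.0, noise_sd)
        human += lr_human * err   # human adaptation to directional error
        machine += gain * err     # machine co-adaptation on the same error
        errors.append(err)
    return np.asarray(errors)

slow = simulate(gain=0.0)    # human alone: slow error reduction
fast = simulate(gain=0.4)    # moderate machine gain: faster adaptation
noisy = simulate(gain=1.5)   # excessive gain: residual errors amplify noise
```

Early errors shrink much faster with the moderate gain, while the high-gain run settles to a larger error variance, mirroring the simulated finding that motivates a variable gain.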

  18. Application of linear logic to simulation

    NASA Astrophysics Data System (ADS)

    Clarke, Thomas L.

    1998-08-01

Linear logic, since its introduction by Girard in 1987, has proven expressive and powerful. Linear logic has provided natural encodings of Turing machines, Petri nets and other computational models. Linear logic is also capable of naturally modeling resource-dependent aspects of reasoning. The distinguishing characteristic of linear logic is that it accounts for resources; two instances of the same variable are considered differently from a single instance. Linear logic thus must obey a form of the linear superposition principle. A proposition can be reasoned with only once, unless a special operator is applied. Informally, linear logic distinguishes two kinds of conjunction, two kinds of disjunction, and also introduces a modal storage operator that explicitly marks propositions that can be reused. This paper discusses the application of linear logic to simulation. A wide variety of logics have been developed; in addition to classical logic, there are fuzzy logics, affine logics, quantum logics, etc. All of these have found application in simulations of one sort or another. The special characteristics of linear logic and its benefits for simulation will be discussed. Of particular interest is a connection that can be made between linear logic and simulated dynamics by using the concepts of Lie algebras and Lie groups. Lie groups provide the connection between the exponential modal storage operators of linear logic and the eigenfunctions of dynamic differential operators. Particularly suggestive are possible relations between complexity results for linear logic and non-computability results for dynamical systems.

  19. Group II chaperonins: new TRiC(k)s and turns of a protein folding machine.

    PubMed

    Gutsche, I; Essen, L O; Baumeister, W

    1999-10-22

    In the past decade, the eubacterial group I chaperonin GroEL became the paradigm of a protein folding machine. More recently, electron microscopy and X-ray crystallography offered insights into the structure of the thermosome, the archetype of the group II chaperonins which also comprise the chaperonin from the eukaryotic cytosol TRiC. Some structural differences from GroEL were revealed, namely the existence of a built-in lid provided by the helical protrusions of the apical domains instead of a GroES-like co-chaperonin. These structural studies provide a framework for understanding the differences in the mode of action between the group II and the group I chaperonins. In vitro analyses of the folding of non-native substrates coupled to ATP binding and hydrolysis are progressing towards establishing a functional cycle for group II chaperonins. A protein complex called GimC/prefoldin has recently been found to cooperate with TRiC in vivo, and its characterization is under way. Copyright 1999 Academic Press.

  20. Wavelet-enhanced convolutional neural network: a new idea in a deep learning paradigm.

    PubMed

    Savareh, Behrouz Alizadeh; Emami, Hassan; Hajiabadi, Mohamadreza; Azimi, Seyed Majid; Ghafoori, Mahyar

    2018-05-29

Brain tumor segmentation is a challenging task to perform manually, which motivates the use of machine learning techniques. One of the machine learning techniques that has been given much attention is the convolutional neural network (CNN). The performance of the CNN can be enhanced by combining it with other data analysis tools such as the wavelet transform. In this study, one of the best-known implementations of the CNN, the fully convolutional network (FCN), was used in brain tumor segmentation and its architecture was enhanced by the wavelet transform. In this combination, the wavelet transform was used as a complementary and enhancing tool for the CNN in brain tumor segmentation. Comparing the performance of the basic FCN architecture against the wavelet-enhanced form revealed a remarkable superiority of the enhanced architecture in brain tumor segmentation tasks. Enhancing tools such as the wavelet transform and other mathematical functions can improve the performance of a CNN in any image processing task such as segmentation and classification.

  1. Artificial Intelligence in Precision Cardiovascular Medicine.

    PubMed

    Krittanawong, Chayakrit; Zhang, HongJu; Wang, Zhen; Aydar, Mehmet; Kitai, Takeshi

    2017-05-30

    Artificial intelligence (AI) is a field of computer science that aims to mimic human thought processes, learning capacity, and knowledge storage. AI techniques have been applied in cardiovascular medicine to explore novel genotypes and phenotypes in existing diseases, improve the quality of patient care, enable cost-effectiveness, and reduce readmission and mortality rates. Over the past decade, several machine-learning techniques have been used for cardiovascular disease diagnosis and prediction. Each problem requires some degree of understanding of the problem, in terms of cardiovascular medicine and statistics, to apply the optimal machine-learning algorithm. In the near future, AI will result in a paradigm shift toward precision cardiovascular medicine. The potential of AI in cardiovascular medicine is tremendous; however, ignorance of the challenges may overshadow its potential clinical impact. This paper gives a glimpse of AI's application in cardiovascular clinical care and discusses its potential role in facilitating precision cardiovascular medicine. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  2. The challenges of informatics in synthetic biology: from biomolecular networks to artificial organisms

    PubMed Central

    Ramoni, Marco F.

    2010-01-01

    The field of synthetic biology holds an inspiring vision for the future; it integrates computational analysis, biological data and the systems engineering paradigm in the design of new biological machines and systems. These biological machines are built from basic biomolecular components analogous to electrical devices, and the information flow among these components requires the augmentation of biological insight with the power of a formal approach to information management. Here we review the informatics challenges in synthetic biology along three dimensions: in silico, in vitro and in vivo. First, we describe state of the art of the in silico support of synthetic biology, from the specific data exchange formats, to the most popular software platforms and algorithms. Next, we cast in vitro synthetic biology in terms of information flow, and discuss genetic fidelity in DNA manipulation, development strategies of biological parts and the regulation of biomolecular networks. Finally, we explore how the engineering chassis can manipulate biological circuitries in vivo to give rise to future artificial organisms. PMID:19906839

  3. Applying knowledge engineering and representation methods to improve support vector machine and multivariate probabilistic neural network CAD performance

    NASA Astrophysics Data System (ADS)

    Land, Walker H., Jr.; Anderson, Frances; Smith, Tom; Fahlbusch, Stephen; Choma, Robert; Wong, Lut

    2005-04-01

Achieving consistent and correct database cases is crucial to the correct evaluation of any computer-assisted diagnostic (CAD) paradigm. This paper describes the application of artificial intelligence (AI), knowledge engineering (KE) and knowledge representation (KR) to a data set of ~2500 cases from six separate hospitals, with the objective of removing/reducing inconsistent outlier data. Several support vector machine (SVM) kernels were used to measure diagnostic performance of the original and a "cleaned" data set. Specifically, KE and KR principles were applied to the two data sets which were re-examined with respect to the environment and agents. One data set was found to contain 25 non-characterizable sets. The other data set contained 180 non-characterizable sets. CAD system performance was measured with both the original and "cleaned" data sets using two SVM kernels as well as a multivariate probabilistic neural network (PNN). Results demonstrated: (i) a 10% average improvement in overall Az and (ii) approximately a 50% average improvement in partial Az.
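Az, the area under the ROC curve used above to quantify CAD performance, can be computed directly from classifier scores. A minimal stdlib sketch on synthetic scores (illustrative only, not the paper's data or code):

```python
def az_score(scores, labels):
    """Area under the ROC curve (Az), computed as the normalized
    Mann-Whitney U statistic over all positive/negative score pairs."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# a perfect ranking gives Az = 1.0; chance-level ranking gives ~0.5
print(az_score([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0]))  # → 1.0
```

The same statistic applies whichever classifier (SVM kernel or PNN) produced the scores, which is what makes Az a fair basis for the before/after comparison reported here.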

  4. SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeMarco, J; McCloskey, S; Low, D

Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam™ linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor unit or control point. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions as well as perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and provides a match relative to the DICOM-RT plan file. Results: The trajectory log parsing-routine was compared against a standard record and verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site and a total of 1267 treatment fields were evaluated including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4-arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value for the maximum RMS MLC error was 0.067±0.001mm and 0.066±0.002mm for leaf bank A and B respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course including MLC leaf positions and table positions at time of image acquisition and during treatment.
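The per-leaf RMS error statistic reported above is straightforward once expected and actual leaf positions have been decoded from the log. A hedged sketch (the proprietary binary trajectory-log format is not parsed here; positions are assumed already extracted, in millimetres):

```python
import math

def max_rms_leaf_error(expected, actual):
    """expected/actual: one row per control point, one column per MLC leaf (mm).
    Returns the maximum over leaves of the RMS expected-vs-actual deviation."""
    n_cp = len(expected)
    n_leaves = len(expected[0])
    rms_per_leaf = []
    for leaf in range(n_leaves):
        sq_err = sum((expected[cp][leaf] - actual[cp][leaf]) ** 2
                     for cp in range(n_cp))
        rms_per_leaf.append(math.sqrt(sq_err / n_cp))
    return max(rms_per_leaf)
```

Evaluated per treatment field and per leaf bank, this yields the kind of maximum-RMS figure quoted in the results (on the order of 0.07 mm).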

  5. Coexistence of Multiple Nonlinear States in a Tristable Passive Kerr Resonator

    NASA Astrophysics Data System (ADS)

    Anderson, Miles; Wang, Yadong; Leo, François; Coen, Stéphane; Erkintalo, Miro; Murdoch, Stuart G.

    2017-07-01

    Passive Kerr cavities driven by coherent laser fields display a rich landscape of nonlinear physics, including bistability, pattern formation, and localized dissipative structures (solitons). Their conceptual simplicity has for several decades offered an unprecedented window into nonlinear cavity dynamics, providing insights into numerous systems and applications ranging from all-optical memory devices to microresonator frequency combs. Yet despite the decades of study, a recent theoretical work has surprisingly alluded to an entirely new and unexplored paradigm in the regime where nonlinearly tilted cavity resonances overlap with one another [T. Hansson and S. Wabnitz, J. Opt. Soc. Am. B 32, 1259 (2015), 10.1364/JOSAB.32.001259]. We use synchronously driven fiber ring resonators to experimentally access this regime and observe the rise of new nonlinear dissipative states. Specifically, we observe, for the first time to the best of our knowledge, the stable coexistence of temporal Kerr cavity solitons and extended modulation instability (Turing) patterns, and perform real-time measurements that unveil the dynamics of the ensuing nonlinear structure. When operating in the regime of continuous wave tristability, we further observe the coexistence of two distinct cavity soliton states, one of which can be identified as a "super" cavity soliton, as predicted by Hansson and Wabnitz. Our experimental findings are in excellent agreement with theoretical analyses and numerical simulations of the infinite-dimensional Ikeda map that governs the cavity dynamics. The results from our work reveal that experimental systems can support complex combinations of distinct nonlinear states, and they could have practical implications to future microresonator-based frequency comb sources.

  6. Reservoir computing with a slowly modulated mask signal for preprocessing using a mutually coupled optoelectronic system

    NASA Astrophysics Data System (ADS)

    Tezuka, Miwa; Kanno, Kazutaka; Bunsen, Masatoshi

    2016-08-01

Reservoir computing is a machine-learning paradigm based on information processing in the human brain. We numerically demonstrate reservoir computing with a slowly modulated mask signal for preprocessing by using a mutually coupled optoelectronic system. The performance of our system is quantitatively evaluated by a chaotic time series prediction task. Our system can produce performance comparable to reservoir computing with a single feedback system and a fast modulated mask signal. We show that it is possible to slow down the modulation speed of the mask signal by using the mutually coupled system in reservoir computing.
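The defining trait of reservoir computing is that only a linear readout is trained on the states of a fixed, randomly connected reservoir. A minimal numpy echo-state sketch on a one-step-ahead sine prediction task (a generic software reservoir, not the optoelectronic system of the paper; all sizes and constants are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                                          # reservoir size
W_in = rng.uniform(-0.5, 0.5, N)                 # fixed input weights ("mask")
W = rng.uniform(-0.5, 0.5, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9 (echo state)

def run_reservoir(u):
    """Drive the tanh reservoir with input sequence u; collect its states."""
    x = np.zeros(N)
    states = np.empty((len(u), N))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + W_in * ut)
        states[t] = x
    return states

# one-step-ahead prediction of a sine wave
s = np.sin(2 * np.pi * np.arange(600) / 40)
X, y = run_reservoir(s[:-1]), s[1:]
train = slice(100, 400)                          # discard a washout period
W_out = np.linalg.solve(X[train].T @ X[train] + 1e-6 * np.eye(N),
                        X[train].T @ y[train])   # ridge-regression readout
mse = np.mean((X[400:] @ W_out - y[400:]) ** 2)
```

In delay-based hardware implementations like the one above, the reservoir nodes are virtual (time-multiplexed by the mask signal), but the trained part is still just `W_out`.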

  7. Comparison of three different detectors applied to synthetic aperture radar data

    NASA Astrophysics Data System (ADS)

    Ranney, Kenneth I.; Khatri, Hiralal; Nguyen, Lam H.

    2002-08-01

    The U.S. Army Research Laboratory has investigated the relative performance of three different target detection paradigms applied to foliage penetration (FOPEN) synthetic aperture radar (SAR) data. The three detectors - a quadratic polynomial discriminator (QPD), Bayesian neural network (BNN) and a support vector machine (SVM) - utilize a common collection of statistics (feature values) calculated from the fully polarimetric FOPEN data. We describe the parametric variations required as part of the algorithm optimizations, and we present the relative performance of the detectors in terms of probability of false alarm (Pfa) and probability of detection (Pd).

  8. Sparse Bayesian learning machine for real-time management of reservoir releases

    NASA Astrophysics Data System (ADS)

    Khalil, Abedalrazq; McKee, Mac; Kemblowski, Mariush; Asefa, Tirusew

    2005-11-01

Water scarcity and uncertainties in forecasting future water availabilities present serious problems for basin-scale water management. These problems create a need for intelligent prediction models that learn and adapt to their environment in order to provide water managers with decision-relevant information related to the operation of river systems. This manuscript presents examples of state-of-the-art techniques for forecasting that combine excellent generalization properties and sparse representation within a Bayesian paradigm. The techniques are demonstrated as decision tools to enhance real-time water management. A relevance vector machine, which is a probabilistic model, has been used in an online fashion to provide confident forecasts given knowledge of some state and exogenous conditions. In practical applications, online algorithms should recognize changes in the input space and account for drift in system behavior. Support vector machines lend themselves particularly well to the detection of drift and hence to the initiation of adaptation in response to a recognized shift in system structure. The resulting model will normally have a structure and parameterization that suits the information content of the available data. The utility and practicality of this proposed approach have been demonstrated with an application in a real case study involving real-time operation of a reservoir in a river basin in southern Utah.
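scikit-learn does not ship Tipping's relevance vector machine, but its ARDRegression implements the same sparse Bayesian idea: automatic relevance determination prunes uninformative inputs and yields probabilistic predictions. A toy sketch of that behaviour (synthetic data; not the paper's reservoir-release model):

```python
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 5))
# only feature 0 actually drives the target; the rest are irrelevant
y = 3.0 * X[:, 0] + rng.normal(scale=0.05, size=200)

model = ARDRegression().fit(X, y)
pred, std = model.predict(X[:5], return_std=True)  # probabilistic forecast
```

The per-prediction standard deviation is what turns a point forecast into the confidence band a water manager can act on, and the pruned coefficients show which exogenous inputs the model deemed relevant.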

  9. Developing Ubiquitous Sensor Network Platform Using Internet of Things: Application in Precision Agriculture.

    PubMed

    Ferrández-Pastor, Francisco Javier; García-Chamizo, Juan Manuel; Nieto-Hidalgo, Mario; Mora-Pascual, Jerónimo; Mora-Martínez, José

    2016-07-22

The application of Information Technologies to Precision Agriculture methods has clear benefits. Precision Agriculture optimises production efficiency, increases quality, minimises environmental impact and reduces the use of resources (energy, water); however, different barriers have delayed its wide adoption. Among the main barriers are expensive equipment, difficulty of operation and maintenance, and sensor-network standards that are still under development. Nowadays, new technological developments in embedded devices (hardware and communication protocols), the evolution of Internet technologies (Internet of Things) and ubiquitous computing (Ubiquitous Sensor Networks) allow the development of less expensive systems that are easier to control, install and maintain, using standard protocols with low-power consumption. This work develops and tests a low-cost sensor/actuator network platform, based on the Internet of Things, integrating machine-to-machine and human-machine-interface protocols. Edge computing uses this multi-protocol approach to develop control processes in Precision Agriculture scenarios. A greenhouse with hydroponic crop production was developed and tested using Ubiquitous Sensor Network monitoring and edge control under the Internet of Things paradigm. The experimental results showed that Internet technologies and Smart Object Communication Patterns can be combined to encourage the development of Precision Agriculture, with added benefits (cost, energy, smart development, acceptance by agricultural specialists) when a project is launched.
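The edge-control idea reduces to local rules evaluated next to the sensors, with the cloud only informed of decisions. A minimal, hypothetical rule evaluator (the reading and threshold names are illustrative inventions, not the platform's actual protocol stack or parameters):

```python
def edge_control(readings, setpoints):
    """Map current sensor readings to actuator commands at the network edge.
    Runs locally so actuation does not depend on cloud connectivity."""
    return {
        "irrigation": "ON" if readings["soil_moisture"] < setpoints["soil_moisture_min"] else "OFF",
        "ventilation": "ON" if readings["air_temp"] > setpoints["air_temp_max"] else "OFF",
    }

# dry and hot greenhouse: irrigate and ventilate
commands = edge_control({"soil_moisture": 21.5, "air_temp": 31.0},
                        {"soil_moisture_min": 30.0, "air_temp_max": 28.0})
```

In a deployment such rules would sit behind machine-to-machine messaging (e.g. a publish/subscribe broker), with the human-machine interface only adjusting the setpoints.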

  10. VariantSpark: population scale clustering of genotype information.

    PubMed

    O'Brien, Aidan R; Saunders, Neil F W; Guo, Yi; Buske, Fabian A; Scott, Rodney J; Bauer, Denis C

    2015-12-10

Genomic information is increasingly used in medical practice, giving rise to the need for efficient analysis methodology able to cope with thousands of individuals and millions of variants. The widely used Hadoop MapReduce architecture and associated machine learning library, Mahout, provide the means for tackling computationally challenging tasks. However, many genomic analyses do not fit the MapReduce paradigm. We therefore utilise the recently developed SPARK engine, along with its associated machine learning library, MLlib, which offers more flexibility in the parallelisation of population-scale bioinformatics tasks. The resulting tool, VARIANTSPARK, provides an interface from MLlib to the standard variant format (VCF), offers seamless genome-wide sampling of variants and provides a pipeline for visualising results. To demonstrate the capabilities of VARIANTSPARK, we clustered more than 3,000 individuals with 80 million variants each to determine the population structure in the dataset. VARIANTSPARK is 80% faster than ADAM, the comparable Spark-based genome clustering approach, and the equivalent implementation using Hadoop/Mahout, as well as ADMIXTURE, a commonly used tool for determining individual ancestries. It is over 90% faster than traditional implementations using R and Python. The benefits of speed, resource consumption and scalability enable VARIANTSPARK to open up the usage of advanced, efficient machine learning algorithms to genomic data.
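At toy scale, the population-structure clustering that VARIANTSPARK distributes via Spark reduces to k-means over per-individual variant vectors. A hand-rolled numpy sketch with synthetic allele counts (this illustrates the clustering task only; it is not the VARIANTSPARK or MLlib API):

```python
import numpy as np

def kmeans(X, init_idx, iters=10):
    """Plain k-means; init_idx picks the rows used as initial centers."""
    centers = X[list(init_idx)].astype(float)
    for _ in range(iters):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0)
                            for j in range(len(init_idx))])
    return labels

# synthetic genotypes: allele counts (0/1/2) for two diverged populations
rng = np.random.default_rng(1)
pop_a = rng.choice([0, 1], size=(10, 50), p=[0.9, 0.1])
pop_b = rng.choice([1, 2], size=(10, 50), p=[0.1, 0.9])
X = np.vstack([pop_a, pop_b])
labels = kmeans(X, init_idx=(0, 19))  # individuals cluster by population
```

The engineering contribution of tools in this space is not the algorithm but making this scale from a 20×50 toy matrix to 3,000 individuals × 80 million variants.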

  11. Developing Ubiquitous Sensor Network Platform Using Internet of Things: Application in Precision Agriculture

    PubMed Central

    Ferrández-Pastor, Francisco Javier; García-Chamizo, Juan Manuel; Nieto-Hidalgo, Mario; Mora-Pascual, Jerónimo; Mora-Martínez, José

    2016-01-01

The application of Information Technologies to Precision Agriculture methods has clear benefits. Precision Agriculture optimises production efficiency, increases quality, minimises environmental impact and reduces the use of resources (energy, water); however, different barriers have delayed its wide adoption. Among the main barriers are expensive equipment, difficulty of operation and maintenance, and sensor-network standards that are still under development. Nowadays, new technological developments in embedded devices (hardware and communication protocols), the evolution of Internet technologies (Internet of Things) and ubiquitous computing (Ubiquitous Sensor Networks) allow the development of less expensive systems that are easier to control, install and maintain, using standard protocols with low-power consumption. This work develops and tests a low-cost sensor/actuator network platform, based on the Internet of Things, integrating machine-to-machine and human-machine-interface protocols. Edge computing uses this multi-protocol approach to develop control processes in Precision Agriculture scenarios. A greenhouse with hydroponic crop production was developed and tested using Ubiquitous Sensor Network monitoring and edge control under the Internet of Things paradigm. The experimental results showed that Internet technologies and Smart Object Communication Patterns can be combined to encourage the development of Precision Agriculture, with added benefits (cost, energy, smart development, acceptance by agricultural specialists) when a project is launched. PMID:27455265

  12. New Trends in E-Science: Machine Learning and Knowledge Discovery in Databases

    NASA Astrophysics Data System (ADS)

    Brescia, Massimo

    2012-11-01

Data mining, or Knowledge Discovery in Databases (KDD), while being the main methodology to extract the scientific information contained in Massive Data Sets (MDS), needs to tackle crucial problems since it has to orchestrate complex challenges posed by transparent access to different computing environments, scalability of algorithms and reusability of resources. To achieve a leap forward for the progress of e-science in the data avalanche era, the community needs to implement an infrastructure capable of performing data access, processing and mining in a distributed but integrated context. The increasing complexity of modern technologies has led to a huge production of data, whose warehouse management, together with the need to optimize analysis and mining procedures, is changing the very conception of modern science. Classical data exploration, based on local storage of a user's own data and limited computing infrastructure, is no longer efficient in the case of MDS spread worldwide over inhomogeneous data centres and requiring teraflop processing power. In this context modern experimental and observational science requires a good understanding of computer science, network infrastructures, data mining, etc., i.e. of all those techniques which fall into the domain of so-called e-science (recently assessed also by the Fourth Paradigm of Science). Such understanding is almost completely absent in the older generations of scientists, and this is reflected in the inadequacy of most academic and research programs. A paradigm shift is needed: statistical pattern recognition, object oriented programming, distributed computing and parallel programming need to become an essential part of the scientific background. A possible practical solution is to provide the research community with easy-to-understand, easy-to-use tools based on Web 2.0 technologies and Machine Learning methodology: tools where almost all the complexity is hidden from the final user, but which are still flexible and able to produce efficient and reliable scientific results. All these considerations will be described in detail in the chapter. Moreover, examples of modern applications offering a wide variety of e-science communities a large spectrum of computational facilities to exploit the wealth of available massive data sets and powerful machine learning and statistical algorithms will also be introduced.

  13. Invasive Vibrio cholerae Infection Following Burn Injury

    DTIC Science & Technology

    2008-06-01

revealed no infiltrates. Labs were significant for normal renal and liver chemistries, normal white blood cell count, a mild normocytic anemia, and a...knee amputation, and was noted to have bilateral orbital compartment syndrome requiring cantholysis. Given that both blood and urine cultures...and airway pressure release ventilation. Multiple admission blood cultures revealed growth of multidrug-resistant Acinetobacter calcoaceticus

  14. Military Review: The Professional Journal of the U.S. Army. Volume 80, Number 5, September-October 2000

    DTIC Science & Technology

    2000-10-01

Studying Normandy today looks like arrows and unit symbols. Normandy on 6 June 1944 looked like Saving Private Ryan: dangerous and chaotic. For some this is mostly a time of high anxiety; for me it is also high adventure

  15. U.S. EPA, Pesticide Product Label, TIME-SAVER LIQUID BACTERICIDE, 02/04/2003

    EPA Pesticide Factsheets

    2011-04-14


  16. Above-Campus Services: Shaping the Promise of Cloud Computing for Higher Education

    ERIC Educational Resources Information Center

    Wheeler, Brad; Waggener, Shelton

    2009-01-01

    The concept of today's cloud computing may date back to 1961, when John McCarthy, retired Stanford professor and Turing Award winner, delivered a speech at MIT's Centennial. In that speech, he predicted that in the future, computing would become a "public utility." Yet for colleges and universities, the recent growth of pervasive, very high speed…

  17. U. S. Naval Forces, Vietnam Monthly Historical Summary for March 1969

    DTIC Science & Technology

    1967-10-02

...tures prior to the start of the mission. At about 1730 on the 19th PCF 101 was proceeding down the Cua Dai River from Hoi An in company with PCF...effectiveness of night operations was developed. A 23-inch XENON tank searchlight, modified with a pink filter to provide compatibility with a starlight scope

  18. Prediction and Control of Residual Stresses and Distortion in HY-130 Thick Pipe Weldments

    DTIC Science & Technology

    1979-05-01

quality. The most common type of shielding gas used in GMAW of low carbon steel is Argon with 25% CO2. However, in the case of HY-130, experience has...supporting structure for the cylinder which would permit cylinder rotation at a controlled speed with respect to the fixed position of the GMAW torch

  19. American Foreign Policy: Regional Perspectives

    DTIC Science & Technology

    2009-05-15

than in Mexico—and in greater danger of being overwhelmed by criminal activity. Their reliance on U.S. trade, investment, tourism, and remittances...U.S. FOREIGN POLICY AMBASSADOR (RET.) DAVID C. LITT Broadly speaking, American diplomacy and efforts to support good governance, transparency, and...beyond raw materials, moreover. Some eight hundred Chinese companies now operate in Africa in agriculture, telecommunications, health, tourism

  20. The Effects of Random and Nonlinear Waves on Coastal and Offshore Structures

    DTIC Science & Technology

    1987-07-01

Barik and Paramasivam [2], Dao and Penzien [3], Leonard, et al. [4], and Tuali and Hudspeth [8]. For a real sea state, the superposition of linear..." Ocean Engng., Vol. 10, No. 5, 1983, pp. 303-312. [2] Barik, K. C. and V. Paramasivam, "Response Analysis of Offshore Structures," J. Waterways Port

  1. Uncertainty and Decision Making

    DTIC Science & Technology

    1979-09-01

higher productivity and satisfaction than a nonsupportive co-worker and enriched tasks affected attitudes but not performance. The greatest uncertainty...leadership style, goals, and task structure) on psychological uncertainty and the resultant effect on performance and satisfaction. People...turn related to satisfaction and performance. In general, a structuring leadership style, specific goals and a structured task result in lower unce

  2. Computational Analysis and Experimental Validation of the Friction-Stir Welding Behavior of Ti-6Al-4V

    DTIC Science & Technology

    2011-01-01

temperature and high-strength workpiece materials like titanium. Specifically, it was shown that due to high attendant temperatures these tools...relative amounts of the two phases and are typically classified as α-type, α+β-type, and β-type alloys. Among titanium alloys, α+β-type are of

  3. Assessment of Optical Turbulence Profiles Derived From Probabilistic Climatology

    DTIC Science & Technology

    2007-03-01

4.3.1 Transformed Data Results ... 4.3.2 Untransformed Data Results ... 4.4 Application of...the needed firepower to destroy surface based enemy targets. Courtesy of Boeing Corporation. http://www.boeing.com/news/feature/aa2004/backgrounders...medium is cornerstone to successful employment of these HELs. 1.3 Introduction to Optical Turbulence. Lethal application of directed energy firepower

  4. Heat stability of cured urea-formaldehyde resins by measuring formaldehyde emission

    Treesearch

    Shin-ichiro Tohmura; Chung-Yun Hse; Mitsuo Higuchi

    1999-01-01

A test method for measuring formaldehyde from urea-formaldehyde (UF) resins at high temperatures was developed and used to assess the influence of the reaction pH at synthesis on the formaldehyde emission during cure and heat stability of the cured resins without water. Additionally, 13C-CP/MAS solid-state nuclear magnetic resonance (NMR)...

  5. MIT Lincoln Laboratory Annual Report 2010

    DTIC Science & Technology

    2010-01-01

Research and Development Center (FFRDC) and a DoD Research and Development Laboratory. The Laboratory conducts research and development pertinent to...year, the Laboratory restructured three divisions to focus research and development in areas that are increasingly important to the nation...the Director...Collaborations with MIT campus continue to grow, leveraging the strengths of researchers at both the Laboratory and campus. The

  6. Formal Foundations for the Specification of Software Architecture.

    DTIC Science & Technology

    1995-03-01

Architectures Formally: A Case-Study Using KWIC." Kestrel Institute, Palo Alto, CA 94304, April 1994. 58. Kang, Kyo C. Feature-Oriented Domain Analysis (FODA)...6.3.5 Constraint-Based Architectures ... 6-60 6.4 Summary ... 6-63 VII. Analysis of Process-Based...between these architecture theories were investigated. A feasibility analysis on an image processing application demonstrated that architecture theories

  7. JPRS Report Soviet Union Political Affairs.

    DTIC Science & Technology

    1990-07-27

consciousness. Under these conditions the unresolved state of many social, political, and national problems acquired a special urgency. Meanwhile, the...whole. These errors consist entirely of crude distortions of party policy guidelines. And the tragic mistakes of recent years represent departures...kolkhozes, medical institutions and even the soccer team. And so, there is a great discrepancy between words and actions with respect to mutual

  8. Reliability Validation and Improvement Framework

    DTIC Science & Technology

    2012-11-01

systems. Steps in that direction include the use of the Architecture Tradeoff Analysis Method® (ATAM®) developed at the Carnegie Mellon...embedded software • cyber-physical systems (CPSs) to indicate that the embedded software interacts with, manages, and controls a physical system [Lee...the use of formal static analysis methods to increase our confidence in system operation beyond testing. However, analysis results

  9. Area Handbook Series: Finland: A Country Study

    DTIC Science & Technology

    1988-12-01

DEMOGRAPHY ... 79 External Migration ... 81 Internal...blue-collar workers (see Demography; Social Structure, ch. 2). Along with the changes in social and in economic circumstances went changes in popular...examination of the social forces involved in the formation of the Finnish state. C. Leonard Lundin's Finland in the Second World War was a pioneering work

  10. Post-GOE redox insights from Mo isotopes, Ce anomalies, and Mn from the 2.24 Ga Kazput Formation

    NASA Astrophysics Data System (ADS)

    Thoby, M.; Konhauser, K.; Philippot, P.; Killingsworth, B.; Warchola, T.; Lalonde, S.

    2017-12-01

    Following the Great Oxidation event (GOE) defined from 2.45 to 2.2 Ga, an event marking the first appearance of widespread atmospheric oxygen, a combination of decreased Mn(II) supply from land and increased Mn(IV)-precipitation in the oceans should have resulted in lower concentrations of Mn in seawater. Nevertheless, it appears that some early Proterozoic marine sediments record high seawater Mn concentrations hundreds of millions of years after the GOE. Here we investigate a Mn excursion associated with marine carbonates and shales of the 2.31 Ga Kazput Formation. Samples were recovered from drill core collected during the Turee Creek Drilling Project (TCDP). Using molybdenum (Mo) isotope data coupled with cerium (Ce) anomalies, we define the redox condition of the Kazput depositional environment. Initial results show no Mo fractionation and few cerium anomalies in carbonates, pointing to an anoxic basin without Mn oxide precipitates. Additionally, XRF data on the shales indicates an association of Mn with calcium (Ca) suggesting an anoxic environment at the time of their deposition. Our results provide new insights into the nature and environment of the Turee Creek basin and the extent of oxygenation of surface waters after the GOE.

  11. Arctic Stratospheric Temperature In The Winters 1999/2000 and 2000/2001: A Quantitative Assessment and Microphysical Implications

    NASA Astrophysics Data System (ADS)

    Buss, S.; Wernli, H.; Peter, T.; Kivi, R.; Bui, T. P.; Kleinböhl, A.; Schiller, C.

Stratospheric winter temperatures play a key role in the chain of microphysical and chemical processes that lead to the formation of polar stratospheric clouds (PSCs), chlorine activation and eventually to stratospheric ozone depletion. Here the temperature conditions during the Arctic winters 1999/2000 and 2000/2001 are quantitatively investigated using observed profiles of water vapour and nitric acid, and temperatures from high-resolution radiosondes and aircraft observations, global ECMWF and UKMO analyses and mesoscale model simulations over Scandinavia and Greenland. The ECMWF model resolves parts of the gravity wave activity and generally agrees well with the observations. However, for the very cold temperatures near the ice frost point the ECMWF analyses have a warm bias of 1-6 K compared to radiosondes. For the mesoscale model HRM, this bias is generally reduced due to a more accurate representation of gravity waves. Quantitative estimates of the impact of the mesoscale temperature perturbations indicate that over Scandinavia and Greenland the wave-induced stratospheric cooling (as simulated by the HRM) affects only moderately the estimated chlorine activation and homogeneous NAT particle formation, but strongly enhances the potential for ice formation.

  12. Neurobiomimetic constructs for intelligent unmanned systems and robotics

    NASA Astrophysics Data System (ADS)

    Braun, Jerome J.; Shah, Danelle C.; DeAngelus, Marianne A.

    2014-06-01

    This paper discusses a paradigm we refer to as neurobiomimetic, which involves emulations of brain neuroanatomy and neurobiology aspects and processes. Neurobiomimetic constructs include rudimentary and down-scaled computational representations of brain regions, sub-regions, and synaptic connectivity. Many different instances of neurobiomimetic constructs are possible, depending on various aspects such as the initial conditions of synaptic connectivity, number of neuron elements in regions, connectivity specifics, and more, and we refer to these instances as `animats'. While downscaled for computational feasibility, the animats are very large constructs; the animats implemented in this work contain over 47,000 neuron elements and over 720,000 synaptic connections. The paper outlines aspects of the animats implemented, spatial memory and learning cognitive task, the virtual-reality environment constructed to study the animat performing that task, and discussion of results. In a broad sense, we argue that the neurobiomimetic paradigm pursued in this work constitutes a particularly promising path to artificial cognition and intelligent unmanned systems. Biological brains readily cope with challenges of real-life tasks that consistently prove beyond even the most sophisticated algorithmic approaches known. At the cross-over point of neuroscience, cognitive science and computer science, paradigms such as the one pursued in this work aim to mimic the mechanisms of biological brains and as such, we argue, may lead to machines with abilities closer to those of biological species.

  13. Toward a hybrid brain-computer interface based on repetitive visual stimuli with missing events.

    PubMed

    Wu, Yingying; Li, Man; Wang, Jing

    2016-07-26

Steady-state visually evoked potentials (SSVEPs) can be elicited by repetitive stimuli and extracted in the frequency domain with satisfactory performance. However, the temporal information of such stimuli is often ignored. In this study, we utilized repetitive visual stimuli with missing events to present a novel hybrid BCI paradigm based on SSVEP and omitted stimulus potential (OSP). Four discs flickering from black to white with missing flickers served as visual stimulators to simultaneously elicit subjects' SSVEPs and OSPs. Key parameters in the new paradigm, including flicker frequency, optimal electrodes, missing flicker duration and intervals of missing events were qualitatively discussed with offline data. Two omitted flicker patterns including missing black/white disc were proposed and compared. Averaging times were optimized with Information Transfer Rate (ITR) in online experiments, where SSVEPs and OSPs were identified using Canonical Correlation Analysis in the frequency domain and Support Vector Machine (SVM)-Bayes fusion in the time domain, respectively. The online accuracy and ITR (mean ± standard deviation) over nine healthy subjects were 79.29 ± 18.14 % and 19.45 ± 11.99 bits/min with the missing black disc pattern, and 86.82 ± 12.91 % and 24.06 ± 10.95 bits/min with the missing white disc pattern, respectively. The proposed BCI paradigm, for the first time, demonstrated that SSVEPs and OSPs can be simultaneously elicited by a single visual stimulus pattern and recognized in real time with satisfactory performance. Besides the frequency features such as SSVEP elicited by repetitive stimuli, we found a new feature (OSP) in the time domain to design a novel hybrid BCI paradigm by adding missing events in repetitive stimuli.
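The frequency-domain SSVEP identification described above amounts to scoring the EEG segment against sin/cos references at each candidate flicker frequency and taking the best match. A single-channel numpy sketch, with a least-squares projection standing in for full multichannel CCA (synthetic signal, illustrative frequencies):

```python
import numpy as np

def ssvep_score(x, freq, fs, harmonics=2):
    """Correlation between signal x and its projection onto the sin/cos
    reference subspace at `freq` (CCA reduces to this for one channel)."""
    t = np.arange(len(x)) / fs
    refs = np.column_stack([f(2 * np.pi * h * freq * t)
                            for h in range(1, harmonics + 1)
                            for f in (np.sin, np.cos)])
    x = x - x.mean()
    coef, *_ = np.linalg.lstsq(refs, x, rcond=None)
    return np.corrcoef(x, refs @ coef)[0, 1]

def detect_frequency(x, candidates, fs):
    return max(candidates, key=lambda f: ssvep_score(x, f, fs))

# a noisy 12 Hz response is identified among four candidate flicker rates
fs = 250
t = np.arange(500) / fs
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * np.random.default_rng(3).normal(size=500)
```

The OSP component of the hybrid paradigm is the part this sketch omits: it lives in the time domain and is classified separately (SVM-Bayes fusion in the paper) before the two decisions are combined.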

  14. An introductory analysis of digital infrared thermal imaging guided oral cancer detection using multiresolution rotation invariant texture features

    NASA Astrophysics Data System (ADS)

    Chakraborty, M.; Das Gupta, R.; Mukhopadhyay, S.; Anjum, N.; Patsa, S.; Ray, J. G.

    2017-03-01

    This manuscript presents an analytical treatment of the feasibility of multi-scale Gabor filter bank responses for non-invasive oral cancer pre-screening and detection in the long infrared spectrum. The incapability of present healthcare technology to detect oral cancer at the budding stage manifests in a high mortality rate. The paper contributes a step towards automation in non-invasive computer-aided oral cancer detection using an amalgamation of image processing and machine intelligence paradigms. Previous works have shown the discriminative difference of facial temperature distribution between a normal subject and a patient. The proposed work, for the first time, exploits this difference further by representing the facial Region of Interest (ROI) using multiscale rotation invariant Gabor filter bank responses followed by classification using a Radial Basis Function (RBF) kernelized Support Vector Machine (SVM). The proposed study reveals an initial increase in classification accuracy with incrementing image scales followed by degradation of performance; an indication that the addition of more and more finer scales tends to embed noisy information instead of discriminative texture patterns. Moreover, the performance is consistently better for filter responses from profile faces compared to frontal faces. This is primarily attributed to the ineptness of Gabor kernels at analyzing low spatial frequency components over a small facial surface area. On our dataset comprising 81 malignant, 59 pre-cancerous, and 63 normal subjects, we achieve state-of-the-art accuracy of 85.16% for normal v/s precancerous and 84.72% for normal v/s malignant classification. This sets a benchmark for further investigation of multiscale feature extraction paradigms in the IR spectrum for oral cancer detection.
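    A multiscale, rotation-pooled Gabor representation of the kind described can be sketched as follows. This is a hedged illustration, not the paper's pipeline: the kernel parameters, the pooling choice (averaging responses over orientation to obtain rotation invariance), and the random stand-in thermal ROI are assumptions, and the RBF-SVM classification stage is omitted.

    ```python
    import numpy as np

    def gabor_kernel(size, wavelength, theta, sigma):
        """Real Gabor kernel: an oriented sinusoid under a Gaussian envelope."""
        r = np.arange(size) - size // 2
        x, y = np.meshgrid(r, r)
        xr = x * np.cos(theta) + y * np.sin(theta)
        return (np.exp(-(x**2 + y**2) / (2 * sigma**2))
                * np.cos(2 * np.pi * xr / wavelength))

    def rotation_invariant_features(img, wavelengths, n_orient=6, size=15):
        """Mean/std of responses, pooled over orientation at each scale."""
        feats = []
        for lam in wavelengths:
            responses = []
            for k in range(n_orient):
                ker = gabor_kernel(size, lam, np.pi * k / n_orient, 0.5 * lam)
                # FFT-based circular convolution at the image size
                resp = np.real(np.fft.ifft2(np.fft.fft2(img) *
                                            np.fft.fft2(ker, img.shape)))
                responses.append(np.abs(resp))
            pooled = np.mean(responses, axis=0)  # orientation pooling
            feats += [pooled.mean(), pooled.std()]
        return np.array(feats)

    rng = np.random.default_rng(1)
    roi = rng.random((64, 64))                   # stand-in thermal ROI
    f = rotation_invariant_features(roi, wavelengths=[4, 8, 16])
    print(f.shape)  # → (6,)
    ```

    The resulting fixed-length feature vector is what a kernelized SVM would then consume, one vector per facial ROI.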

  15. ESNIB (European Science Notes Information Bulletin): Reports on Current European/Middle Eastern Science

    DTIC Science & Technology

    1989-11-01

    The TERMOS is a digital terrain modeling system developed ... a tool for planning, programming, simulating, initiating, and surveying small-scale ... workshop featuring the European Strategic Program for Research and Development in Information Technologies (FRG) ... Conference Language: English ... Research and Development in the Numerical Aerodynamic Systems Program, R. Bailey, NASA ... self-addressed mailer and return it to ONREUR.

  16. Reliability and Maintainability Analysis: A Conceptual Design Model

    DTIC Science & Technology

    1972-03-01

    Elements For a System: I. Research and Development; A. Preliminary design and engineering; B. Fabrication of test equipment; C. Test operations; D...reliability requirements, little, if any, modularization and automatic test features would be incorporated in the subsystem design, limited reliability...maintainability testing and monitoring would be conducted during development, and little Quality Control effort in the reliability/maintainability

  17. Refugee Operations: Cultures in Conflict.

    DTIC Science & Technology

    1982-12-01

    people from different cultures have unconscious, ingrained assumptions about personal space, interpersonal relations, and the function of time. Those...deal of anxiety among the refugees and administrators. For example, at Fort McCoy it was frequently observed by civilian employees and refugees that...involved in a variety of activities with several different people at any given time. On the other hand, low-context cultures (interpreters, managers

  18. Reactive Collisions and Final State Analysis in Hypersonic Flight Regime

    DTIC Science & Technology

    2016-09-13

    Kelvin.[7] The gas-phase, surface reactions and energy transfer at these temperatures are essentially uncharacterized and the experimental methodologies...high temperatures (1000 to 20000 K) and compared with results from experimentally derived thermodynamics quantities from the NASA CEA (NASA Chemical...with a reproducing kernel Hilbert space (RKHS) method[13] combined with Legendre polynomials; (2) quasi-classical trajectory (QCT) calculations to study

  19. Native Shellfish in Nearshore Ecosystems of Puget Sound

    DTIC Science & Technology

    2006-04-01

    Key parameters include temperature and salinity, turbidity, oxygen, pollutants, and food types and concentrations. All these can be affected by...variety of other organisms, depending on the stage in their life history. Larvae (in the plankton) are eaten by coho and chinook salmon and...of particular year classes are probably determined by larval survival to metamorphosis, which depends on predation, water temperatures, food

  20. Medical Services: Preventive Medicine

    DTIC Science & Technology

    1990-10-15

    and comfort. Barracks are ventilated to dilute unpleasant odors, tobacco smoke, airborne microorganisms and dusts, and to reduce temperature and...injury in cold climates by wearing proper cold-weather clothing and frequently changing socks to keep feet dry, by careful handling of gasoline-type...that refugee enclaves and prisoner compounds do not become foci of epidemic disease. (4) Environmental engineering service, LC teams. The LC teams will

  1. The Shock and Vibration Digest. Volume 18, Number 8

    DTIC Science & Technology

    1986-08-01

    the swash plate. This is an active control system...that vibration can be reduced by separation of...element program model. Structure-borne sound intensity has been tried earlier on thin-plate constructions in...The agreement is shown to be very good. A...predicting the response of two displacement controlled laboratory tests that were used for the determination of the model parameters. 86-1532

  2. Systemic Review and Meta-analysis of Randomized Clinical Trials Comparing Primary vs Delayed Primary Skin Closure in Contaminated and Dirty Abdominal Incisions

    DTIC Science & Technology

    2013-06-26

    Chatwiriyacharoen14 Betadine gauze Unclear 5 Unclear Until suitable for suture Purulent discharge or material or surrounding cellulitis Unclear Reopened and...Opened McGreal et al20 Povidone-iodine (1%)-soaked wick Subcuticular suture 4 Unclear Steri-Strips on day 4 Cellulitis, culture-positive

  3. Effect of Time and Temperature on Transformation Toughened Zirconias.

    DTIC Science & Technology

    1987-06-01

    room temperature. High temperature mechanical tests performed were stress rupture and stepped temperature stress rupture. The results of the tests...tetragonal precipitates will spontaneously transform to the monoclinic phase due to the lattice mismatch stress if they become larger than about 0.2 µm, with...specimens, including fast fracture and fracture toughness testing. High temperature testing consisting of stress rupture and stepped temperature stress

  4. Encapsulated Decon for Use on Medical Patients

    DTIC Science & Technology

    1983-12-01

    Development of effective decon microcapsules was based on a series of tasks performed in this study. The preliminary tasks included a literature search...culminated with evaluating selected microcapsules on pig skin samples, with HD, GB, and GD. Results appear encouraging. The best capsule performance...term contact. In addition, a brief study showed magnetite can be incorporated into the capsule wall to provide magnetic microcapsules that can be

  5. JPRS Report, China

    DTIC Science & Technology

    1989-07-07

    a number of rehabilitated comrades went all-out to get an invitation card to attend the memorial service. Many old people went to Yaobang’s...residence to express their mourning for the deceased. At the memorial service, they wept bitterly before the remains of comrade Yaobang. Facing the...culture is to cope with the development of commodity economy, to strengthen ideological and political work, to enhance the level of commercial

  6. Viable Legionella Pneumophila Not Detectable by Culture on Agar Media

    DTIC Science & Technology

    1987-09-01

    microorganisms released to the environment becomes a prime consideration in risk assessment. Culture methods have...detection of microorganisms in the environment. In this sense, Legionella pneumophila, the agent of Legionnaires’ pneumonia and related illnesses, poses a microbiological dilemma...The ability to...may not always be culturable. We surveyed environmental...samples collected from...sources

  7. A Social Network Approach to Understanding an Insurgency

    DTIC Science & Technology

    2007-07-01

    and a framework for testing theories regarding structured social relationships.6 Equally relevant is the understanding of a social network approach...A Social Network Approach to Understanding an Insurgency BRIAN REED The study of networks, interactions, and relationships has a long history...characteristics of social network analysis is often counter-intuitive to traditional military thinking, rooted in the efficiency of a hierarchy that

  8. MassTag Polymerase Chain Reaction for Differential Diagnosis of Viral Hemorrhagic Fevers

    DTIC Science & Technology

    2006-04-01

    fever virus (RVFV), Crimean-Congo hemorrhagic fever virus (CCHFV), and hantaviruses (Bunyaviridae); and...ribavirin may be helpful if given early in the course of Lassa fever (9), Crimean-Congo hemorrhagic fever (10), or hemorrhagic fever with renal...I, Erol S, Erdem F, Yilmaz N, Parlak M, et al. Crimean-Congo hemorrhagic fever in eastern Turkey: clinical features, risk factors and efficacy

  9. Aggregating and Communicating Uncertainty.

    DTIC Science & Technology

    1980-04-01

    wholes, the organic, inclusive structures of events. In artificial intelligence applications, these wholes are sometimes called frames, scripts, or...Here, we will use a somewhat more artificial example. It is well known that many of the more hawkish forces within the Soviet Union believe that a...the actual figures; another nation, which wants to give an exaggerated vision of its capabilities, may provide artificially inflated figures. DE

  10. Balancing the Rates of New Bone Formation and Polymer Degradation Enhances Healing of Weight-Bearing Allograft/Polyurethane Composites in Rabbit Femoral Defects

    DTIC Science & Technology

    2014-10-03

    fractures? J Orthop Res 29, 33, 2011. Epub 2010/07/08. 2. Russell, T.A., and Leighton, R.K. Comparison of autogenous bone graft and endothermic...Boyd, S.K., Christiansen, B.A., Guldberg, R.E., Jepsen, K.J., and Muller, R. Guidelines for assessment of bone microstructure in rodents using micro

  11. Bridging the Religious Divide

    DTIC Science & Technology

    2006-01-01

    characteristics of Islam itself. Francis Fukuyama takes a similar point of view concerning the culture of Islamists: “Extremists exploit the common...insisted that their intolerant Wahhabi strain must be adopted by all Kashmiris. Women were to adopt the veil, and music was forbidden. They also...Muslim youth to perpetuate violence.14 This influence by the Imams, Mullahs, and clerics over the young, disenfranchised, and impressionable is more

  12. Development of the Enlisted Panel Research Data Base

    DTIC Science & Technology

    1990-01-01

    Loss Files, Accession File, Army Classification Battery Composite Scores pertaining to accession, the Skills Qualifying Test (SQT) data from the SQT...inclusive. Specific accession data variables, including composite score data from the Army Classification Battery Test (ACB), are captured for each...included. To broaden the scope of information for each individual, Skill Qualifying Test (SQT) scores were kept beginning in 1980 and, as of fiscal year

  13. Criteria for Hull-Machinery Rigidity Compatibility,

    DTIC Science & Technology

    1981-05-01

    that between extreme ship load conditions the radius of curvature of this circular arc should not be less than 30 × 10³ m. Engine manufacturers have...

  14. Defense AT and L. Volume 43, Number 4

    DTIC Science & Technology

    2014-08-01

    nature, guidance can become dated soon after it is published since it is typically anticipatory or reactive in nature. Tearing down boundaries and...how we protect and defend the United States and its allies. Those technologies began as ideas that were nurtured, guarded and secured by...intimidation by criminals/insurgents, and safe facilities for their workers. They also need reliable infrastructure for their manufacturing facilities

  15. NBIC-Convergence as a Paradigm Platform of Sustainable Development

    NASA Astrophysics Data System (ADS)

    Dotsenko, Elena

    2017-11-01

    Today, the fastest rates of scientific and technological development are found in nano-systems and the materials industry, information and communication systems, and the spheres of direct human impact on the environment: the power industry, urbanization, and industrial infrastructure. The accelerating replacement of humans by machines and robots, the construction of megacities, and the transportation of huge volumes of environmentally hazardous goods take place against a background of intensive generation of knowledge and the transition of the results of fundamental research into specific production technologies. In this process, on the one hand, a fundamentally new format for the technological restructuring of the world economy is being developed. On the other hand, a new platform for human-environment interaction is being formed, where both positive and negative environmental impacts will, in the near future, be determined by unstudied factors. The reason for this is the forthcoming replacement of the technologies familiar to us, however dynamically developing, by fundamentally new, convergent ones. Entering the front line of technological development, NBIC-convergence, requires a new paradigm of sustainable development.

  16. Development of a neural net paradigm that predicts simulator sickness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allgood, G.O.

    1993-03-01

    A disease exists that affects pilots and aircrew members who use Navy Operational Flight Training Systems. This malady, commonly referred to as simulator sickness and whose symptomatology closely aligns with that of motion sickness, can compromise the use of these systems because of a reduced utilization factor, negative transfer of training, and reduction in combat readiness. A report is submitted that develops an artificial neural network (ANN) and behavioral model that predicts the onset and level of simulator sickness in the pilots and aircrews who use these systems. It is proposed that the paradigm could be implemented in real time as a biofeedback monitor to reduce the risk to users of these systems. The model captures the neurophysiological impact of use (human-machine interaction) by developing a structure that maps the associative and nonassociative behavioral patterns (learned expectations) and vestibular (otolith and semicircular canals of the inner ear) and tactile interaction, derived from system acceleration profiles, onto an abstract space that predicts simulator sickness for a given training flight.
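    The mapping the report describes, from acceleration-derived vestibular/tactile features onto a sickness prediction, can be caricatured with a small feed-forward network trained by gradient descent. Everything below (the features, the target function, the network size) is fabricated for illustration; it shows only the mechanics of such an ANN, not the report's model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Fabricated stand-ins: 3 features derived from acceleration profiles
    # (e.g., RMS vestibular-axis acceleration, exposure time), target = a
    # synthetic sickness score. No real data is represented here.
    X = rng.random((200, 3))
    y = (0.6 * X[:, 0] + 0.3 * X[:, 1]**2 + 0.1 * X[:, 2])[:, None]

    # One-hidden-layer network trained with plain batch gradient descent
    W1 = rng.standard_normal((3, 8)) * 0.5; b1 = np.zeros(8)
    W2 = rng.standard_normal((8, 1)) * 0.5; b2 = np.zeros(1)
    lr = 0.1
    for _ in range(2000):
        h = np.tanh(X @ W1 + b1)            # hidden layer
        pred = h @ W2 + b2                  # linear output
        err = pred - y
        # backpropagation of the squared-error loss
        gW2 = h.T @ err / len(X); gb2 = err.mean(0)
        dh = (err @ W2.T) * (1 - h**2)
        gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

    mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y)**2))
    print(round(mse, 4))
    ```

    A deployed biofeedback monitor would evaluate only the cheap forward pass in real time, with training done offline.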

  17. Classification without labels: learning from mixed samples in high energy physics

    NASA Astrophysics Data System (ADS)

    Metodiev, Eric M.; Nachman, Benjamin; Thaler, Jesse

    2017-10-01

    Modern machine learning techniques can be used to construct powerful models for difficult collider physics problems. In many applications, however, these models are trained on imperfect simulations due to a lack of truth-level information in the data, which risks the model learning artifacts of the simulation. In this paper, we introduce the paradigm of classification without labels (CWoLa) in which a classifier is trained to distinguish statistical mixtures of classes, which are common in collider physics. Crucially, neither individual labels nor class proportions are required, yet we prove that the optimal classifier in the CWoLa paradigm is also the optimal classifier in the traditional fully-supervised case where all label information is available. After demonstrating the power of this method in an analytical toy example, we consider a realistic benchmark for collider physics: distinguishing quark- versus gluon-initiated jets using mixed quark/gluon training samples. More generally, CWoLa can be applied to any classification problem where labels or class proportions are unknown or simulations are unreliable, but statistical mixtures of the classes are available.
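    The central CWoLa claim, that training on mixture labels recovers a classifier for the true classes, is easy to demonstrate on a toy problem. The sketch below uses 1-D Gaussians as stand-ins for the two jet classes and logistic regression as the learner; the mixture fractions and all parameters are illustrative assumptions, not the paper's benchmark.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two true classes (think quark vs gluon jets), modeled as 1-D Gaussians
    def sample(n, frac_sig):
        n_sig = int(n * frac_sig)
        sig = rng.normal(+1.0, 1.0, n_sig)       # "signal" class
        bkg = rng.normal(-1.0, 1.0, n - n_sig)   # "background" class
        return np.concatenate([sig, bkg])

    # CWoLa: label events only by which *mixture* they came from
    m1 = sample(5000, 0.8)   # mixture 1: 80% signal
    m2 = sample(5000, 0.2)   # mixture 2: 20% signal
    x = np.concatenate([m1, m2])
    t = np.concatenate([np.ones_like(m1), np.zeros_like(m2)])

    # Logistic regression trained on the mixture labels only
    w, b = 0.0, 0.0
    for _ in range(500):
        p = 1 / (1 + np.exp(-(w * x + b)))
        g = p - t
        w -= 0.1 * np.mean(g * x)
        b -= 0.1 * np.mean(g)

    # The mixture classifier is monotonically related to the true
    # signal/background likelihood ratio, so it separates the true classes
    sig_test = rng.normal(+1.0, 1.0, 2000)
    bkg_test = rng.normal(-1.0, 1.0, 2000)
    acc = np.mean(np.concatenate([w * sig_test + b > 0,
                                  w * bkg_test + b <= 0]))
    print(acc > 0.75)  # accuracy on the *true* labels, well above chance
    ```

    No per-event truth label was used in training, which is the point: only the two differently-composed mixtures were needed.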

  18. Neuroblastoma, a Paradigm for Big Data Science in Pediatric Oncology

    PubMed Central

    Salazar, Brittany M.; Balczewski, Emily A.; Ung, Choong Yong; Zhu, Shizhen

    2016-01-01

    Pediatric cancers rarely exhibit recurrent mutational events when compared to most adult cancers. This poses a challenge in understanding how cancers initiate, progress, and metastasize in early childhood. Also, due to limited detected driver mutations, it is difficult to benchmark key genes for drug development. In this review, we use neuroblastoma, a pediatric solid tumor of neural crest origin, as a paradigm for exploring “big data” applications in pediatric oncology. Computational strategies derived from big data science, namely network- and machine learning-based modeling and drug repositioning, hold the promise of shedding new light on the molecular mechanisms driving neuroblastoma pathogenesis and identifying potential therapeutics to combat this devastating disease. These strategies integrate robust data input, from genomic and transcriptomic studies, clinical data, and in vivo and in vitro experimental models specific to neuroblastoma and other types of cancers that closely mimic its biological characteristics. We discuss contexts in which “big data” and computational approaches, especially network-based modeling, may advance neuroblastoma research, describe currently available data and resources, and propose future models of strategic data collection and analyses for neuroblastoma and other related diseases. PMID:28035989

  19. Neuroblastoma, a Paradigm for Big Data Science in Pediatric Oncology.

    PubMed

    Salazar, Brittany M; Balczewski, Emily A; Ung, Choong Yong; Zhu, Shizhen

    2016-12-27

    Pediatric cancers rarely exhibit recurrent mutational events when compared to most adult cancers. This poses a challenge in understanding how cancers initiate, progress, and metastasize in early childhood. Also, due to limited detected driver mutations, it is difficult to benchmark key genes for drug development. In this review, we use neuroblastoma, a pediatric solid tumor of neural crest origin, as a paradigm for exploring "big data" applications in pediatric oncology. Computational strategies derived from big data science, namely network- and machine learning-based modeling and drug repositioning, hold the promise of shedding new light on the molecular mechanisms driving neuroblastoma pathogenesis and identifying potential therapeutics to combat this devastating disease. These strategies integrate robust data input, from genomic and transcriptomic studies, clinical data, and in vivo and in vitro experimental models specific to neuroblastoma and other types of cancers that closely mimic its biological characteristics. We discuss contexts in which "big data" and computational approaches, especially network-based modeling, may advance neuroblastoma research, describe currently available data and resources, and propose future models of strategic data collection and analyses for neuroblastoma and other related diseases.

  20. Classification without labels: learning from mixed samples in high energy physics

    DOE PAGES

    Metodiev, Eric M.; Nachman, Benjamin; Thaler, Jesse

    2017-10-25

    Modern machine learning techniques can be used to construct powerful models for difficult collider physics problems. In many applications, however, these models are trained on imperfect simulations due to a lack of truth-level information in the data, which risks the model learning artifacts of the simulation. In this paper, we introduce the paradigm of classification without labels (CWoLa) in which a classifier is trained to distinguish statistical mixtures of classes, which are common in collider physics. Crucially, neither individual labels nor class proportions are required, yet we prove that the optimal classifier in the CWoLa paradigm is also the optimal classifier in the traditional fully-supervised case where all label information is available. After demonstrating the power of this method in an analytical toy example, we consider a realistic benchmark for collider physics: distinguishing quark- versus gluon-initiated jets using mixed quark/gluon training samples. More generally, CWoLa can be applied to any classification problem where labels or class proportions are unknown or simulations are unreliable, but statistical mixtures of the classes are available.

  1. Classification without labels: learning from mixed samples in high energy physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Metodiev, Eric M.; Nachman, Benjamin; Thaler, Jesse

    Modern machine learning techniques can be used to construct powerful models for difficult collider physics problems. In many applications, however, these models are trained on imperfect simulations due to a lack of truth-level information in the data, which risks the model learning artifacts of the simulation. In this paper, we introduce the paradigm of classification without labels (CWoLa) in which a classifier is trained to distinguish statistical mixtures of classes, which are common in collider physics. Crucially, neither individual labels nor class proportions are required, yet we prove that the optimal classifier in the CWoLa paradigm is also the optimal classifier in the traditional fully-supervised case where all label information is available. After demonstrating the power of this method in an analytical toy example, we consider a realistic benchmark for collider physics: distinguishing quark- versus gluon-initiated jets using mixed quark/gluon training samples. More generally, CWoLa can be applied to any classification problem where labels or class proportions are unknown or simulations are unreliable, but statistical mixtures of the classes are available.

  2. The biological microprocessor, or how to build a computer with biological parts

    PubMed Central

    Moe-Behrens, Gerd HG

    2013-01-01

    Systemics, a revolutionary paradigm shift in scientific thinking with applications in systems biology and synthetic biology, has led to the idea of using silicon computers and their engineering principles as a blueprint for the engineering of a similar machine made from biological parts. Here we describe these building blocks and how they can be assembled into a general purpose computer system, a biological microprocessor. Such a system consists of biological parts building an input/output device, an arithmetic logic unit, a control unit, memory, and wires (busses) to interconnect these components. A biocomputer can be used to monitor and control a biological system. PMID:24688733

  3. An efficient 3-dim FFT for plane wave electronic structure calculations on massively parallel machines composed of multiprocessor nodes

    NASA Astrophysics Data System (ADS)

    Goedecker, Stefan; Boulet, Mireille; Deutsch, Thierry

    2003-08-01

    Three-dimensional Fast Fourier Transforms (FFTs) are the main computational task in plane wave electronic structure calculations. Obtaining high performance on large numbers of processors is non-trivial on the latest generation of parallel computers, which consist of nodes made up of shared memory multiprocessors. A non-dogmatic method for obtaining high performance for such 3-dim FFTs in a combined MPI/OpenMP programming paradigm is presented. Exploiting the peculiarities of plane wave electronic structure calculations, speedups of up to 160 and speeds of up to 130 Gflops were obtained on 256 processors.
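    The property such parallel 3-dim FFTs exploit is that a 3-D transform factors into independent 1-D transforms along each axis, so each MPI rank can transform its local slab of planes, with a data transpose between stages. The serial numpy check below illustrates only this factorization; the MPI/OpenMP distribution itself is not shown.

    ```python
    import numpy as np

    def fft3_by_axes(a):
        """3-D FFT as three successive passes of 1-D FFTs, one axis at a
        time: the structure that slab/pencil decompositions parallelize
        (each rank transforms its local planes, transposing between passes)."""
        for axis in range(3):
            a = np.fft.fft(a, axis=axis)
        return a

    rng = np.random.default_rng(0)
    a = rng.random((8, 8, 8))
    print(np.allclose(fft3_by_axes(a), np.fft.fftn(a)))  # → True
    ```

    Within each pass, the many independent 1-D transforms are the natural grain for OpenMP threading inside a shared-memory node.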

  4. Processing Solutions for Big Data in Astronomy

    NASA Astrophysics Data System (ADS)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, considered the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. The paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (the MapReduce parallel processing scheme) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
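    The MapReduce scheme mentioned above can be illustrated with the canonical word-count toy. The map, shuffle, and reduce phases below mimic what Hadoop distributes across a cluster, executed here in a single process for clarity.

    ```python
    from collections import defaultdict
    from itertools import chain

    def map_phase(doc):
        """Mapper: emit a (word, 1) pair for every word in one document."""
        return [(w.lower(), 1) for w in doc.split()]

    def shuffle(pairs):
        """Shuffle: group values by key (done by the framework in Hadoop)."""
        groups = defaultdict(list)
        for k, v in pairs:
            groups[k].append(v)
        return groups

    def reduce_phase(groups):
        """Reducer: aggregate the value list of each key."""
        return {k: sum(vs) for k, vs in groups.items()}

    docs = ["big data big paradigm", "data processing"]
    counts = reduce_phase(shuffle(chain.from_iterable(map(map_phase, docs))))
    print(counts["big"], counts["data"])  # → 2 2
    ```

    In a real cluster the mappers run where the input blocks live and reducers each own a partition of the key space; the program structure is unchanged.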

  5. Artificial Intelligence approaches in hematopoietic cell transplant: A review of the current status and future directions.

    PubMed

    Muhsen, Ibrahim N; ElHassan, Tusneem; Hashmi, Shahrukh K

    2018-06-08

    Currently, the evidence-based literature on healthcare is expanding exponentially. The opportunities provided by the advancement of artificial intelligence (AI) tools, i.e. machine learning, are appealing for tackling many current healthcare challenges. Thus, AI integration is expanding in most fields of healthcare, including hematology. This study aims to review the current applications of AI in the field of hematopoietic cell transplant (HCT). The literature search involved the following databases: Ovid-Medline, including In-Process and Other Non-Indexed Citations, and Google Scholar. The abstracts of the following professional societies were also screened: the American Society of Hematology (ASH), the American Society for Blood and Marrow Transplantation (ASBMT), and the European Society for Blood and Marrow Transplantation (EBMT). The literature review showed that the integration of AI in the field of HCT has grown remarkably in the last decade and offers promising avenues for diagnosis and prognosis within HCT populations, targeting both pre- and post-transplant challenges. Studies on AI integration in HCT have many limitations, including poorly tested algorithms, lack of generalizability, and limited use of different AI tools. Machine learning in HCT is an intense area of research that needs substantial development and extensive support from hematology and HCT societies/organizations globally, since we believe this will be the future practice paradigm. Key words: Artificial intelligence, machine learning, hematopoietic cell transplant.

  6. A Practical Framework Toward Prediction of Breaking Force and Disintegration of Tablet Formulations Using Machine Learning Tools.

    PubMed

    Akseli, Ilgaz; Xie, Jingjin; Schultz, Leon; Ladyzhynsky, Nadia; Bramante, Tommasina; He, Xiaorong; Deanne, Rich; Horspool, Keith R; Schwabe, Robert

    2017-01-01

    Enabling the paradigm of quality by design requires the ability to quantitatively correlate material properties and process variables to measurable product performance attributes. Conventional, quality-by-test methods for determining tablet breaking force and disintegration time usually involve destructive tests, which consume a significant amount of time and labor and provide limited information. Recent advances in material characterization, statistical analysis, and machine learning have provided multiple tools that have the potential to develop nondestructive, fast, and accurate approaches in drug product development. In this work, a methodology to predict the breaking force and disintegration time of tablet formulations using nondestructive ultrasonics and machine learning tools was developed. The input variables to the model include intrinsic properties of the formulation and extrinsic process variables influencing the tablet during manufacturing. The model has been applied to predict breaking force and disintegration time using small quantities of active pharmaceutical ingredient and prototype formulation designs. The novel approach presented is a step forward toward rational design of a robust drug product based on insight into the performance of common materials during formulation and process development. It may also help expedite the drug product development timeline and reduce active pharmaceutical ingredient usage while improving efficiency of the overall process. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
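    As a hedged caricature of such a quantitative correlation between material/process inputs and a performance attribute, the sketch below fits a linear surrogate from fabricated inputs to a synthetic breaking force. The feature names (compaction pressure, relative density, ultrasonic velocity) are stand-ins, the data is synthetic, and the linear fit illustrates only the workflow, not the paper's model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Fabricated stand-ins for formulation/process inputs
    n = 150
    X = np.column_stack([rng.uniform(100, 300, n),     # compaction pressure, MPa
                         rng.uniform(0.80, 0.98, n),   # relative density
                         rng.uniform(900, 1600, n)])   # ultrasonic velocity, m/s
    true_w = np.array([0.4, 120.0, 0.05])
    y = X @ true_w + rng.normal(0, 2.0, n)             # breaking force, N

    # Fit a linear surrogate by least squares; a nonlinear learner could
    # replace this step without changing the overall workflow.
    A = np.column_stack([X, np.ones(n)])               # add intercept column
    w, *_ = np.linalg.lstsq(A, y, rcond=None)

    pred = A @ w
    r2 = 1 - np.sum((y - pred)**2) / np.sum((y - y.mean())**2)
    print(r2 > 0.95)
    ```

    Once fitted on destructive reference measurements, such a surrogate lets nondestructive inputs predict the attribute for new tablets.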

  7. Architectures for reasoning in parallel

    NASA Technical Reports Server (NTRS)

    Hall, Lawrence O.

    1989-01-01

    The research conducted has dealt with rule-based expert systems, investigating algorithms that may lead to their effective parallelization. Both the forward and backward chained control paradigms were investigated in the course of this work, along with the best computer architecture for the developed algorithms. Two experimental vehicles were developed to facilitate this research: Backpac, a parallel backward chained rule-based reasoning system, and Datapac, a parallel forward chained rule-based reasoning system. Both systems have been written in Multilisp, a version of Lisp which contains the parallel construct future. Applying future to a function causes that function to become a task running in parallel with the spawning task. Additionally, Backpac and Datapac have been run on several disparate parallel processors: an Encore Multimax with 10 processors, the Concert Multiprocessor with 64 processors, and a 32-processor BBN GP1000. Both the Concert and the GP1000 are switch-based machines; the Multimax has all its processors hung off a common bus. All are shared memory machines, but they have different schemes for sharing the memory and different locales for the shared memory. The main results of the investigations come from experiments on the 10-processor Encore and on the Concert with partitions of 32 or fewer processors. Additionally, experiments have been run with a stripped-down version of EMYCIN.
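    Multilisp's future construct, which turns a function application into a task running in parallel with its spawner, has a close analog in modern futures libraries. The sketch below mimics the idea in Python; fire_rule is a hypothetical stand-in for evaluating one rule against the working memory, not anything from Backpac or Datapac.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def fire_rule(rule_id, facts):
        """Hypothetical stand-in for evaluating one rule against the facts."""
        return rule_id, sum(facts) % (rule_id + 2)

    facts = list(range(10))
    with ThreadPoolExecutor(max_workers=4) as pool:
        # submit() returns immediately with a future, as Multilisp's
        # `future` does when it spawns the applied function as a task
        futures = [pool.submit(fire_rule, r, facts) for r in range(8)]
        # .result() blocks only when the value is demanded, mirroring how
        # touching an undetermined future suspends the touching task
        results = dict(f.result() for f in futures)

    print(len(results))  # → 8
    ```

    Because rule firings are independent here, the pool can evaluate them concurrently; conflict resolution over the gathered results stays sequential, as in a forward-chaining match-resolve-act cycle.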

  8. The Quantum Measurement Problem and Physical reality: A Computation Theoretic Perspective

    NASA Astrophysics Data System (ADS)

    Srikanth, R.

    2006-11-01

    Is the universe computable? If yes, is it computationally a polynomial place? In standard quantum mechanics, which permits infinite parallelism and the infinitely precise specification of states, a negative answer to both questions is not ruled out. On the other hand, empirical evidence suggests that NP-complete problems are intractable in the physical world. Likewise, computational problems known to be algorithmically uncomputable do not seem to be computable by any physical means. We suggest that this close correspondence between the efficiency and power of abstract algorithms on the one hand, and physical computers on the other, finds a natural explanation if the universe is assumed to be algorithmic; that is, that physical reality is the product of discrete sub-physical information processing equivalent to the actions of a probabilistic Turing machine. This assumption can be reconciled with the observed exponentiality of quantum systems at microscopic scales, and the consequent possibility of implementing Shor's quantum polynomial time algorithm at that scale, provided the degree of superposition is intrinsically, finitely upper-bounded. If this bound is associated with the quantum-classical divide (the Heisenberg cut), a natural resolution to the quantum measurement problem arises. From this viewpoint, macroscopic classicality is an evidence that the universe is in BPP, and both questions raised above receive affirmative answers. A recently proposed computational model of quantum measurement, which relates the Heisenberg cut to the discreteness of Hilbert space, is briefly discussed. A connection to quantum gravity is noted. Our results are compatible with the philosophy that mathematical truths are independent of the laws of physics.

  9. Broca's arrow: evolution, prediction, and language in the brain.

    PubMed

    Cooper, David L

    2006-01-01

    Brodmann's areas 44 and 45 in the human brain, also known as Broca's area, have long been associated with language functions, especially in the left hemisphere. However, the precise role Broca's area plays in human language has not been established with certainty. Broca's area has homologs in the great apes and in area F5 in monkeys, which suggests that its original function was not linguistic at all. In fact, great ape and hominid brains show very similar left-over-right asymmetries in Broca's area homologs as well as in other areas, such as homologs to Wernicke's area, that are normally associated with language in modern humans. Moreover, the so-called mirror neurons are located in Broca's area in great apes and area F5 in monkeys, which seem to provide a representation of cause and effect in a primate's environment, particularly its social environment. Humans appear to have these mirror neurons in Broca's area as well. Similarly, genetic evidence related to the FOXP2 gene implicates Broca's area in linguistic function and dysfunction, but the gene itself is a highly conserved developmental gene in vertebrates and is shared with only two or three differences between humans and great apes, five between humans and mice, and eight between humans and songbirds. Taking neurons and portions of the brain as discrete computational segments in the sense of constituting specific Turing machines, this evidence points to a predictive motor and conceptual function for Broca's area in primates, especially for social concepts. In human language, this is consistent with evidence from typological and cognitive linguistics. (c) 2006 Wiley-Liss, Inc.

  10. "Neuro-semeiotics" and "free-energy minimization" suggest a unified perspective for integrative brain actions: focus on receptor heteromers and Roamer type of volume transmission.

    PubMed

    Agnati, Luigi F; Guidolin, Diego; Marcoli, Manuela; Genedani, Susanna; Borroto-Escuela, Dasiel; Maura, Guido; Fuxe, Kjell

    2014-01-01

    Two far-reaching theoretical approaches, namely "Neuro-semeiotics" (NS) and "Free-energy Minimization" (FEM), have been recently proposed as frames within which to put forward heuristic hypotheses on integrative brain actions. In the present paper these two theoretical approaches are briefly discussed in the perspective of a recent model of brain architecture and information handling based on what we suggest calling Jacob's tinkering principle, whereby "to create is to recombine!". The NS and FEM theoretical approaches will be discussed from the perspective both of the Roamer-Type Volume Transmission (especially exosome-mediated) of intercellular communication and of the impact of receptor oligomers and Receptor-Receptor Interactions (RRIs) on signal recognition/decoding processes. In particular, the Bio-semeiotics concept of "adaptor" will be used to analyze RRIs as an important feature of NS. Furthermore, the concept of phenotypic plasticity of cells will be introduced in view of the demonstration of the possible transfer of receptors (i.e., adaptors) into a computational network via exosomes (see also Appendix). Thus, Jacob's tinkering principle will be proposed as a theoretical basis for some learning processes both at the network level (Turing-like type of machine) and at the molecular level as a consequence of both the plastic changes in the adaptors caused by the allosteric interactions in the receptor oligomers and the intercellular transfer of receptors. Finally, on the basis of NS and FEM theories, a unified perspective for integrative brain actions will be proposed.

  11. The improvement of surface roughness for OAP aluminum mirrors: from terahertz to ultraviolet

    NASA Astrophysics Data System (ADS)

    Peng, Jilong; Yu, Qian; Shao, Yajun; Wang, Dong; Yi, Zhong; Wang, Shanshan

    2018-01-01

    Aluminum reflectors, especially OAP (off-axis parabolic) reflectors, are widely used in terahertz and infrared systems for their low cost, light weight, good machinability, small size, and simple structure, and because they share the thermal expansion and contraction of the system structure, which gives them wide temperature adaptability. Thorlabs, Daheng, and other large optical-component companies even sell aluminum OAPs off the shelf. Most precision aluminum OAPs are fabricated by SPDT (single point diamond turning). Affected by intermittent shock, the roughness of aluminum OAP mirrors produced on conventional single-point diamond lathes is around 7 nm, which limits the scope of application for aluminum mirrors, for example in high-power-density terahertz/infrared systems and in visible/UV optical systems. In this paper, a continuous-process fixture is proposed which effectively reduces the influence of turning impact on mirror roughness. Using this process, an off-axis parabolic aluminum reflector with an effective diameter of 50 mm and an off-axis angle of 90 degrees is fabricated, and its performance is validated. Measured with a VEECO NT1100 optical profiler using a 20× objective, the surface roughness reaches 2.3 nm, and the surface figure error is within λ/7 RMS (λ = 632.8 nm), tested with a FISBA μPhase laser interferometer against a standard flat mirror. These specifications are close to those of traditional glass-based reflectors, and make it possible to use aluminum reflectors in higher-LIDT (laser induced damage threshold) systems and even in vacuum-ultraviolet ionospheric micro-sensors for micro/nano satellites.
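
    The λ/7 RMS figure spec quoted above is straightforward to unpack: at the HeNe test wavelength it corresponds to about 90.4 nm of RMS height deviation. A minimal sketch of the arithmetic and of an RMS computation over a height map follows; real interferometer software also removes piston/tilt (and often low-order Zernike terms) before reporting figure error, which this toy function does not do:

    ```python
    import math

    LAMBDA_NM = 632.8            # HeNe test wavelength cited in the abstract

    def rms_nm(heights_nm):
        """RMS deviation of sampled surface heights from their mean, in nm.
        Illustrative only: no tilt removal or windowing is performed."""
        m = sum(heights_nm) / len(heights_nm)
        return math.sqrt(sum((h - m) ** 2 for h in heights_nm) / len(heights_nm))

    spec_nm = LAMBDA_NM / 7      # lambda/7 RMS spec: about 90.4 nm
    ```

    For comparison, the 2.3 nm roughness figure is a much finer-scale quantity measured over the profiler's small field, not the full-aperture figure error.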

  12. Inorganic Composite Materials in Japan: Status and Trends

    DTIC Science & Technology

    1989-11-01

    is planned ... (have already done some preliminary work) ... more say by engineers and scientists and less on titanium and aluminide matrix composites ... structural reliability of the components ... continued research in elevated temperature fiber and ceramic matrix composites ... Moving Blade (FRP ... Forming, Kawasaki Heavy Ind ... with regard to these program target goals ... for carbon (CF), SiC, and boron filaments in isotropic titanium

  13. A Segment Level Study of Defense Industry Capital Investment.

    DTIC Science & Technology

    1985-12-01

    examines the economic factors that are influential for encouraging capital investment in the defense industry. A group of candidate variables are ... and disguises useful information that is available from the data. The reduced average data for the first year group included 130 segments from 34 ... individually against capital expenditures, using the averaged data for the same first year group, the coefficient was positive. This indicates

  14. Replication of Japanese Encephalitis Virus.

    DTIC Science & Technology

    1980-12-10

    persistently infected with JEV were studied. Over 200 cells were cloned from these cultures and all but four were nonproducers of infectious virus and viral ... obtained for release of interfering particles by persistently infected cultures and clones, no new size classes of virus RNA could be demonstrated. ... denaturing or non-denaturing conditions. Both virus producer and non-producer cell clones were examined, and whether superinfected or not, they

  15. Improved Robustness and Efficiency for Automatic Visual Site Monitoring

    DTIC Science & Technology

    2009-09-01

    the space of expected poses. To avoid having to compare each test window with the whole training corpus, he builds a template hierarchy by ... directions of motion. In a second layer of clustering, it also learns how the low-level clusters co-occur with each other. An infinite mixture model is used ... implementation. We demonstrate the utility of this detector by modeling scene-level activities with a Hierarchical

  16. The Design and Testing of a High-Temperature Graphite Dilatometer

    DTIC Science & Technology

    1992-06-24

    11. Data from three-point-bend samples of PAA, phenolic, and furfural resin samples that were ... TEMPERATURE (°C) ... Fig. 11. Data from three-point-bend samples of (a) PAA, (b) phenolic, and (c) furfural resin samples that were precured to 350°C. The maximum ... graphitization temperatures (2000°C); and furfural resin carbon absorbs less at all temperatures. V. CONCLUSIONS The dilatometer system described

  17. Earthquake Response of Concrete Gravity Dams Including Hydrodynamic and Foundation Interaction Effects,

    DTIC Science & Technology

    1980-01-01

    standard procedure for analysis of all types of civil engineering structures. Early in its development, it became apparent that this method had ... unique potentialities in the evaluation of stress in dams, and many of its earliest civil engineering applications concerned special problems associated ... with such structures [3,4]. The earliest dynamic finite element analyses of civil engineering structures involved the earthquake response analysis of

  18. European Scientific Notes, Volume 38, Number 9.

    DTIC Science & Technology

    1984-09-01

    dropped automatically from the mailing list. ESN Invites Letters to the Editor: ESN publishes selected letters related to developments and policy in ... a selective summary can be extracted from the Idzikowski-Baddeley literature review; it appears in ... examine trait anxiety or state-trait interactions ... mutism, and stupor are not seen in fliers as they are in ground soldiers. Reid 1945 WW II - navigation errors increased over enemy coast ... bomber errors

  19. The Shock and Vibration Digest. Volume 18, Number 1

    DTIC Science & Technology

    1986-01-01

    polyurethanes reduced the loss factor and storage modulus by increasing the length of ... emphasized the correlation between molecular structure and ... one temperature/frequency range is difficult with copolymers ... static deformations. He gave storage and loss moduli for a carbon black filled and an ... has been described (18). The shear loss and storage moduli of a void-filled polyurethane ... author states that the frequency dependence of the elastomers

  20. Combining Architecture-Centric Engineering with the Team Software Process

    DTIC Science & Technology

    2010-12-01

    colleagues from Quarksoft and CIMAT have recently reported on their experiences in “Introducing Software Architecture Development Methods into a TSP ... Postmortem Lessons, new goals, new requirements, new risks, etc. Business and technical goals; estimates, plans, process, commitment; work products ... architecture to mitigate the risks uncovered by the ATAM. At the end of the iteration, version 1.0 of the architecture is available. Implement a second
