Sample records for memory neural network

  1. Dynamic Neural Networks Supporting Memory Retrieval

    PubMed Central

    St. Jacques, Peggy L.; Kragel, Philip A.; Rubin, David C.

    2011-01-01

    How do separate neural networks interact to support complex cognitive processes such as remembrance of the personal past? Autobiographical memory (AM) retrieval recruits a consistent pattern of activation that potentially comprises multiple neural networks. However, it is unclear how such large-scale neural networks interact and are modulated by properties of the memory retrieval process. In the present functional MRI (fMRI) study, we combined independent component analysis (ICA) and dynamic causal modeling (DCM) to understand the neural networks supporting AM retrieval. ICA revealed four task-related components consistent with the previous literature: 1) Medial Prefrontal Cortex (PFC) Network, associated with self-referential processes, 2) Medial Temporal Lobe (MTL) Network, associated with memory, 3) Frontoparietal Network, associated with strategic search, and 4) Cingulooperculum Network, associated with goal maintenance. DCM analysis revealed that the medial PFC network drove activation within the system, consistent with the importance of this network to AM retrieval. Additionally, memory accessibility and recollection uniquely altered connectivity between these neural networks. Recollection modulated the influence of the medial PFC on the MTL network during elaboration, suggesting that greater connectivity among subsystems of the default network supports greater re-experience. In contrast, memory accessibility modulated the influence of frontoparietal and MTL networks on the medial PFC network, suggesting that ease of retrieval involves greater fluency among the multiple networks contributing to AM. These results show the integration between neural networks supporting AM retrieval and the modulation of network connectivity by behavior. PMID:21550407

  2. Long-Term Memory Stabilized by Noise-Induced Rehearsal

    PubMed Central

    Wei, Yi

    2014-01-01

    Cortical networks can maintain memories for decades despite the short lifetime of synaptic strengths. Can a neural network store long-lasting memories in unstable synapses? Here, we study the effects of ongoing spike-timing-dependent plasticity (STDP) on the stability of memory patterns stored in synapses of an attractor neural network. We show that certain classes of STDP rules can stabilize all stored memory patterns despite a short lifetime of synapses. In our model, unstructured neural noise, after passing through the recurrent network connections, carries the imprint of all memory patterns in temporal correlations. STDP, combined with these correlations, leads to reinforcement of all stored patterns, even those that are never explicitly visited. Our findings may provide the functional reason for irregular spiking displayed by cortical neurons and justify models of system memory consolidation. Therefore, we propose that irregular neural activity is the feature that helps cortical networks maintain stable connections. PMID:25411507
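The noise-rehearsal mechanism described above can be sketched in a few lines of numpy (an illustrative sketch, not the authors' model; network size, pattern count, and sample count are arbitrary choices): white noise filtered through Hebbian recurrent weights acquires temporal correlations whose leading components lie in the span of the stored patterns, which is exactly what an STDP-like Hebbian update would then reinforce.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 50, 2                               # neurons, stored patterns (illustrative)
xi = rng.choice([-1.0, 1.0], size=(P, N))  # hypothetical memory patterns
W = (xi.T @ xi) / N                        # Hebbian recurrent weights

# Pass unstructured white noise through the recurrent connections:
# the output covariance ~ W W^T, so it carries the imprint of all patterns.
noise = rng.standard_normal((20000, N))
y = noise @ W.T
C = np.cov(y.T)

# The leading eigenvectors of the noise covariance lie in span(xi):
# a Hebbian/STDP-like update dW ~ <y y^T> would therefore reinforce
# every stored pattern, even ones never explicitly recalled.
eigvals, eigvecs = np.linalg.eigh(C)
v = eigvecs[:, -1]                 # top eigenvector of the noise covariance
Q, _ = np.linalg.qr(xi.T)          # orthonormal basis of the pattern subspace
proj = np.linalg.norm(Q.T @ v)     # fraction of v lying inside the pattern span
print(round(proj, 2))              # close to 1: noise "rehearses" the patterns
```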

  3. A New Local Bipolar Autoassociative Memory Based on External Inputs of Discrete Recurrent Neural Networks With Time Delay.

    PubMed

    Zhou, Caigen; Zeng, Xiaoqin; Luo, Chaomin; Zhang, Huaguang

    In this paper, local bipolar auto-associative memories are presented based on discrete recurrent neural networks with a class of gain type activation function. The weight parameters of neural networks are acquired by a set of inequalities without the learning procedure. The global exponential stability criteria are established to ensure the accuracy of the restored patterns by considering time delays and external inputs. The proposed methodology is capable of effectively overcoming spurious memory patterns and achieving memory capacity. The effectiveness, robustness, and fault-tolerant capability are validated by simulated experiments.

  4. Set selection dynamical system neural networks with partial memories, with applications to Sudoku and KenKen puzzles.

    PubMed

    Boreland, B; Clement, G; Kunze, H

    2015-08-01

    After reviewing set selection and memory model dynamical system neural networks, we introduce a neural network model that combines set selection with partial memories (stored memories on subsets of states in the network). We establish that feasible equilibria with all states equal to ± 1 correspond to answers to a particular set theoretic problem. We show that KenKen puzzles can be formulated as a particular case of this set theoretic problem and use the neural network model to solve them; in addition, we use a similar approach to solve Sudoku. We illustrate the approach in examples. As a heuristic experiment, we use online or print resources to identify the difficulty of the puzzles and compare these difficulties to the number of iterations used by the appropriate neural network solver, finding a strong relationship.

  5. Improvement of the Hopfield Neural Network by MC-Adaptation Rule

    NASA Astrophysics Data System (ADS)

    Zhou, Zhen; Zhao, Hong

    2006-06-01

    We show that the performance of Hopfield neural networks, especially the quality of recall and the effective storage capacity, can be greatly improved by making use of a recently presented neural-network design method, without altering the overall structure of the network. In the improved network, a memory pattern is recalled exactly from initial states having a given degree of similarity with that pattern, so one can avoid applying the overlap criterion used in standard Hopfield networks.
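The recall-from-similar-states setting above can be made concrete with a minimal Hopfield sketch (standard Hebbian storage and sign-unit updates, not the MC-adaptation rule of the paper; sizes and the number of flipped bits are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 100, 2                              # neurons, stored patterns (illustrative)
xi = rng.choice([-1.0, 1.0], size=(P, N))  # random bipolar memories
W = (xi.T @ xi) / N                        # Hebbian weight matrix
np.fill_diagonal(W, 0.0)                   # no self-connections

# Start from a state with a given degree of similarity: pattern 0 with 10 bits flipped.
state = xi[0].copy()
flip = rng.choice(N, size=10, replace=False)
state[flip] *= -1

for _ in range(10):                        # synchronous recall updates
    state = np.sign(W @ state)
    state[state == 0] = 1.0                # break ties toward +1

overlap = float(state @ xi[0]) / N         # overlap of 1 means exact recall
print(overlap)
```

At this low memory load the corrupted state falls well inside the basin of attraction, so the overlap returns to (near) 1.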

  6. Long-term memory stabilized by noise-induced rehearsal.

    PubMed

    Wei, Yi; Koulakov, Alexei A

    2014-11-19

    Cortical networks can maintain memories for decades despite the short lifetime of synaptic strengths. Can a neural network store long-lasting memories in unstable synapses? Here, we study the effects of ongoing spike-timing-dependent plasticity (STDP) on the stability of memory patterns stored in synapses of an attractor neural network. We show that certain classes of STDP rules can stabilize all stored memory patterns despite a short lifetime of synapses. In our model, unstructured neural noise, after passing through the recurrent network connections, carries the imprint of all memory patterns in temporal correlations. STDP, combined with these correlations, leads to reinforcement of all stored patterns, even those that are never explicitly visited. Our findings may provide the functional reason for irregular spiking displayed by cortical neurons and justify models of system memory consolidation. Therefore, we propose that irregular neural activity is the feature that helps cortical networks maintain stable connections.

  7. Electronic implementation of associative memory based on neural network models

    NASA Technical Reports Server (NTRS)

    Moopenn, A.; Lambe, John; Thakoor, A. P.

    1987-01-01

    An electronic embodiment of a neural network based associative memory in the form of a binary connection matrix is described. The nature of false memory errors, their effect on the information storage capacity of binary connection matrix memories, and a novel technique to eliminate such errors with the help of asymmetrical extra connections are discussed. The stability of the matrix memory system incorporating a unique local inhibition scheme is analyzed in terms of local minimization of an energy function. The memory's stability, dynamic behavior, and recall capability are investigated using a 32-'neuron' electronic neural network memory with a 1024-element programmable binary connection matrix.

  8. Electronic device aspects of neural network memories

    NASA Technical Reports Server (NTRS)

    Lambe, J.; Moopenn, A.; Thakoor, A. P.

    1985-01-01

    The basic issues related to the electronic implementation of the neural network model (NNM) for content addressable memories are examined. A brief introduction to the principles of the NNM is followed by an analysis of the information storage of the neural network in the form of a binary connection matrix and the recall capability of such matrix memories based on a hardware simulation study. In addition, materials and device architecture issues involved in the future realization of such networks in VLSI-compatible ultrahigh-density memories are considered. A possible space application of such devices would be in the area of large-scale information storage without mechanical devices.

  9. Two Unipolar Terminal-Attractor-Based Associative Memories

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang; Wu, Chwan-Hwa

    1995-01-01

    Two unipolar mathematical models of electronic neural networks functioning as terminal-attractor-based associative memories (TABAMs) have been developed. The models comprise sets of equations describing interactions between the time-varying inputs and outputs of a neural-network memory regarded as a dynamical system. They simplify the design and operation of an optoelectronic processor that implements a TABAM performing associative recall of images. The TABAM concept is described in "Optoelectronic Terminal-Attractor-Based Associative Memory" (NPO-18790); an experimental optoelectronic apparatus that performed associative recall of binary images is described in "Optoelectronic Inner-Product Neural Associative Memory" (NPO-18491).

  10. Hybrid computing using a neural network with dynamic external memory.

    PubMed

    Graves, Alex; Wayne, Greg; Reynolds, Malcolm; Harley, Tim; Danihelka, Ivo; Grabska-Barwińska, Agnieszka; Colmenarejo, Sergio Gómez; Grefenstette, Edward; Ramalho, Tiago; Agapiou, John; Badia, Adrià Puigdomènech; Hermann, Karl Moritz; Zwols, Yori; Ostrovski, Georg; Cain, Adam; King, Helen; Summerfield, Christopher; Blunsom, Phil; Kavukcuoglu, Koray; Hassabis, Demis

    2016-10-27

    Artificial neural networks are remarkably adept at sensory processing, sequence learning and reinforcement learning, but are limited in their ability to represent variables and data structures and to store data over long timescales, owing to the lack of an external memory. Here we introduce a machine learning model called a differentiable neural computer (DNC), which consists of a neural network that can read from and write to an external memory matrix, analogous to the random-access memory in a conventional computer. Like a conventional computer, it can use its memory to represent and manipulate complex data structures, but, like a neural network, it can learn to do so from data. When trained with supervised learning, we demonstrate that a DNC can successfully answer synthetic questions designed to emulate reasoning and inference problems in natural language. We show that it can learn tasks such as finding the shortest path between specified points and inferring the missing links in randomly generated graphs, and then generalize these tasks to specific graphs such as transport networks and family trees. When trained with reinforcement learning, a DNC can complete a moving blocks puzzle in which changing goals are specified by sequences of symbols. Taken together, our results demonstrate that DNCs have the capacity to solve complex, structured tasks that are inaccessible to neural networks without external read-write memory.
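The external-memory read at the heart of the model above can be illustrated with a content-based addressing sketch: cosine similarity between a key and each memory row, sharpened by a softmax, yields differentiable read weights. This is a simplified illustration of DNC-style reads, not the published architecture (the sharpness `beta` and the toy memory are arbitrary):

```python
import numpy as np

def content_read(memory, key, beta=10.0):
    """Content-based addressing over an external memory matrix:
    cosine similarity -> softmax read weights -> weighted read vector."""
    sim = memory @ key / (np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8)
    w = np.exp(beta * sim)
    w /= w.sum()                  # differentiable read weights over memory rows
    return w @ memory             # read vector: soft mixture of memory rows

M = np.array([[1.0, 0.0, 0.0],   # toy memory matrix, one item per row
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
r = content_read(M, np.array([0.9, 0.1, 0.0]))
print(np.argmax(r))              # the read is dominated by the best-matching row
```

Because every step is smooth, gradients flow through the read weights, which is what lets the controller network learn to use the memory from data.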

  11. Enhanced storage capacity with errors in scale-free Hopfield neural networks: An analytical study.

    PubMed

    Kim, Do-Hyun; Park, Jinha; Kahng, Byungnam

    2017-01-01

    The Hopfield model is a pioneering neural network model with associative memory retrieval. The analytical solution of the model in the mean-field limit revealed that memories can be retrieved without any error up to a finite storage capacity of O(N), where N is the system size. Beyond this threshold, they are completely lost. Since the introduction of the Hopfield model, the theory of neural networks has been further developed toward realistic neural networks using analog neurons, spiking neurons, etc. Nevertheless, those advances are based on fully connected networks, which are inconsistent with the recent experimental discovery that the number of connections of each neuron appears to be heterogeneous, following a heavy-tailed distribution. Motivated by this observation, we consider the Hopfield model on scale-free networks and obtain a pattern of associative memory retrieval different from that on the fully connected network: the storage capacity becomes tremendously enhanced, but with some error in the memory retrieval, which appears as the heterogeneity of the connections is increased. Moreover, the error rates obtained on several real neural networks are indeed similar to those on scale-free model networks.

  12. Multistability in bidirectional associative memory neural networks

    NASA Astrophysics Data System (ADS)

    Huang, Gan; Cao, Jinde

    2008-04-01

    In this Letter, the multistability issue is studied for bidirectional associative memory (BAM) neural networks. Based on the existence and stability analysis of the networks with or without delay, it is found that 2n-dimensional networks can have 3^n equilibria, 2^n of which are locally exponentially stable, where each layer of the BAM network has n neurons. Furthermore, the results have been extended to (n+m)-dimensional BAM neural networks, where there are n and m neurons on the two layers, respectively. Finally, two numerical examples are presented to illustrate the validity of our results.
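The BAM dynamics underlying these results can be sketched with standard Hebbian inter-layer weights and bipolar units bouncing between the two layers until a stored pair is reached (an illustrative sketch with hand-picked patterns; the delay and stability analysis of the Letter is not reproduced):

```python
import numpy as np

def sgn(v):
    # sign function with sgn(0) = +1 so states stay bipolar
    return np.where(v >= 0, 1.0, -1.0)

# Two layers with n=4 and m=3 neurons; store two bipolar pattern pairs.
X = np.array([[ 1,  1, -1, -1],
              [ 1, -1,  1, -1]], dtype=float)
Y = np.array([[ 1, -1,  1],
              [-1,  1,  1]], dtype=float)
W = X.T @ Y                        # Hebbian inter-layer weights (n x m)

def bam_recall(x, steps=5):
    y = sgn(x @ W)                 # forward pass to the Y layer
    for _ in range(steps):         # bounce between layers until stable
        x = sgn(y @ W.T)
        y = sgn(x @ W)
    return x, y

x0 = np.array([1.0, 1.0, -1.0, 1.0])   # X-pattern 0 with one bit flipped
x_rec, y_rec = bam_recall(x0)
print(x_rec, y_rec)                # settles on the stored pair (X[0], Y[0])
```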

  13. Properties of a memory network in psychology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wedemann, Roseli S.; Donangelo, Raul; Carvalho, Luis A. V. de

    We have previously described neurotic psychopathology and psychoanalytic working-through by an associative memory mechanism, based on a neural network model, where memory was modelled by a Boltzmann machine (BM). Since brain neural topology is selectively structured, we simulated known microscopic mechanisms that control synaptic properties, showing that the network self-organizes to a hierarchical, clustered structure. Here, we show some statistical mechanical properties of the complex networks which result from this self-organization. They indicate that a generalization of the BM may be necessary to model memory.

  14. Properties of a memory network in psychology

    NASA Astrophysics Data System (ADS)

    Wedemann, Roseli S.; Donangelo, Raul; de Carvalho, Luís A. V.

    2007-12-01

    We have previously described neurotic psychopathology and psychoanalytic working-through by an associative memory mechanism, based on a neural network model, where memory was modelled by a Boltzmann machine (BM). Since brain neural topology is selectively structured, we simulated known microscopic mechanisms that control synaptic properties, showing that the network self-organizes to a hierarchical, clustered structure. Here, we show some statistical mechanical properties of the complex networks which result from this self-organization. They indicate that a generalization of the BM may be necessary to model memory.

  15. Stability analysis for stochastic BAM nonlinear neural network with delays

    NASA Astrophysics Data System (ADS)

    Lv, Z. W.; Shu, H. S.; Wei, G. L.

    2008-02-01

    In this paper, stochastic bidirectional associative memory neural networks with constant or time-varying delays are considered. Based on a Lyapunov-Krasovskii functional and stochastic stability analysis theory, we derive several sufficient conditions that guarantee global asymptotic stability in the mean square. Our investigation shows that the stochastic bidirectional associative memory neural networks are globally asymptotically stable in the mean square if certain linear matrix inequalities (LMIs) admit solutions. Hence, the global asymptotic stability of these networks can easily be checked with the Matlab LMI toolbox. A numerical example is given to demonstrate the usefulness of the proposed global asymptotic stability criteria.

  16. Spiking neural network simulation: memory-optimal synaptic event scheduling.

    PubMed

    Stewart, Robert D; Gurney, Kevin N

    2011-06-01

    Spiking neural network simulations incorporating variable transmission delays require synaptic events to be scheduled prior to delivery. Conventional methods have memory requirements that scale with the total number of synapses in a network. We introduce novel scheduling algorithms for both discrete and continuous event delivery, where the memory requirement scales instead with the number of neurons. Superior algorithmic performance is demonstrated using large-scale, benchmarking network simulations.
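The neuron-scaling idea can be sketched with per-neuron ring buffers: each neuron keeps one accumulator slot per possible delay, so scheduled events cost O(neurons x max_delay) memory rather than O(synapses). This is an illustrative sketch of the general approach, not the authors' algorithms:

```python
class DelayedDelivery:
    """Per-neuron ring buffers for delayed synaptic events:
    memory scales with n_neurons * (max_delay + 1), not with synapse count."""
    def __init__(self, n_neurons, max_delay):
        self.D = max_delay + 1
        self.buf = [[0.0] * self.D for _ in range(n_neurons)]
        self.t = 0

    def schedule(self, post, weight, delay):
        # accumulate the weight into the slot it will be read at
        self.buf[post][(self.t + delay) % self.D] += weight

    def step(self):
        # deliver everything scheduled for the current timestep
        out = [b[self.t % self.D] for b in self.buf]
        for b in self.buf:
            b[self.t % self.D] = 0.0   # clear the slot for reuse
        self.t += 1
        return out

net = DelayedDelivery(n_neurons=3, max_delay=4)
net.schedule(post=0, weight=0.5, delay=2)   # spike at t=0, arrives at t=2
net.schedule(post=2, weight=1.0, delay=2)
net.step()                                  # t=0: nothing due yet
net.step()                                  # t=1: nothing due yet
out = net.step()                            # t=2: both events delivered
print(out)
```

Many synapses sharing a postsynaptic neuron and delay collapse into one accumulator slot, which is where the memory saving comes from.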

  17. Abnormal Neural Network of Primary Insomnia: Evidence from Spatial Working Memory Task fMRI.

    PubMed

    Li, Yongli; Liu, Liya; Wang, Enfeng; Zhang, Hongju; Dou, Shewei; Tong, Li; Cheng, Jingliang; Chen, Chuanliang; Shi, Dapeng

    2016-01-01

    Contemporary functional MRI (fMRI) methods can provide a wealth of information about the neural mechanisms associated with primary insomnia (PI), which centrally involve neural network circuits related to spatial working memory. A total of 30 participants diagnosed with PI and without atypical brain anatomy were selected along with 30 age- and gender-matched healthy controls. Subjects were administered the Pittsburgh Sleep Quality Index (PSQI), Hamilton Rating Scale for Depression and clinical assessments of spatial working memory, followed by an MRI scan and fMRI in spatial memory task state. Statistically significant differences between PSQI and spatial working memory were observed between PI patients and controls (p < 0.01). Activation of neural networks related to spatial memory task state in the PI group was observed at the left temporal lobe, left occipital lobe and right frontal lobe. Lower levels of activation were observed in the left parahippocampal gyrus, right parahippocampal gyrus, bilateral temporal cortex, frontal cortex and superior parietal lobule. Participants with PI exhibited characteristic abnormalities in the neural network connectivity related to spatial working memory. These results may be indicative of an underlying pathological mechanism related to spatial working memory deterioration in PI, analogous to recently described mechanisms in other mental health disorders.

  18. High Performance Implementation of 3D Convolutional Neural Networks on a GPU.

    PubMed

    Lan, Qiang; Wang, Zelong; Wen, Mei; Zhang, Chunyuan; Wang, Yijie

    2017-01-01

    Convolutional neural networks have proven to be highly successful in applications such as image classification, object tracking, and many other tasks based on 2D inputs. Recently, researchers have started to apply convolutional neural networks to video classification, which constitutes a 3D input and requires far larger amounts of memory and much more computation. FFT based methods can reduce the amount of computation, but this generally comes at the cost of an increased memory requirement. On the other hand, the Winograd Minimal Filtering Algorithm (WMFA) can reduce the number of operations required and thus can speed up the computation, without increasing the required memory. This strategy was shown to be successful for 2D neural networks. We implement the algorithm for 3D convolutional neural networks and apply it to a popular 3D convolutional neural network which is used to classify videos and compare it to cuDNN. For our highly optimized implementation of the algorithm, we observe a twofold speedup for most of the 3D convolution layers of our test network compared to the cuDNN version.
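The Winograd saving can be seen in the smallest case, F(2,3): two outputs of a 3-tap correlation computed with four multiplications instead of six. The 1D sketch below (illustrative input and filter values) is the building block that the paper's 2D and 3D versions tile over feature maps:

```python
import numpy as np

def winograd_f23(d, g):
    """Winograd minimal filtering F(2,3): two outputs of a 3-tap
    correlation using 4 multiplications instead of 6."""
    m1 = (d[0] - d[2]) * g[0]
    m2 = (d[1] + d[2]) * (g[0] + g[1] + g[2]) / 2
    m3 = (d[2] - d[1]) * (g[0] - g[1] + g[2]) / 2
    m4 = (d[1] - d[3]) * g[2]
    return np.array([m1 + m2 + m3, m2 - m3 - m4])

d = np.array([1.0, 2.0, 3.0, 4.0])   # input tile (4 elements)
g = np.array([0.5, 1.0, -1.0])       # filter taps (illustrative values)
direct = np.array([d[0]*g[0] + d[1]*g[1] + d[2]*g[2],
                   d[1]*g[0] + d[2]*g[1] + d[3]*g[2]])
print(winograd_f23(d, g), direct)    # the two computations agree
```

The filter-side combinations can be precomputed once per filter, so at run time the multiplication count per output drops without any extra memory for transformed inputs, unlike FFT-based methods.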

  19. High Performance Implementation of 3D Convolutional Neural Networks on a GPU

    PubMed Central

    Wang, Zelong; Wen, Mei; Zhang, Chunyuan; Wang, Yijie

    2017-01-01

    Convolutional neural networks have proven to be highly successful in applications such as image classification, object tracking, and many other tasks based on 2D inputs. Recently, researchers have started to apply convolutional neural networks to video classification, which constitutes a 3D input and requires far larger amounts of memory and much more computation. FFT based methods can reduce the amount of computation, but this generally comes at the cost of an increased memory requirement. On the other hand, the Winograd Minimal Filtering Algorithm (WMFA) can reduce the number of operations required and thus can speed up the computation, without increasing the required memory. This strategy was shown to be successful for 2D neural networks. We implement the algorithm for 3D convolutional neural networks and apply it to a popular 3D convolutional neural network which is used to classify videos and compare it to cuDNN. For our highly optimized implementation of the algorithm, we observe a twofold speedup for most of the 3D convolution layers of our test network compared to the cuDNN version. PMID:29250109

  20. Neural networks supporting autobiographical memory retrieval in post-traumatic stress disorder

    PubMed Central

    Jacques, Peggy L.; Kragel, Philip A.; Rubin, David C.

    2013-01-01

    Post-traumatic stress disorder (PTSD) affects the functional recruitment and connectivity between neural regions during autobiographical memory (AM) retrieval that overlap with default and control networks. Whether such univariate changes relate to potential differences in the contribution of large-scale neural networks supporting cognition in PTSD is unknown. In the current functional MRI (fMRI) study we employ independent component analysis to examine the engagement of neural networks during the recall of personal memories in PTSD (15 participants) compared to non-trauma exposed, healthy controls (14 participants). We found that the PTSD group recruited similar neural networks when compared to controls during AM recall, including default network subsystems and control networks, but there were group differences in the spatial and temporal characteristics of these networks. First, there were spatial differences in the contribution of the anterior and posterior midline across the networks, and with the amygdala in particular for the medial temporal subsystem of the default network. Second, there were temporal differences in the relationship of the medial prefrontal subsystem of the default network, with less temporal coupling of this network during AM retrieval in PTSD relative to controls. These findings suggest that spatial and temporal characteristics of the default and control networks potentially differ in PTSD versus healthy controls, and contribute to altered recall of personal memory. PMID:23483523

  1. Oscillation, Conduction Delays, and Learning Cooperate to Establish Neural Competition in Recurrent Networks

    PubMed Central

    Kato, Hideyuki; Ikeguchi, Tohru

    2016-01-01

    Specific memory might be stored in a subnetwork consisting of a small population of neurons. To select neurons involved in memory formation, neural competition might be essential. In this paper, we show that excitable neurons are competitive and organize into two assemblies in a recurrent network with spike timing-dependent synaptic plasticity (STDP) and axonal conduction delays. Neural competition is established by the cooperation of spontaneously induced neural oscillation, axonal conduction delays, and STDP. We also suggest that the competition mechanism in this paper is one of the basic functions required to organize memory-storing subnetworks into fine-scale cortical networks. PMID:26840529

  2. Nonequilibrium landscape theory of neural networks.

    PubMed

    Yan, Han; Zhao, Lei; Hu, Liang; Wang, Xidi; Wang, Erkang; Wang, Jin

    2013-11-05

    The brain map project aims to map out the neuron connections of the human brain. Even with all of the wirings mapped out, the global and physical understandings of the function and behavior are still challenging. Hopfield quantified the learning and memory process of symmetrically connected neural networks globally through equilibrium energy. The energy basins of attractions represent memories, and the memory retrieval dynamics is determined by the energy gradient. However, the realistic neural networks are asymmetrically connected, and oscillations cannot emerge from symmetric neural networks. Here, we developed a nonequilibrium landscape-flux theory for realistic asymmetrically connected neural networks. We uncovered the underlying potential landscape and the associated Lyapunov function for quantifying the global stability and function. We found the dynamics and oscillations in human brains responsible for cognitive processes and physiological rhythm regulations are determined not only by the landscape gradient but also by the flux. We found that the flux is closely related to the degrees of the asymmetric connections in neural networks and is the origin of the neural oscillations. The neural oscillation landscape shows a closed-ring attractor topology. The landscape gradient attracts the network down to the ring. The flux is responsible for coherent oscillations on the ring. We suggest the flux may provide the driving force for associations among memories. We applied our theory to the rapid-eye-movement sleep cycle. We identified the key regulation factors for function through global sensitivity analysis of landscape topography against wirings, which are in good agreement with experiments.
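The gradient-plus-flux decomposition described above can be written compactly (a sketch in the standard notation of this approach, assumed here; for a constant symmetric diffusion matrix D, and up to a divergence term of D):

```latex
% F: deterministic driving force of the network dynamics
% P_ss: steady-state probability, U = -ln P_ss: potential landscape
% J_ss: steady-state probability flux (divergence-free)
\mathbf{F}(\mathbf{x}) \;=\; -\,\mathbf{D}\,\nabla U(\mathbf{x})
  \;+\; \frac{\mathbf{J}_{\mathrm{ss}}(\mathbf{x})}{P_{\mathrm{ss}}(\mathbf{x})},
\qquad U(\mathbf{x}) = -\ln P_{\mathrm{ss}}(\mathbf{x}),
\qquad \nabla \cdot \mathbf{J}_{\mathrm{ss}} = 0 .
```

For symmetric connections the flux term vanishes and the dynamics reduces to pure gradient descent on the Hopfield-style landscape; a nonzero divergence-free flux is what drives the coherent oscillations on the ring attractor.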

  3. Nonequilibrium landscape theory of neural networks

    PubMed Central

    Yan, Han; Zhao, Lei; Hu, Liang; Wang, Xidi; Wang, Erkang; Wang, Jin

    2013-01-01

    The brain map project aims to map out the neuron connections of the human brain. Even with all of the wirings mapped out, the global and physical understandings of the function and behavior are still challenging. Hopfield quantified the learning and memory process of symmetrically connected neural networks globally through equilibrium energy. The energy basins of attractions represent memories, and the memory retrieval dynamics is determined by the energy gradient. However, the realistic neural networks are asymmetrically connected, and oscillations cannot emerge from symmetric neural networks. Here, we developed a nonequilibrium landscape–flux theory for realistic asymmetrically connected neural networks. We uncovered the underlying potential landscape and the associated Lyapunov function for quantifying the global stability and function. We found the dynamics and oscillations in human brains responsible for cognitive processes and physiological rhythm regulations are determined not only by the landscape gradient but also by the flux. We found that the flux is closely related to the degrees of the asymmetric connections in neural networks and is the origin of the neural oscillations. The neural oscillation landscape shows a closed-ring attractor topology. The landscape gradient attracts the network down to the ring. The flux is responsible for coherent oscillations on the ring. We suggest the flux may provide the driving force for associations among memories. We applied our theory to the rapid-eye-movement sleep cycle. We identified the key regulation factors for function through global sensitivity analysis of landscape topography against wirings, which are in good agreement with experiments. PMID:24145451

  4. Optical waveguides with memory effect using photochromic material for neural network

    NASA Astrophysics Data System (ADS)

    Tanimoto, Keisuke; Amemiya, Yoshiteru; Yokoyama, Shin

    2018-04-01

    An optical neural network using a waveguide with a memory effect, a photodiode, CMOS circuits and LEDs was proposed. To realize the neural network, optical waveguides with a memory effect were fabricated using a cladding layer containing the photochromic material “diarylethene”. The transmittance of green light was decreased by UV light irradiation and recovered by the passage of green light through the waveguide. It was confirmed that the transmittance versus total energy of the green light that passed through the waveguide well fit the universal exponential curve.

  5. Space-Time Neural Networks

    NASA Technical Reports Server (NTRS)

    Villarreal, James A.; Shelton, Robert O.

    1992-01-01

    The concept of a space-time neural network affords a distributed temporal memory, enabling the network to model complicated dynamical systems mathematically and to recognize temporally varying spatial patterns. Digital filters replace the synaptic-connection weights of a conventional back-error-propagation neural network.
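The filter-as-weight idea can be sketched as an FIR synapse (hypothetical tap values; a conventional scalar-weight synapse is the special case of a single tap): the connection's output is a weighted sum of current and past inputs, giving each connection its own distributed temporal memory.

```python
import numpy as np

def fir_synapse(x, taps):
    """A synapse as an FIR digital filter: output at time t is a weighted
    sum of current and past presynaptic inputs, not just the current one."""
    return np.convolve(x, taps)[:len(x)]

x = np.array([1.0, 0.0, 0.0, 0.0, 2.0])    # presynaptic activity over time
taps = np.array([0.5, 0.3, 0.2])           # hypothetical filter weights
print(fir_synapse(x, taps))                # each input spike leaves a decaying trace
```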

  6. Feature to prototype transition in neural networks

    NASA Astrophysics Data System (ADS)

    Krotov, Dmitry; Hopfield, John

    Models of associative memory with higher order (higher than quadratic) interactions, and their relationship to neural networks used in deep learning are discussed. Associative memory is conventionally described by recurrent neural networks with dynamical convergence to stable points. Deep learning typically uses feedforward neural nets without dynamics. However, a simple duality relates these two different views when applied to problems of pattern classification. From the perspective of associative memory such models deserve attention because they make it possible to store a much larger number of memories, compared to the quadratic case. In the dual description, these models correspond to feedforward neural networks with one hidden layer and unusual activation functions transmitting the activities of the visible neurons to the hidden layer. These activation functions are rectified polynomials of a higher degree rather than the rectified linear functions used in deep learning. The network learns representations of the data in terms of features for rectified linear functions, but as the power in the activation function is increased there is a gradual shift to a prototype-based representation, the two extreme regimes of pattern recognition known in cognitive psychology.
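The higher-order update can be sketched with a rectified-polynomial interaction function (a minimal version of the dense-associative-memory rule; the stored patterns and the power n are illustrative choices): each unit picks the sign that lowers the higher-order energy, and even a one-bit-corrupted memory snaps back in a single sweep.

```python
import numpy as np

def F(x, n):
    """Rectified polynomial interaction function, F(x) = max(x, 0)^n."""
    return np.maximum(x, 0.0) ** n

def dam_update(s, xi, n=3):
    """One synchronous update of a dense associative memory with
    higher-order interactions (a sketch of the energy-difference rule)."""
    new = s.copy()
    for i in range(len(s)):
        sp, sm = s.copy(), s.copy()
        sp[i], sm[i] = 1.0, -1.0
        # pick the sign of unit i that lowers the higher-order energy
        diff = np.sum(F(xi @ sp, n) - F(xi @ sm, n))
        new[i] = 1.0 if diff >= 0 else -1.0
    return new

xi = np.array([[1,  1,  1,  1, -1, -1, -1, -1],    # stored bipolar memories
               [1, -1,  1, -1,  1, -1,  1, -1],
               [1,  1, -1, -1,  1,  1, -1, -1]], dtype=float)
s = xi[0].copy()
s[0] = -1.0                      # corrupt one bit of the first memory
s = dam_update(s, xi, n=3)
print(np.array_equal(s, xi[0]))  # the memory is recovered
```

Raising the power n sharpens the energy landscape around each stored pattern, which is what allows many more memories than the quadratic (Hopfield) case.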

  7. A research using hybrid RBF/Elman neural networks for intrusion detection system secure model

    NASA Astrophysics Data System (ADS)

    Tong, Xiaojun; Wang, Zhu; Yu, Haining

    2009-10-01

    A hybrid RBF/Elman neural network model that can be employed for both anomaly detection and misuse detection is presented in this paper. The IDSs using the hybrid neural network can detect temporally dispersed and collaborative attacks effectively because of its memory of past events. The RBF network is employed as a real-time pattern classification and the Elman network is employed to restore the memory of past events. The IDSs using the hybrid neural network are evaluated against the intrusion detection evaluation data sponsored by U.S. Defense Advanced Research Projects Agency (DARPA). Experimental results are presented in ROC curves. Experiments show that the IDSs using this hybrid neural network improve the detection rate and decrease the false positive rate effectively.

  8. Creative-Dynamics Approach To Neural Intelligence

    NASA Technical Reports Server (NTRS)

    Zak, Michail A.

    1992-01-01

    Paper discusses approach to mathematical modeling of artificial neural networks exhibiting complicated behaviors reminiscent of creativity and intelligence of biological neural networks. Neural network treated as non-Lipschitzian dynamical system - as described in "Non-Lipschitzian Dynamics For Modeling Neural Networks" (NPO-17814). System serves as tool for modeling of temporal-pattern memories and recognition of complicated spatial patterns.

  9. A Neural Network Model of Retrieval-Induced Forgetting

    ERIC Educational Resources Information Center

    Norman, Kenneth A.; Newman, Ehren L.; Detre, Greg

    2007-01-01

    Retrieval-induced forgetting (RIF) refers to the finding that retrieving a memory can impair subsequent recall of related memories. Here, the authors present a new model of how the brain gives rise to RIF in both semantic and episodic memory. The core of the model is a recently developed neural network learning algorithm that leverages regular…

  10. A New Measure for Neural Compensation Is Positively Correlated With Working Memory and Gait Speed.

    PubMed

    Ji, Lanxin; Pearlson, Godfrey D; Hawkins, Keith A; Steffens, David C; Guo, Hua; Wang, Lihong

    2018-01-01

    Neuroimaging studies suggest that older adults may compensate for declines in brain function and cognition through reorganization of neural resources. A limitation of prior research is its reliance on between-group comparisons of neural activation (e.g., younger vs. older), which cannot be used to assess compensatory ability quantitatively. It is also unclear how compensatory ability relates to cognitive function, or how other factors such as physical exercise modulate it. Here, we propose a data-driven method to semi-quantitatively measure neural compensation under a challenging cognitive task, and we then explore connections between neural compensation, cognitive engagement, and cognitive reserve (CR). Functional and structural magnetic resonance imaging scans were acquired from 26 healthy older adults during a face-name memory task. Spatial independent component analysis (ICA) identified the visual, attentional, and left executive networks as core networks. Results show that the smaller the gray matter (GM) volume within the core networks, the more networks were needed to conduct the task (r = -0.408, p = 0.035). The number of task-activated networks, controlling for GM volume within the core networks, was therefore defined as a measure of neural compensatory ability. We found that compensatory ability correlated with working memory performance (r = 0.528, p = 0.035). Among subjects with good memory task performance, those with higher CR used fewer networks than subjects with lower CR; among poor-performance subjects, those using more networks had higher CR. Our results indicate that using a highly cognitively demanding task to measure the number of activated neural networks could be a useful and sensitive measure of neural compensation in older adults.

  11. Resonator memories and optical novelty filters

    NASA Astrophysics Data System (ADS)

    Anderson, Dana Z.; Erie, Marie C.

    1987-05-01

    Optical resonators having holographic elements are potential candidates for storing information that can be accessed through content addressable or associative recall. Closely related to the resonator memory is the optical novelty filter, which can detect the differences between a test object and a set of reference objects. We discuss implementations of these devices using continuous optical media such as photorefractive materials. The discussion is framed in the context of neural network models. There are both formal and qualitative similarities between the resonator memory and optical novelty filter and network models. Mode competition arises in the theory of the resonator memory, much as it does in some network models. We show that the role of the phenomena of "daydreaming" in the real-time programmable optical resonator is very much akin to the role of "unlearning" in neural network memories. The theory of programming the real-time memory for a single mode is given in detail. This leads to a discussion of the optical novelty filter. Experimental results for the resonator memory, the real-time programmable memory, and the optical tracking novelty filter are reviewed. We also point to several issues that need to be addressed in order to implement more formal models of neural networks.

  13. Linear matrix inequality approach to exponential synchronization of a class of chaotic neural networks with time-varying delays

    NASA Astrophysics Data System (ADS)

    Wu, Wei; Cui, Bao-Tong

    2007-07-01

    In this paper, a synchronization scheme for a class of chaotic neural networks with time-varying delays is presented. This class of chaotic neural networks covers several well-known neural networks, such as Hopfield neural networks, cellular neural networks, and bidirectional associative memory networks. The obtained criteria are expressed in terms of linear matrix inequalities, thus they can be efficiently verified. A comparison between our results and the previous results shows that our results are less restrictive.
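
    The practical appeal of criteria expressed as linear matrix inequalities is that feasibility can be checked numerically. A generic sketch of such a check (the matrix M below is a hypothetical stand-in, not the paper's specific criterion):

```python
import numpy as np

def lmi_holds(M, tol=1e-9):
    """Generic numerical check of an LMI criterion of the form M < 0:
    the symmetric part of M must be negative definite, i.e. all of its
    eigenvalues must be strictly negative."""
    sym = (M + M.T) / 2.0
    return bool(np.all(np.linalg.eigvalsh(sym) < -tol))

# Hypothetical example: M = A + A.T + Q for a stable A and a small Q.
A = np.array([[-3.0, 1.0], [0.0, -2.0]])
Q = 0.1 * np.eye(2)
M = A + A.T + Q
print(lmi_holds(M))
```

    In practice such checks are done with convex-optimization solvers, but an eigenvalue test of this kind is enough to verify a candidate matrix.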

  14. DANoC: An Efficient Algorithm and Hardware Codesign of Deep Neural Networks on Chip.

    PubMed

    Zhou, Xichuan; Li, Shengli; Tang, Fang; Hu, Shengdong; Lin, Zhi; Zhang, Lei

    2017-07-18

    Deep neural networks (NNs) are the state-of-the-art models for understanding the content of images and videos. However, implementing deep NNs in embedded systems is a challenging task; for example, a typical deep belief network could exhaust gigabytes of memory and create bandwidth and computational bottlenecks. To address this challenge, this paper presents an algorithm and hardware codesign for efficient deep neural computation. A hardware-oriented deep learning algorithm, named the deep adaptive network, is proposed to exploit the sparsity of neural connections. By adaptively removing the majority of neural connections and robustly representing the reserved connections using binary integers, the proposed algorithm can save up to 99.9% of memory and computational resources without undermining classification accuracy. An efficient sparse-mapping-memory-based hardware architecture is proposed to take full advantage of the algorithmic optimization. Unlike the traditional von Neumann architecture, the deep adaptive network on chip (DANoC) brings communication and computation into close proximity, avoiding power-hungry parameter transfers between on-board memory and on-chip computational units. Experiments on different image classification benchmarks show that the DANoC system achieves competitively high accuracy and efficiency compared with state-of-the-art approaches.

  15. Global asymptotic stability of hybrid bidirectional associative memory neural networks with time delays

    NASA Astrophysics Data System (ADS)

    Arik, Sabri

    2006-02-01

    This Letter presents a sufficient condition for the existence, uniqueness and global asymptotic stability of the equilibrium point for bidirectional associative memory (BAM) neural networks with distributed time delays. The results impose constraint conditions on the network parameters of neural system independently of the delay parameter, and they are applicable to all bounded continuous non-monotonic neuron activation functions. The results are also compared with the previous results derived in the literature.
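
    For readers unfamiliar with BAM dynamics, the sketch below shows the basic two-layer recall that such stability results concern (a minimal Kosko-style discrete example with hypothetical toy patterns; the Letter itself analyzes continuous-time networks with distributed delays):

```python
import numpy as np

def bam_recall(X, Y, x0, iters=10):
    """Kosko-style BAM sketch: pairs (x_k, y_k) are stored in
    W = sum_k y_k x_k^T, and recall bounces between the two layers:
    y = sgn(W x), x = sgn(W^T y), until a stable pair is reached."""
    W = sum(np.outer(y, x) for x, y in zip(X, Y))
    x = x0.copy()
    for _ in range(iters):
        y = np.where(W @ x >= 0, 1.0, -1.0)
        x = np.where(W.T @ y >= 0, 1.0, -1.0)
    return x, y

# Two orthogonal stored pairs (16-dim x side, 12-dim y side).
X = np.array([[1.0] * 8 + [-1.0] * 8,
              [1.0, -1.0] * 8])
Y = np.array([[1.0] * 12,
              [1.0, -1.0] * 6])
noisy = X[0].copy()
noisy[:2] *= -1                 # flip two bits of the stored x_0
x_rec, y_rec = bam_recall(X, Y, noisy)
print(int((y_rec == Y[0]).sum()), int((x_rec == X[0]).sum()))  # 12 16
```

    Starting from the corrupted x-layer pattern, one bounce already restores the stored pair; the stability results above guarantee analogous convergence for the continuous-time delayed system.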

  16. Programmable Analog Memory Resistors For Electronic Neural Networks

    NASA Technical Reports Server (NTRS)

    Ramesham, Rajeshuni; Thakoor, Sarita; Daud, Taher; Thakoor, Anilkumar P.

    1990-01-01

    Electrical resistance of new solid-state device altered repeatedly by suitable control signals, yet remains at steady value when control signal removed. Resistance set at low value ("on" state), high value ("off" state), or at any convenient intermediate value and left there until new value desired. Circuits of this type particularly useful in nonvolatile, associative electronic memories based on models of neural networks. Such programmable analog memory resistors ideally suited as synaptic interconnects in "self-learning" neural nets. Operation of device depends on electrochromic property of WO3, which when pure is insulator. Potential uses include nonvolatile, erasable, electronically programmable read-only memories.

  17. Statistical downscaling of precipitation using long short-term memory recurrent neural networks

    NASA Astrophysics Data System (ADS)

    Misra, Saptarshi; Sarkar, Sudeshna; Mitra, Pabitra

    2017-11-01

    Hydrological impacts of global climate change on a regional scale are generally assessed by downscaling large-scale climatic variables, simulated by General Circulation Models (GCMs), to regional, small-scale hydrometeorological variables like precipitation and temperature. In this study, we propose a new statistical downscaling model based on a Recurrent Neural Network with Long Short-Term Memory that captures the spatio-temporal dependencies in local rainfall. Previous studies have used several other methods, such as linear regression, quantile regression, kernel regression, beta regression, and artificial neural networks. Deep neural networks and recurrent neural networks have been shown to be highly promising for modeling complex and highly non-linear relationships between input and output variables in different domains, and hence we investigated their performance on the task of statistical downscaling. We tested this model on two datasets: one on precipitation in the Mahanadi basin in India and the other on precipitation in the Campbell River basin in Canada. Our autoencoder-coupled long short-term memory recurrent neural network model performs best compared with other existing methods on both datasets with respect to temporal cross-correlation, mean squared error, and capturing of extremes.
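
    The long short-term memory mechanism at the core of such downscaling models can be sketched as a single cell step (forward pass only; the study's full autoencoder-coupled, trained model is far larger, and the dimensions below are hypothetical):

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One forward step of a standard LSTM cell. The cell state c carries
    the long-term memory; gates i, f, o decide what to write, keep, and
    read, which is what lets the model track multi-step rainfall patterns."""
    H = h.size
    z = W @ x + U @ h + b                        # stacked gate pre-activations
    sig = lambda v: 1.0 / (1.0 + np.exp(-v))
    i, f, o, g = z[:H], z[H:2 * H], z[2 * H:3 * H], z[3 * H:]
    c_new = sig(f) * c + sig(i) * np.tanh(g)     # forget old, write new
    h_new = sig(o) * np.tanh(c_new)              # gated read-out
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, H = 5, 8                                   # e.g. 5 GCM predictors per step
W = rng.normal(0, 0.3, (4 * H, n_in))
U = rng.normal(0, 0.3, (4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(30):                              # run over a 30-step sequence
    h, c = lstm_step(rng.normal(size=n_in), h, c, W, U, b)
print(h.shape)
```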

  18. Large memory capacity in chaotic artificial neural networks: a view of the anti-integrable limit.

    PubMed

    Lin, Wei; Chen, Guanrong

    2009-08-01

    In the literature, it was reported that the chaotic artificial neural network model with sinusoidal activation functions possesses a large memory capacity as well as a remarkable ability of retrieving the stored patterns, better than the conventional chaotic model with only monotonic activation functions such as sigmoidal functions. This paper, from the viewpoint of the anti-integrable limit, elucidates the mechanism inducing the superiority of the model with periodic activation functions that includes sinusoidal functions. Particularly, by virtue of the anti-integrable limit technique, this paper shows that any finite-dimensional neural network model with periodic activation functions and properly selected parameters has much more abundant chaotic dynamics that truly determine the model's memory capacity and pattern-retrieval ability. To some extent, this paper mathematically and numerically demonstrates that an appropriate choice of the activation functions and control scheme can lead to a large memory capacity and better pattern-retrieval ability of the artificial neural network models.

  19. SPECIAL ISSUE ON OPTICAL PROCESSING OF INFORMATION: Optical neural networks based on holographic correlators

    NASA Astrophysics Data System (ADS)

    Sokolov, V. K.; Shubnikov, E. I.

    1995-10-01

    The three most important models of neural networks — a bidirectional associative memory, Hopfield networks, and adaptive resonance networks — are used as examples to show that a holographic correlator has its place in the neural computing paradigm.

  20. Spatial light modulators and applications III; Proceedings of the Meeting, San Diego, CA, Aug. 7, 8, 1989

    NASA Astrophysics Data System (ADS)

    Efron, Uzi

    1990-01-01

    Recent advances in the technology and applications of spatial light modulators (SLMs) are discussed in review essays by leading experts. Topics addressed include materials for SLMs, SLM devices and device technology, applications to optical data processing, and applications to artificial neural networks. Particular attention is given to nonlinear optical polymers, liquid crystals, magnetooptic SLMs, multiple-quantum-well SLMs, deformable-mirror SLMs, three-dimensional optical memories, applications of photorefractive devices to optical computing, photonic neurocomputers and learning machines, holographic associative memories, SLMs as parallel memories for optoelectronic neural networks, and coherent-optics implementations of neural-network models.

  2. Global robust stability of bidirectional associative memory neural networks with multiple time delays.

    PubMed

    Senan, Sibel; Arik, Sabri

    2007-10-01

    This correspondence presents a sufficient condition for the existence, uniqueness, and global robust asymptotic stability of the equilibrium point for bidirectional associative memory neural networks with discrete time delays. The results impose constraint conditions on the network parameters of the neural system independently of the delay parameter, and they are applicable to all bounded continuous nonmonotonic neuron activation functions. Some numerical examples are given to compare our results with the previous robust stability results derived in the literature.

  3. Periodic bidirectional associative memory neural networks with distributed delays

    NASA Astrophysics Data System (ADS)

    Chen, Anping; Huang, Lihong; Liu, Zhigang; Cao, Jinde

    2006-05-01

    Some sufficient conditions are obtained for the existence and global exponential stability of a periodic solution to general bidirectional associative memory (BAM) neural networks with distributed delays by using the continuation theorem of Mawhin's coincidence degree theory, the Lyapunov functional method, and Young's inequality. These results are helpful for designing a globally exponentially stable, periodically oscillatory BAM neural network, and the conditions can be easily verified and applied in practice. An example is also given to illustrate the results.

  4. Terminal attractors for addressable memory in neural networks

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1988-01-01

    A new type of attractors - terminal attractors - for an addressable memory in neural networks operating in continuous time is introduced. These attractors represent singular solutions of the dynamical system. They intersect (or envelope) the families of regular solutions while each regular solution approaches the terminal attractor in a finite time period. It is shown that terminal attractors can be incorporated into neural networks such that any desired set of these attractors with prescribed basins is provided by an appropriate selection of the weight matrix.
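
    The defining property of a terminal attractor, convergence in finite time, can be checked numerically. A one-dimensional sketch (not the paper's network; the exponent 1/3 is one common choice of non-Lipschitz right-hand side):

```python
import numpy as np

def settle_time(x0=1.0, k=1.0, dt=1e-4):
    """Euler integration of dx/dt = -k*sign(x)*|x|**(1/3), which has a
    terminal attractor at x = 0. Because the right-hand side is
    non-Lipschitz at 0, the state reaches 0 exactly, in finite time
    t* = 3*x0**(2/3)/(2*k); the regular system dx/dt = -k*x only
    approaches 0 asymptotically."""
    x, t = x0, 0.0
    while x != 0.0:
        step = -k * np.sign(x) * abs(x) ** (1.0 / 3.0) * dt
        x = 0.0 if abs(step) >= abs(x) else x + step
        t += dt
    return t

t_star = settle_time()   # analytic settling time is 1.5 for x0 = k = 1
```

    Finite settling time is what lets a regular trajectory actually intersect the attractor, rather than merely converge toward it.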

  5. Temporal neural networks and transient analysis of complex engineering systems

    NASA Astrophysics Data System (ADS)

    Uluyol, Onder

    A theory is introduced for a multi-layered Local Output Gamma Feedback (LOGF) neural network within the paradigm of Locally-Recurrent Globally-Feedforward neural networks. It is developed for the identification, prediction, and control tasks of spatio-temporal systems and allows for the presentation of different time scales through incorporation of a gamma memory. It is initially applied to the tasks of sunspot and Mackey-Glass series prediction as benchmarks, then it is extended to the task of power level control of a nuclear reactor at different fuel cycle conditions. The developed LOGF neuron model can also be viewed as a Transformed Input and State (TIS) Gamma memory for neural network architectures for temporal processing. The novel LOGF neuron model extends the static neuron model by incorporating into it a short-term memory structure in the form of a digital gamma filter. A feedforward neural network made up of LOGF neurons can thus be used to model dynamic systems. A learning algorithm based upon the Backpropagation-Through-Time (BTT) approach is derived. It is applicable for training a general L-layer LOGF neural network. The spatial and temporal weights and parameters of the network are iteratively optimized for a given problem using the derived learning algorithm.
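
    The gamma memory that gives the LOGF neuron its short-term memory structure is a cascade of identical leaky integrators. A minimal sketch (parameter names are illustrative, not the dissertation's notation):

```python
import numpy as np

def gamma_memory(signal, taps=3, mu=0.5):
    """Gamma short-term memory: a cascade of identical first-order stages
    g_k[t] = (1-mu)*g_k[t-1] + mu*g_{k-1}[t-1], with g_0 the raw input.
    The parameter mu trades memory depth against temporal resolution."""
    g = np.zeros(taps + 1)
    out = []
    for s in signal:
        prev = g.copy()
        g[0] = s
        for k in range(1, taps + 1):
            g[k] = (1.0 - mu) * prev[k] + mu * prev[k - 1]
        out.append(g[1:].copy())
    return np.array(out)

impulse = np.zeros(10)
impulse[0] = 1.0
resp = gamma_memory(impulse)
# Tap k first responds k steps after the impulse: deeper taps see older input.
first_nonzero = [int(np.argmax(resp[:, k] > 0)) for k in range(3)]
print(first_nonzero)  # [1, 2, 3]
```

    Feeding the tap outputs into a static feedforward layer is what turns a memoryless neuron into one that can model dynamic systems.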

  6. Still searching for the engram.

    PubMed

    Eichenbaum, Howard

    2016-09-01

    For nearly a century, neurobiologists have searched for the engram-the neural representation of a memory. Early studies showed that the engram is widely distributed both within and across brain areas and is supported by interactions among large networks of neurons. Subsequent research has identified engrams that support memory within dedicated functional systems for habit learning and emotional memory, but the engram for declarative memories has been elusive. Nevertheless, recent years have brought progress from molecular biological approaches that identify neurons and networks that are necessary and sufficient to support memory, and from recording approaches and population analyses that characterize the information coded by large neural networks. These new directions offer the promise of revealing the engrams for episodic and semantic memories.

  7. Convergence analysis of stochastic hybrid bidirectional associative memory neural networks with delays

    NASA Astrophysics Data System (ADS)

    Wan, Li; Zhou, Qinghua

    2007-10-01

    The stability of stochastic hybrid bidirectional associative memory (BAM) neural networks with discrete delays is considered. Without assuming the symmetry of synaptic connection weights or the monotonicity and differentiability of activation functions, delay-independent sufficient conditions guaranteeing the exponential stability of the equilibrium solution for such networks are given by using the nonnegative semimartingale convergence theorem.

  8. Development of a computational model on the neural activity patterns of a visual working memory in a hierarchical feedforward Network

    NASA Astrophysics Data System (ADS)

    An, Soyoung; Choi, Woochul; Paik, Se-Bum

    2015-11-01

    Understanding the mechanism of information processing in the human brain remains a unique challenge because the nonlinear interactions between the neurons in the network are extremely complex and because controlling every relevant parameter during an experiment is difficult. Therefore, a simulation using simplified computational models may be an effective approach. In the present study, we developed a general model of neural networks that can simulate nonlinear activity patterns in the hierarchical structure of a neural network system. To test our model, we first examined whether our simulation could match the previously-observed nonlinear features of neural activity patterns. Next, we performed a psychophysics experiment for a simple visual working memory task to evaluate whether the model could predict the performance of human subjects. Our studies show that the model is capable of reproducing the relationship between memory load and performance and may contribute, in part, to our understanding of how the structure of neural circuits can determine the nonlinear neural activity patterns in the human brain.

  9. pth moment exponential stability of stochastic memristor-based bidirectional associative memory (BAM) neural networks with time delays.

    PubMed

    Wang, Fen; Chen, Yuanlong; Liu, Meichun

    2018-02-01

    Stochastic memristor-based bidirectional associative memory (BAM) neural networks with time delays play an increasingly important role in the design and implementation of neural network systems. Under the framework of Filippov solutions, the issues of the pth moment exponential stability of stochastic memristor-based BAM neural networks are investigated. By using the stochastic stability theory, Itô's differential formula and Young inequality, the criteria are derived. Meanwhile, with Lyapunov approach and Cauchy-Schwarz inequality, we derive some sufficient conditions for the mean square exponential stability of the above systems. The obtained results improve and extend previous works on memristor-based or usual neural networks dynamical systems. Four numerical examples are provided to illustrate the effectiveness of the proposed results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Neural network modeling of associative memory: Beyond the Hopfield model

    NASA Astrophysics Data System (ADS)

    Dasgupta, Chandan

    1992-07-01

    A number of neural network models, in which fixed-point and limit-cycle attractors of the underlying dynamics are used to store and associatively recall information, are described. In the first class of models, a hierarchical structure is used to store an exponentially large number of strongly correlated memories. The second class of models uses limit cycles to store and retrieve individual memories. A neurobiologically plausible network that generates low-amplitude periodic variations of activity, similar to the oscillations observed in electroencephalographic recordings, is also described. Results obtained from analytic and numerical studies of the properties of these networks are discussed.

  11. Dismount Threat Recognition through Automatic Pose Identification

    DTIC Science & Technology

    2012-03-01

    [Table-of-contents fragment: 2.2.2 Enabling Technologies; 2.2.3 Associative Memory Neural Networks; III. Methodology; 3.2.3 Creating Separability; 3.3 Training the Associative Memory Neural Network; Effects of Parameter and Method Choices; 4.3.1 Decimel versus Bipolar; 4.3.2 Bipolar and Binary Values]

  13. Discrete-time bidirectional associative memory neural networks with variable delays

    NASA Astrophysics Data System (ADS)

    Liang, J.; Cao, J.; Ho, D. W. C.

    2005-02-01

    Based on linear matrix inequalities (LMIs), some sufficient conditions are presented in this Letter for the existence, uniqueness, and global exponential stability of the equilibrium point of discrete-time bidirectional associative memory (BAM) neural networks with variable delays. Some of the stability criteria obtained are delay-dependent and some are delay-independent; they are less conservative than those reported so far in the literature. Furthermore, the results provide one more set of easily verified criteria for determining the exponential stability of discrete-time BAM neural networks.

  14. A Recurrent Network Model of Somatosensory Parametric Working Memory in the Prefrontal Cortex

    PubMed Central

    Miller, Paul; Brody, Carlos D; Romo, Ranulfo; Wang, Xiao-Jing

    2015-01-01

    A parametric working memory network stores the information of an analog stimulus in the form of persistent neural activity that is monotonically tuned to the stimulus. The family of persistent firing patterns with a continuous range of firing rates must all be realizable under exactly the same external conditions (during the delay when the transient stimulus is withdrawn). How this can be accomplished by neural mechanisms remains an unresolved question. Here we present a recurrent cortical network model of irregularly spiking neurons that was designed to simulate a somatosensory working memory experiment with behaving monkeys. Our model reproduces the observed positively and negatively monotonic persistent activity, and heterogeneous tuning curves of memory activity. We show that fine-tuning mathematically corresponds to a precise alignment of cusps in the bifurcation diagram of the network. Moreover, we show that the fine-tuned network can integrate stimulus inputs over several seconds. Assuming that such time integration occurs in neural populations downstream from a tonically persistent neural population, our model is able to account for the slow ramping-up and ramping-down behaviors of neurons observed in prefrontal cortex. PMID:14576212

  15. Artificial Neural Network Metamodels of Stochastic Computer Simulations

    DTIC Science & Technology

    1994-08-10

    [Report documentation fragment: "Artificial Neural Network Metamodels of Stochastic Computer Simulations" by Robert Allen Kilmer (AD-A285 951); front matter includes "B.S. in Education Mathematics, Indiana..." and a dedication to the memory of his father, William Ralph Kilmer.]

  16. Artificial neural networks as quantum associative memory

    NASA Astrophysics Data System (ADS)

    Hamilton, Kathleen; Schrock, Jonathan; Imam, Neena; Humble, Travis

    We present results on the recall accuracy and capacity of Hopfield networks implemented on commercially available quantum annealers. The use of Hopfield networks and artificial neural networks as content-addressable memories offers robust storage and retrieval of classical information; however, implementing these models on currently available quantum annealers faces several challenges: the limits of precision when setting synaptic weights, the effects of spurious spin-glass states, and the minor embedding of densely connected graphs into fixed-connectivity hardware. We consider neural networks that are less than fully connected, as well as networks containing multiple sparsely connected clusters. We discuss the effect of weak edge dilution on the accuracy of memory recall, and how the multiple-clique structure affects the storage capacity. Our work focuses on storage of patterns that can be embedded into physical hardware containing n < 1000 qubits. This work was supported by the United States Department of Defense and used resources of the Computational Research and Development Programs at Oak Ridge National Laboratory under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy.

  17. Recognition of Telugu characters using neural networks.

    PubMed

    Sukhaswami, M B; Seetharamulu, P; Pujari, A K

    1995-09-01

    The aim of the present work is to recognize printed and handwritten Telugu characters using artificial neural networks (ANNs). Earlier work on recognition of Telugu characters used conventional pattern recognition techniques. We make an initial attempt here at using neural networks for recognition, with the aim of improving upon earlier methods, which do not perform effectively in the presence of noise and distortion in the characters. The Hopfield model of neural network, working as an associative memory, is initially chosen for recognition purposes. Due to the limited storage capacity of the Hopfield neural network, we propose a new scheme, named here the Multiple Neural Network Associative Memory (MNNAM), which overcomes this limitation by combining multiple neural networks that work in parallel. It is also demonstrated that the Hopfield network is suitable for recognizing noisy printed characters as well as handwritten characters written by different "hands" in a variety of styles. Detailed experiments have been carried out using several learning strategies, and results are reported. It is shown that satisfactory recognition is possible using the proposed strategy. A detailed preprocessing scheme for the Telugu characters from digitized documents is also described.
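
    The MNNAM idea of sidestepping the Hopfield capacity limit by running several small networks in parallel can be sketched as follows (an illustrative toy, not the paper's exact scheme; patterns, sizes, and the winner-selection rule are hypothetical):

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian weight matrix with zero diagonal. A single Hopfield net
    reliably stores only about 0.14*N patterns, which is the capacity
    limit that motivates splitting storage across multiple networks."""
    W = sum(np.outer(p, p) for p in patterns)
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, sweeps=5):
    s = probe.copy()
    for _ in range(sweeps):
        s = np.where(W @ s >= 0, 1.0, -1.0)
    return s

def mnnam_recall(banks, probe):
    """Every sub-network tries to recall the probe in parallel; the
    candidate with the largest overlap with the probe wins."""
    candidates = [recall(W, probe) for W in banks]
    return max(candidates, key=lambda s: float(s @ probe))

rng = np.random.default_rng(2)
patterns = rng.choice([-1.0, 1.0], size=(4, 32))    # e.g. 4 character templates
banks = [train_hopfield(patterns[:2]), train_hopfield(patterns[2:])]
probe = patterns[0].copy()
probe[rng.choice(32, size=3, replace=False)] *= -1  # noisy character
out = mnnam_recall(banks, probe)
print(int((out == patterns[0]).sum()))              # bits matching the template
```

    Each bank holds only a couple of patterns, so it stays well under its capacity; the ensemble as a whole stores all of them.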

  18. Deciphering Neural Codes of Memory during Sleep

    PubMed Central

    Chen, Zhe; Wilson, Matthew A.

    2017-01-01

    Memories of experiences are stored in the cerebral cortex. Sleep is critical for consolidating hippocampal memory of wake experiences into the neocortex. Understanding representations of neural codes of hippocampal-neocortical networks during sleep would reveal important circuit mechanisms on memory consolidation, and provide novel insights into memory and dreams. Although sleep-associated ensemble spike activity has been investigated, identifying the content of memory in sleep remains challenging. Here, we revisit important experimental findings on sleep-associated memory (i.e., neural activity patterns in sleep that reflect memory processing) and review computational approaches for analyzing sleep-associated neural codes (SANC). We focus on two analysis paradigms for sleep-associated memory, and propose a new unsupervised learning framework (“memory first, meaning later”) for unbiased assessment of SANC. PMID:28390699

  19. A neural network model of memory and higher cognitive functions.

    PubMed

    Vogel, David D

    2005-01-01

    I first describe a neural network model of associative memory in a small region of the brain. The model depends, unconventionally, on disinhibition of inhibitory links between excitatory neurons rather than long-term potentiation (LTP) of excitatory projections. The model may be shown to have advantages over traditional neural network models both in terms of information storage capacity and biological plausibility. The learning and recall algorithms are independent of network architecture, and require no thresholds or finely graded synaptic strengths. Several copies of this local network are then connected by means of many, weak, reciprocal, excitatory projections that allow one region to control the recall of information in another to produce behaviors analogous to serial memory, classical and operant conditioning, secondary reinforcement, refabrication of memory, and fabrication of possible future events. The network distinguishes between perceived and recalled events, and can predicate its response on the absence as well as the presence of particular stimuli. Some of these behaviors are achieved in ways that seem to provide instances of self-awareness and imagination, suggesting that consciousness may emerge as an epiphenomenon in simple brains.

  20. Functional connectivity of hippocampal and prefrontal networks during episodic and spatial memory based on real-world environments.

    PubMed

    Robin, Jessica; Hirshhorn, Marnie; Rosenbaum, R Shayna; Winocur, Gordon; Moscovitch, Morris; Grady, Cheryl L

    2015-01-01

    Several recent studies have compared episodic and spatial memory in neuroimaging paradigms in order to understand better the contribution of the hippocampus to each of these tasks. In the present study, we build on previous findings showing common neural activation in default network areas during episodic and spatial memory tasks based on familiar, real-world environments (Hirshhorn et al. (2012) Neuropsychologia 50:3094-3106). Following previous demonstrations of the presence of functionally connected sub-networks within the default network, we performed seed-based functional connectivity analyses to determine how, depending on the task, the hippocampus and prefrontal cortex differentially couple with one another and with distinct whole-brain networks. We found evidence for a medial prefrontal-parietal network and a medial temporal lobe network, which were functionally connected to the prefrontal and hippocampal seeds, respectively, regardless of the nature of the memory task. However, these two networks were functionally connected with one another during the episodic memory task, but not during spatial memory tasks. Replicating previous reports of fractionation of the default network into stable sub-networks, this study also shows how these sub-networks may flexibly couple and uncouple with one another based on task demands. These findings support the hypothesis that episodic memory and spatial memory share a common medial temporal lobe-based neural substrate, with episodic memory recruiting additional prefrontal sub-networks. © 2014 Wiley Periodicals, Inc.

  1. Improved Adjoint-Operator Learning For A Neural Network

    NASA Technical Reports Server (NTRS)

    Toomarian, Nikzad; Barhen, Jacob

    1995-01-01

    Improved method of adjoint-operator learning reduces amount of computation and associated computational memory needed to make electronic neural network learn temporally varying pattern (e.g., to recognize moving object in image) in real time. Method extension of method described in "Adjoint-Operator Learning for a Neural Network" (NPO-18352).

  2. Short-Term Memory in Orthogonal Neural Networks

    NASA Astrophysics Data System (ADS)

    White, Olivia L.; Lee, Daniel D.; Sompolinsky, Haim

    2004-04-01

    We study the ability of linear recurrent networks obeying discrete time dynamics to store long temporal sequences that are retrievable from the instantaneous state of the network. We calculate this temporal memory capacity for both distributed shift register and random orthogonal connectivity matrices. We show that the memory capacity of these networks scales with system size.
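    The shift-register case analyzed here has a transparent minimal form. In the sketch below (an illustration, not the authors' code), the orthogonal connectivity matrix is a cyclic permutation: each input is written into one slot and rotated along, so the instantaneous state holds the last N inputs exactly, i.e. the temporal memory capacity equals the system size.

    ```python
    import numpy as np

    N = 8
    W = np.roll(np.eye(N), 1, axis=0)  # cyclic shift: an orthogonal matrix
    v = np.eye(N)[0]                   # inputs are fed into slot 0

    x = np.zeros(N)
    inputs = np.random.default_rng(0).normal(size=N)
    for s in inputs:
        x = W @ x + s * v              # linear recurrent dynamics

    # Slot k of the instantaneous state holds the input from k steps ago.
    for k in range(N):
        assert np.isclose(x[k], inputs[N - 1 - k])
    ```

    For a random orthogonal matrix instead of a shift, the stored sequence is spread across all units and must be read out with the matrix powers W^k, but the same capacity scaling with system size applies.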

  3. Attractor neural networks with resource-efficient synaptic connectivity

    NASA Astrophysics Data System (ADS)

    Pehlevan, Cengiz; Sengupta, Anirvan

    Memories are thought to be stored in the attractor states of recurrent neural networks. Here we explore how resource constraints interplay with the memory storage function to shape the synaptic connectivity of attractor networks. We propose that, given a set of memories in the form of population activity patterns, the neural circuit chooses a synaptic connectivity configuration that minimizes a resource usage cost. We argue that the total synaptic weight (l1-norm) in the network measures the resource cost, because synaptic weight is correlated with synaptic volume, which is a limited resource, and is proportional to neurotransmitter release and post-synaptic current, both of which cost energy. Using numerical simulations and replica theory, we characterize optimal connectivity profiles in resource-efficient attractor networks. Our theory explains several experimental observations on cortical connectivity profiles: 1) connectivity is sparse, because synapses are costly; 2) bidirectional connections are overrepresented; and 3) bidirectional connections are stronger, because attractor states need strong recurrence.

  4. A hypercube compact neural network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rostykus, P.L.; Somani, A.K.

    1988-09-01

    A major problem facing implementation of neural networks is the connection problem. One popular tradeoff is to remove connections, but random disconnection severely degrades a network's capabilities. The hypercube-based Compact Neural Network (CNN) combines a structured architecture with a rearrangement of the memory vectors, giving a larger input space and more graceful degradation than a cost-equivalent network with more connections. The CNNs are based on a Hopfield network. The changes from the Hopfield net include states of -1 and +1; when a node's input evaluates to 0, it is not biased either positive or negative but instead resumes its previous state. Here L is the number of processing elements (PEs), N is the number of memories, and t_ij is the weight between nodes i and j.
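    The update rule described in this abstract, where a zero net input leaves a node in its previous state rather than biasing it either way, can be written directly. This is a schematic rendering of that rule, not the original implementation:

    ```python
    import numpy as np

    def cnn_update(state, W):
        """One synchronous step: sign of the field; a zero field keeps the old state."""
        h = W @ state
        return np.where(h == 0, state, np.sign(h)).astype(int)

    # Node 1 receives zero net input, so it keeps its previous value (-1).
    W = np.array([[0, 1],
                  [0, 0]])
    state = np.array([1, -1])
    print(cnn_update(state, W))  # field is [-1, 0] → state becomes [-1, -1]
    ```

    Contrast this with the standard Hopfield convention, where a zero field is usually broken arbitrarily (e.g. mapped to +1); retaining the previous state avoids injecting that bias.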

  5. Neural networks and MIMD-multiprocessors

    NASA Technical Reports Server (NTRS)

    Vanhala, Jukka; Kaski, Kimmo

    1990-01-01

    Two artificial neural network models are compared. They are the Hopfield Neural Network Model and the Sparse Distributed Memory model. Distributed algorithms for both of them are designed and implemented. The run time characteristics of the algorithms are analyzed theoretically and tested in practice. The storage capacities of the networks are compared. Implementations are done using a distributed multiprocessor system.

  6. Robust stability of bidirectional associative memory neural networks with time delays

    NASA Astrophysics Data System (ADS)

    Park, Ju H.

    2006-01-01

    Based on Lyapunov-Krasovskii functionals combined with the linear matrix inequality approach, a novel delay-dependent criterion for the asymptotic stability of bidirectional associative memory neural networks with time delays is proposed. The criterion is given in terms of linear matrix inequalities, which can be solved easily by various optimization algorithms.

  7. Terminal attractors in neural networks

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1989-01-01

    A new type of attractor (terminal attractors) for content-addressable memory, associative memory, and pattern recognition in artificial neural networks operating in continuous time is introduced. The idea of a terminal attractor is based upon a violation of the Lipschitz condition at a fixed point. As a result, the fixed point becomes a singular solution which envelopes the family of regular solutions, while each regular solution approaches such an attractor in finite time. It will be shown that terminal attractors can be incorporated into neural networks such that any desired set of these attractors with prescribed basins is provided by an appropriate selection of the synaptic weights. The applications of terminal attractors for content-addressable and associative memories, pattern recognition, self-organization, and for dynamical training are illustrated.

  8. The list-composition effect in memory for emotional and neutral pictures: Differential contribution of ventral and dorsal attention networks to successful encoding.

    PubMed

    Barnacle, Gemma E; Montaldi, Daniela; Talmi, Deborah; Sommer, Tobias

    2016-09-01

    The emotional enhancement of memory (EEM) is observed in immediate free-recall memory tests when emotional and neutral stimuli are encoded and tested together ("mixed lists"), but surprisingly, not when they are encoded and tested separately ("pure lists"). Here our aim was to investigate whether the effect of list composition (mixed versus pure lists) on the EEM is due to differential allocation of attention. We scanned participants with fMRI during encoding of semantically related emotional (negative valence only) and neutral pictures. Analysis of memory performance data replicated previous work, demonstrating an interaction between list composition and emotional valence. In mixed lists, neural subsequent memory effects in the dorsal attention network were greater for neutral stimulus encoding, while neural subsequent memory effects for emotional stimuli were found in a region associated with the ventral attention network. These results imply that when life experiences include both emotional and neutral elements, memory for the latter is more highly correlated with neural activity representing goal-directed attention processing at encoding. Copyright © 2016. Published by Elsevier Ltd.

  9. Sequence memory based on coherent spin-interaction neural networks.

    PubMed

    Xia, Min; Wong, W K; Wang, Zhijie

    2014-12-01

    Sequence information processing, for instance sequence memory, plays an important role in many functions of the brain. In the workings of the human brain, the steady-state period is alterable; however, in existing sequence memory models using heteroassociations, the steady-state period cannot be changed during sequence recall. In this work, a novel neural network model for sequence memory with a controllable steady-state period, based on coherent spin interaction, is proposed. In the proposed model, neurons fire collectively in a phase-coherent manner, which lets a neuron group respond differently to different patterns and also lets different neuron groups respond differently to one pattern. Simulation results demonstrating the performance of the sequence memory are presented. By introducing the new coherent spin-interaction sequence memory model, the steady-state period can be controlled by the dimension parameters and the overlap between the input pattern and the stored patterns. The sequence storage capacity is enlarged by coherent spin interaction compared with existing sequence memory models. Furthermore, the sequence storage capacity has an exponential relationship to the dimension of the neural network.

  10. Physical exercise increases involvement of motor networks as a compensatory mechanism during a cognitively challenging task.

    PubMed

    Ji, Lanxin; Pearlson, Godfrey D; Zhang, Xue; Steffens, David C; Ji, Xiaoqing; Guo, Hua; Wang, Lihong

    2018-05-31

    Neuroimaging studies suggest that older adults may compensate for declines in cognitive function through neural compensation and reorganization of neural resources. While neural compensation as a key component of cognitive reserve is an important factor that mediates cognitive decline, the field lacks a quantitative measure of neural compensatory ability, and little is known about factors that may modify compensation, such as physical exercise. Twenty-five healthy older adults participated in a 6-week dance training exercise program. Gait speed, cognitive function, and functional magnetic resonance imaging during a challenging memory task were measured before and after the exercise program. In this study, we used a newly proposed data-driven independent component analysis approach to measure neural compensatory ability and tested the effect of physical exercise on neural compensation through a longitudinal study. After the exercise program, participants showed significantly improved memory performance in the Logical Memory Test (WMS(LM)) (P < .001) and the Rey Auditory Verbal Learning Test (P = .001) and increased gait speed measured by the 6-minute walking test (P = .01). Among all identified neural networks, only the motor cortices and cerebellum showed greater involvement during the memory task after exercise. Importantly, subjects who activated the motor network only after exercise (but not before exercise) showed WMS(LM) increases. We conclude that physical exercise improved gait speed, cognitive function, and compensatory ability through increased involvement of motor-related networks. Copyright © 2018 John Wiley & Sons, Ltd.

  11. DNA methylation mediates neural processing after odor learning in the honeybee

    PubMed Central

    Biergans, Stephanie D.; Claudianos, Charles; Reinhard, Judith; Galizia, C. Giovanni

    2017-01-01

    DNA methyltransferases (Dnmts), epigenetic writers catalyzing the transfer of methyl groups to cytosine (DNA methylation), regulate different aspects of memory formation in many animal species. In honeybees, Dnmt activity is required to adjust the specificity of olfactory reward memories and bees’ relearning capability. The physiological relevance of Dnmt-mediated DNA methylation in neural networks, however, remains unknown. Here, we investigated how Dnmt activity impacts neuroplasticity in the bees’ primary olfactory center, the antennal lobe (AL), an equivalent of the vertebrate olfactory bulb. The AL is crucial for odor discrimination, an indispensable process in forming specific odor memories. Using pharmacological inhibition, we demonstrate that Dnmt activity influences neural network properties during memory formation in vivo. We show that Dnmt activity promotes fast odor pattern separation in trained bees. Furthermore, Dnmt activity during memory formation increases both the number of responding glomeruli and the response magnitude to a novel odor. These data suggest that Dnmt activity is necessary for a form of homoeostatic network control which might involve inhibitory interneurons in the AL network. PMID:28240742

  12. Complex Networks in Psychological Models

    NASA Astrophysics Data System (ADS)

    Wedemann, R. S.; Carvalho, L. S. A. V. D.; Donangelo, R.

    We develop schematic, self-organizing neural-network models that describe mechanisms associated with mental processes in terms of a neurocomputational substrate. These models are examples of real-world complex networks with interesting general topological structures. Considering dopaminergic signal-to-noise neuronal modulation in the central nervous system, we propose neural network models to explain the development of cortical map structure and the dynamics of memory access, and unify different mental processes into a single neurocomputational substrate. Based on our neural network models, neurotic behavior may be understood as an associative memory process in the brain, and the linguistic, symbolic associative process involved in psychoanalytic working-through can be mapped onto a corresponding process of reconfiguration of the neural network. The models are illustrated through computer simulations, where we varied dopaminergic modulation and observed the self-organizing emergent patterns at the resulting semantic map, interpreting them as different manifestations of mental functioning, from psychotic through to normal and neurotic behavior, and creativity.

  13. Optoelectronic Inner-Product Neural Associative Memory

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang

    1993-01-01

    Optoelectronic apparatus acts as artificial neural network performing associative recall of binary images. Recall process is iterative one involving optical computation of inner products between binary input vector and one or more reference binary vectors in memory. Inner-product method requires far less memory space than matrix-vector method.

  14. Neural network based feed-forward high density associative memory

    NASA Technical Reports Server (NTRS)

    Daud, T.; Moopenn, A.; Lamb, J. L.; Ramesham, R.; Thakoor, A. P.

    1987-01-01

    A novel thin film approach to neural-network-based high-density associative memory is described. The information is stored locally in a memory matrix of passive, nonvolatile, binary connection elements with a potential to achieve a storage density of 10 to the 9th bits/sq cm. Microswitches based on memory switching in thin film hydrogenated amorphous silicon, and alternatively in manganese oxide, have been used as programmable read-only memory elements. Low-energy switching has been ascertained in both these materials. Fabrication and testing of memory matrix is described. High-speed associative recall approaching 10 to the 7th bits/sec and high storage capacity in such a connection matrix memory system is also described.

  15. Global asymptotic stability analysis of bidirectional associative memory neural networks with time delays.

    PubMed

    Arik, Sabri

    2005-05-01

    This paper presents a sufficient condition for the existence, uniqueness and global asymptotic stability of the equilibrium point for bidirectional associative memory (BAM) neural networks with distributed time delays. The results impose constraint conditions on the network parameters of neural system independently of the delay parameter, and they are applicable to all continuous nonmonotonic neuron activation functions. It is shown that in some special cases of the results, the stability criteria can be easily checked. Some examples are also given to compare the results with the previous results derived in the literature.

  16. Oscillator Neural Network Retrieving Sparsely Coded Phase Patterns

    NASA Astrophysics Data System (ADS)

    Aoyagi, Toshio; Nomura, Masaki

    1999-08-01

    Little is known theoretically about the associative memory capabilities of neural networks in which information is encoded not only in the mean firing rate but also in the timing of firings. In particular, for sparsely coded patterns it is biologically important to consider the timings of firings and to study how such consideration influences storage capacities and the quality of recalled patterns. For this purpose, we propose a simple extended model of oscillator neural networks that allows for the expression of a nonfiring state. Analyzing both equilibrium states and dynamical properties in recalling processes, we find that the system possesses good associative memory.

  17. Increased functional connectivity within memory networks following memory rehabilitation in multiple sclerosis.

    PubMed

    Leavitt, Victoria M; Wylie, Glenn R; Girgis, Peter A; DeLuca, John; Chiaravalloti, Nancy D

    2014-09-01

    Identifying effective behavioral treatments to improve memory in persons with learning and memory impairment is a primary goal for neurorehabilitation researchers. Memory deficits are the most common cognitive symptom in multiple sclerosis (MS), and hold negative professional and personal consequences for people who are often in the prime of their lives when diagnosed. A 10-session behavioral treatment, the modified Story Memory Technique (mSMT), was studied in a randomized, placebo-controlled clinical trial. Behavioral improvements and increased fMRI activation were shown after treatment. Here, connectivity within the neural networks underlying memory function was examined with resting-state functional connectivity (RSFC) in a subset of participants from the clinical trial. We hypothesized that the treatment would result in increased integrity of connections within two primary memory networks of the brain, the hippocampal memory network, and the default network (DN). Seeds were placed in left and right hippocampus, and the posterior cingulate cortex. Increased connectivity was found between left hippocampus and cortical regions specifically involved in memory for visual imagery, as well as among critical hubs of the DN. These results represent the first evidence for efficacy of a behavioral intervention to impact the integrity of neural networks subserving memory functions in persons with MS.

  18. Use long short-term memory to enhance Internet of Things for combined sewer overflow monitoring

    NASA Astrophysics Data System (ADS)

    Zhang, Duo; Lindholm, Geir; Ratnaweera, Harsha

    2018-01-01

    Combined sewer overflow (CSO) causes severe water pollution, urban flooding and reduced treatment plant efficiency. Understanding the behavior of CSO structures is vital for urban flooding prevention and overflow control. Neural networks have been extensively applied in water resource related fields. In this study, we collect data from an Internet of Things system monitoring a CSO structure and build different neural network models for simulating and predicting the water level of the CSO structure. Through a comparison of four different neural networks, namely the multilayer perceptron (MLP), wavelet neural network (WNN), long short-term memory (LSTM) and gated recurrent unit (GRU), the LSTM and GRU present superior capabilities for multi-step-ahead time series prediction. Furthermore, the GRU achieves prediction performance similar to the LSTM with a quicker learning curve.
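    The gated recurrent unit that matched LSTM accuracy with faster training can be sketched as a single forward step in NumPy. This is a generic GRU cell with illustrative dimensions, not the study's model; the three input features are a made-up stand-in for the monitored quantities.

    ```python
    import numpy as np

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    def gru_step(x, h, params):
        """One GRU step: update gate z, reset gate r, candidate state h_tilde."""
        Wz, Uz, Wr, Ur, Wh, Uh = params
        z = sigmoid(Wz @ x + Uz @ h)        # how much of the state to rewrite
        r = sigmoid(Wr @ x + Ur @ h)        # how much history the candidate sees
        h_tilde = np.tanh(Wh @ x + Uh @ (r * h))
        return (1 - z) * h + z * h_tilde    # blend old state and candidate

    rng = np.random.default_rng(1)
    n_in, n_hid = 3, 5   # illustrative: e.g. rainfall, inflow, level -> 5 hidden units
    params = [rng.normal(scale=0.1, size=s)
              for s in [(n_hid, n_in), (n_hid, n_hid)] * 3]

    h = np.zeros(n_hid)
    for x in rng.normal(size=(10, n_in)):   # run over a short input sequence
        h = gru_step(x, h, params)
    print(h.shape)  # (5,)
    ```

    The GRU's quicker learning curve noted in the abstract is commonly attributed to its having two gates instead of the LSTM's three, i.e. fewer parameters to fit per hidden unit.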

  19. Sex differences in the neural basis of emotional memories.

    PubMed

    Canli, Turhan; Desmond, John E; Zhao, Zuo; Gabrieli, John D E

    2002-08-06

    Psychological studies have found better memory in women than men for emotional events, but the neural basis for this difference is unknown. We used event-related functional MRI to assess whether sex differences in memory for emotional stimuli are associated with activation of different neural systems in men and women. Brain activation in 12 men and 12 women was recorded while they rated their experience of emotional arousal in response to neutral and emotionally negative pictures. In a recognition memory test 3 weeks after scanning, highly emotional pictures were remembered best, and remembered better by women than by men. Men and women activated different neural circuits to encode stimuli effectively into memory even when the analysis was restricted to pictures rated equally arousing by both groups. Men activated significantly more structures than women in a network that included the right amygdala, whereas women activated significantly fewer structures in a network that included the left amygdala. Women had significantly more brain regions where activation correlated with both ongoing evaluation of emotional experience and with subsequent memory for the most emotionally arousing pictures. Greater overlap in brain regions sensitive to current emotion and contributing to subsequent memory may be a neural mechanism for emotions to enhance memory more powerfully in women than in men.

  20. Neural-Network Simulator

    NASA Technical Reports Server (NTRS)

    Mitchell, Paul H.

    1991-01-01

    F77NNS (FORTRAN 77 Neural Network Simulator) computer program simulates popular back-error-propagation neural network. Designed to take advantage of vectorization when used on computers having this capability, also used on any computer equipped with ANSI-77 FORTRAN Compiler. Problems involving matching of patterns or mathematical modeling of systems fit class of problems F77NNS designed to solve. Program has restart capability so neural network solved in stages suitable to user's resources and desires. Enables user to customize patterns of connections between layers of network. Size of neural network F77NNS applied to limited only by amount of random-access memory available to user.

  1. Using chaotic artificial neural networks to model memory in the brain

    NASA Astrophysics Data System (ADS)

    Aram, Zainab; Jafari, Sajad; Ma, Jun; Sprott, Julien C.; Zendehrouh, Sareh; Pham, Viet-Thanh

    2017-03-01

    In the current study, a novel model for human memory is proposed based on the chaotic dynamics of artificial neural networks. This new model explains a biological fact about memory which is not yet explained by any other model: There are theories that the brain normally works in a chaotic mode, while during attention it shows ordered behavior. This model uses the periodic windows observed in a previously proposed model for the brain to store and then recollect the information.

  2. Changes in neural network homeostasis trigger neuropsychiatric symptoms.

    PubMed

    Winkelmann, Aline; Maggio, Nicola; Eller, Joanna; Caliskan, Gürsel; Semtner, Marcus; Häussler, Ute; Jüttner, René; Dugladze, Tamar; Smolinsky, Birthe; Kowalczyk, Sarah; Chronowska, Ewa; Schwarz, Günter; Rathjen, Fritz G; Rechavi, Gideon; Haas, Carola A; Kulik, Akos; Gloveli, Tengis; Heinemann, Uwe; Meier, Jochen C

    2014-02-01

    The mechanisms that regulate the strength of synaptic transmission and intrinsic neuronal excitability are well characterized; however, the mechanisms that promote disease-causing neural network dysfunction are poorly defined. We generated mice with targeted neuron type-specific expression of a gain-of-function variant of the neurotransmitter receptor for glycine (GlyR) that is found in hippocampectomies from patients with temporal lobe epilepsy. In this mouse model, targeted expression of gain-of-function GlyR in terminals of glutamatergic cells or in parvalbumin-positive interneurons persistently altered neural network excitability. The increased network excitability associated with gain-of-function GlyR expression in glutamatergic neurons resulted in recurrent epileptiform discharge, which provoked cognitive dysfunction and memory deficits without affecting bidirectional synaptic plasticity. In contrast, decreased network excitability due to gain-of-function GlyR expression in parvalbumin-positive interneurons resulted in an anxiety phenotype, but did not affect cognitive performance or discriminative associative memory. Our animal model unveils neuron type-specific effects on cognition, formation of discriminative associative memory, and emotional behavior in vivo. Furthermore, our data identify a presynaptic disease-causing molecular mechanism that impairs homeostatic regulation of neural network excitability and triggers neuropsychiatric symptoms.

  3. Periodic oscillation of higher-order bidirectional associative memory neural networks with periodic coefficients and delays

    NASA Astrophysics Data System (ADS)

    Ren, Fengli; Cao, Jinde

    2007-03-01

    In this paper, several sufficient conditions are obtained ensuring existence, global attractivity and global asymptotic stability of the periodic solution for the higher-order bidirectional associative memory neural networks with periodic coefficients and delays by using the continuation theorem of Mawhin's coincidence degree theory, the Lyapunov functional and the non-singular M-matrix. Two examples are exploited to illustrate the effectiveness of the proposed criteria. These results are more effective than the ones in the literature for some neural networks, and can be applied to the design of globally attractive or globally asymptotically stable networks and thus have important significance in both theory and applications.

  4. Optical resonators and neural networks

    NASA Astrophysics Data System (ADS)

    Anderson, Dana Z.

    1986-08-01

    It may be possible to implement neural network models using continuous-field optical architectures. These devices offer the inherent parallelism of propagating waves and an information density in principle dictated by the wavelength of light and the quality of the bulk optical elements. Few components are needed to construct a relatively large equivalent network. Various associative memories based on optical resonators have been demonstrated in the literature; a ring resonator design is discussed in detail here. Information is stored in a holographic medium and recalled through a competitive process in the gain medium supplying energy to the ring resonator. The resonator memory is the first realized example of a neural network function implemented with this kind of architecture.

  5. Local interconnection neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang Jiajun; Zhang Li; Yan Dapen

    1993-06-01

    The idea of a local interconnection neural network (LINN) is presented and compared with the globally interconnected Hopfield model. Under the storage limit requirement, LINN is shown to offer the same associative memory capability as the globally interconnected neural network while having a much smaller interconnection matrix. LINN can be readily implemented optically using currently available spatial light modulators.

  6. Deciphering Neural Codes of Memory during Sleep.

    PubMed

    Chen, Zhe; Wilson, Matthew A

    2017-05-01

    Memories of experiences are stored in the cerebral cortex. Sleep is critical for the consolidation of hippocampal memory of wake experiences into the neocortex. Understanding representations of neural codes of hippocampal-neocortical networks during sleep would reveal important circuit mechanisms in memory consolidation and provide novel insights into memory and dreams. Although sleep-associated ensemble spike activity has been investigated, identifying the content of memory in sleep remains challenging. Here we revisit important experimental findings on sleep-associated memory (i.e., neural activity patterns in sleep that reflect memory processing) and review computational approaches to the analysis of sleep-associated neural codes (SANCs). We focus on two analysis paradigms for sleep-associated memory and propose a new unsupervised learning framework ('memory first, meaning later') for unbiased assessment of SANCs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Similar patterns of neural activity predict memory function during encoding and retrieval.

    PubMed

    Kragel, James E; Ezzyat, Youssef; Sperling, Michael R; Gorniak, Richard; Worrell, Gregory A; Berry, Brent M; Inman, Cory; Lin, Jui-Jui; Davis, Kathryn A; Das, Sandhitsu R; Stein, Joel M; Jobst, Barbara C; Zaghloul, Kareem A; Sheth, Sameer A; Rizzuto, Daniel S; Kahana, Michael J

    2017-07-15

    Neural networks that span the medial temporal lobe (MTL), prefrontal cortex, and posterior cortical regions are essential to episodic memory function in humans. Encoding and retrieval are supported by the engagement of both distinct neural pathways across the cortex and common structures within the medial temporal lobes. However, the degree to which memory performance reflects neural processing common to encoding and retrieval remains to be determined. To identify neural signatures of successful memory function, we administered a delayed free-recall task to 187 neurosurgical patients implanted with subdural or intraparenchymal depth electrodes. We developed multivariate classifiers to identify patterns of spectral power across the brain that independently predicted successful episodic encoding and retrieval. During encoding and retrieval, patterns of increased high frequency activity in prefrontal, MTL, and inferior parietal cortices, accompanied by widespread decreases in low frequency power across the brain, predicted successful memory function. Using a cross-decoding approach, we demonstrate the ability to predict memory function across distinct phases of the free-recall task. Furthermore, we demonstrate that classifiers that combine information from both encoding and retrieval states can outperform task-independent models. These findings suggest that the engagement of a core memory network during either encoding or retrieval shapes the ability to remember the past, despite distinct neural interactions that facilitate encoding and retrieval. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. The neural signature of emotional memories in serial crimes.

    PubMed

    Chassy, Philippe

    2017-10-01

    Neural plasticity is the process whereby semantic information and emotional responses are stored in neural networks. It is hypothesized that the neural networks built over time to encode the sexual fantasies that motivate serial killers to act should display a unique, detectable activation pattern. The pathological neural watermark hypothesis posits that such networks comprise activation of brain sites that reflect four cognitive components: autobiographical memory, sexual arousal, aggression, and control over aggression. The neural sites performing these cognitive functions have been successfully identified by previous research. The key findings are reviewed to hypothesise the typical pattern of activity that serial killers should display. Through the integration of biological findings into one framework, the neural approach proposed in this paper is in stark contrast with the many theories accounting for serial killers that offer non-medical taxonomies. The pathological neural watermark hypothesis offers a new framework to understand and detect deviant individuals. The technical and legal issues are briefly discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. As Working Memory Grows: A Developmental Account of Neural Bases of Working Memory Capacity in 5- to 8-Year Old Children and Adults.

    PubMed

    Kharitonova, Maria; Winter, Warren; Sheridan, Margaret A

    2015-09-01

    Working memory develops slowly: Even by age 8, children are able to maintain only half the number of items that adults can remember. Neural substrates that support performance on working memory tasks also have a slow developmental trajectory and typically activate to a lesser extent in children, relative to adults. Little is known about why younger participants elicit less neural activation. This may be due to maturational differences, differences in behavioral performance, or both. Here we investigate the neural correlates of working memory capacity in children (ages 5-8) and adults using a visual working memory task with parametrically increasing loads (from one to four items) using fMRI. This task allowed us to estimate working memory capacity limit for each group. We found that both age groups increased the activation of frontoparietal networks with increasing working memory loads, until working memory capacity was reached. Because children's working memory capacity limit was half of that for adults, the plateau occurred at lower loads for children. Had a parametric increase in load not been used, this would have given an impression of less activation overall and less load-dependent activation for children relative to adults. Our findings suggest that young children and adults recruit similar frontoparietal networks at working memory loads that do not exceed capacity and highlight the need to consider behavioral performance differences when interpreting developmental differences in neural activation.

  10. Slot-like capacity and resource-like coding in a neural model of multiple-item working memory.

    PubMed

    Standage, Dominic; Pare, Martin

    2018-06-27

    For the past decade, research on the storage limitations of working memory has been dominated by two fundamentally different hypotheses. On the one hand, the contents of working memory may be stored in a limited number of `slots', each with a fixed resolution. On the other hand, any number of items may be stored, but with decreasing resolution. These two hypotheses have been invaluable in characterizing the computational structure of working memory, but neither provides a complete account of the available experimental data, nor speaks to the neural basis of the limitations it characterizes. To address these shortcomings, we simulated a multiple-item working memory task with a cortical network model, the cellular resolution of which allowed us to quantify the coding fidelity of memoranda as a function of memory load, as measured by the discriminability, regularity and reliability of simulated neural spiking. Our simulations account for a wealth of neural and behavioural data from human and non-human primate studies, and they demonstrate that feedback inhibition lowers both capacity and coding fidelity. Because the strength of inhibition scales with the number of items stored by the network, increasing this number progressively lowers fidelity until capacity is reached. Crucially, the model makes specific, testable predictions for neural activity on multiple-item working memory tasks.
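The two hypotheses this record contrasts make different quantitative predictions for recall precision as a function of memory load. A minimal sketch, with made-up parameter values (capacity K and baseline error SD are assumptions for illustration, not the paper's fits):

```python
# Sketch of the two storage hypotheses' precision-vs-load predictions.
# Slot model: a fixed number of slots K, each with fixed resolution; items
# beyond K are not stored at all (pure guessing). Resource model: every item
# is stored, but precision (1/variance) is divided among them.

K = 3            # slot capacity (assumed)
BASE_SD = 5.0    # recall-error SD for a single item (assumed, arbitrary units)

def slot_sd(load):
    # Stored items keep fixed resolution; above capacity, some items
    # are simply not stored (returned as None here).
    return BASE_SD if load <= K else None

def resource_sd(load):
    # Equal division of precision across items: SD grows as sqrt(load).
    return BASE_SD * load ** 0.5

for load in (1, 2, 4):
    print(load, slot_sd(load), resource_sd(load))
# 1 -> 5.0 vs 5.0; 2 -> 5.0 vs ~7.07; 4 -> guessing vs 10.0
```

The cortical network model in the record effectively interpolates between these pictures: coding fidelity degrades with load (resource-like) until feedback inhibition imposes a hard capacity (slot-like).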

  11. Recall Performance for Content-Addressable Memory Using Adiabatic Quantum Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imam, Neena; Humble, Travis S.; McCaskey, Alex

A content-addressable memory (CAM) stores key-value associations such that the key is recalled by providing its associated value. While CAM recall is traditionally performed using recurrent neural network models, we show how to solve this problem using adiabatic quantum optimization. Our approach maps the recurrent neural network to a commercially available quantum processing unit by taking advantage of the common underlying Ising spin model. We then assess the accuracy of the quantum processor to store key-value associations by quantifying recall performance against an ensemble of problem sets. We observe that different learning rules from the neural network community influence recall accuracy but performance appears to be limited by potential noise in the processor. The strong connection established between quantum processors and neural network problems supports the growing intersection of these two ideas.
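The mapping this record exploits — recurrent-network recall as Ising energy minimization — can be illustrated classically. A minimal sketch (pure Python, a classical stand-in for the adiabatic-optimization formulation; the Hebbian rule and single stored pattern are illustrative choices):

```python
# Sketch: content-addressable recall as Ising energy minimization.
# Hebbian learning stores patterns in the couplings; aligning each spin
# with its local field never raises the Ising energy
#   E = -1/2 * sum_ij w_ij * s_i * s_j,
# so greedy sweeps descend to the stored pattern.

def hebbian_weights(patterns):
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(w, state, sweeps=10):
    n = len(state)
    s = list(state)
    for _ in range(sweeps):
        for i in range(n):                    # asynchronous updates
            field = sum(w[i][j] * s[j] for j in range(n))
            s[i] = 1 if field >= 0 else -1    # align with local field
    return s

pattern = [1, -1, 1, 1, -1, -1, 1, -1]
w = hebbian_weights([pattern])
noisy = list(pattern)
noisy[0] = -noisy[0]                          # corrupt one "bit"
print(recall(w, noisy) == pattern)            # True: the key is recovered
```

The quantum approach replaces these greedy sweeps with an anneal over the same Ising energy landscape, which is why the learning rule (the choice of w) carries over directly.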

  12. Livermore Big Artificial Neural Network Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Essen, Brian Van; Jacobs, Sam; Kim, Hyojin

    2016-07-01

LBANN is a toolkit that is designed to train artificial neural networks efficiently on high performance computing architectures. It is optimized to take advantage of key High Performance Computing features to accelerate neural network training. Specifically it is optimized for low-latency, high bandwidth interconnects, node-local NVRAM, node-local GPU accelerators, and high bandwidth parallel file systems. It is built on top of the open source Elemental distributed-memory dense and sparse-direct linear algebra and optimization library that is released under the BSD license. The algorithms contained within LBANN are drawn from the academic literature and implemented to work within a distributed-memory framework.

  13. Neural network based system for equipment surveillance

    DOEpatents

    Vilim, Richard B.; Gross, Kenneth C.; Wegerich, Stephan W.

    1998-01-01

A method and system for performing surveillance of transient signals of an industrial device to ascertain its operating state. The method and system involve reading training data into a memory and determining neural network weighting values until the neural network outputs are close to the target outputs. If the target outputs are inadequate, wavelet parameters are determined to yield neural network outputs close to the desired set of target outputs; signals characteristic of an industrial process are then provided and compared with the neural network output to evaluate the operating state of the industrial process.

  14. Neural network based system for equipment surveillance

    DOEpatents

    Vilim, R.B.; Gross, K.C.; Wegerich, S.W.

    1998-04-28

A method and system are disclosed for performing surveillance of transient signals of an industrial device to ascertain its operating state. The method and system involve reading training data into a memory and determining neural network weighting values until the neural network outputs are close to the target outputs. If the target outputs are inadequate, wavelet parameters are determined to yield neural network outputs close to the desired set of target outputs; signals characteristic of an industrial process are then provided and compared with the neural network output to evaluate the operating state of the industrial process. 33 figs.

  15. Random Boolean networks for autoassociative memory: Optimization and sequential learning

    NASA Astrophysics Data System (ADS)

    Sherrington, D.; Wong, K. Y. M.

Conventional neural networks are based on synaptic storage of information, even when the neural states are discrete and bounded. In general, the set of potential local operations is much greater. Here we discuss some aspects of the properties of networks of binary neurons with more general Boolean functions controlling the local dynamics. Two specific aspects are emphasised: (i) optimization in the presence of noise and (ii) a simple model for short-term memory exhibiting primacy and recency in the recall of sequentially taught patterns.
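The generalization this record describes — binary neurons updated by arbitrary Boolean functions of their inputs rather than weighted sums — can be sketched as a random Boolean network. A threshold-weight neuron is the special case where the lookup table encodes the sign of a weighted sum; the wiring and tables below are random, purely for illustration.

```python
# Sketch of a random Boolean network: each binary neuron updates via an
# arbitrary Boolean function (a lookup table) of K randomly chosen inputs.
import random

random.seed(1)
N, K = 8, 3
inputs = [random.sample(range(N), K) for _ in range(N)]            # wiring
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    out = []
    for i in range(N):
        addr = 0
        for src in inputs[i]:          # pack the K input bits into an index
            addr = (addr << 1) | state[src]
        out.append(tables[i][addr])    # table lookup replaces sum + threshold
    return out

state = [random.randint(0, 1) for _ in range(N)]
for _ in range(20):                    # deterministic dynamics settle onto
    state = step(state)                # a fixed point or a short cycle
print(state)
```

Because each table has 2^(2^K) possible settings versus the far smaller family reachable by weights and a threshold, such networks have many more candidate attractor landscapes to optimize over, which is the design space the record explores.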

  16. The functional neuroanatomy of autobiographical memory: A meta-analysis

    PubMed Central

    Svoboda, Eva; McKinnon, Margaret C.; Levine, Brian

    2007-01-01

    Autobiographical memory (AM) entails a complex set of operations, including episodic memory, self-reflection, emotion, visual imagery, attention, executive functions, and semantic processes. The heterogeneous nature of AM poses significant challenges in capturing its behavioral and neuroanatomical correlates. Investigators have recently turned their attention to the functional neuroanatomy of AM. We used the effect-location method of meta-analysis to analyze data from 24 functional imaging studies of AM. The results indicated a core neural network of left-lateralized regions, including the medial and ventrolateral prefrontal, medial and lateral temporal and retrosplenial/posterior cingulate cortices, the temporoparietal junction and the cerebellum. Secondary and tertiary regions, less frequently reported in imaging studies of AM, are also identified. We examined the neural correlates of putative component processes in AM, including, executive functions, self-reflection, episodic remembering and visuospatial processing. We also separately analyzed the effect of select variables on the AM network across individual studies, including memory age, qualitative factors (personal significance, level of detail and vividness), semantic and emotional content, and the effect of reference conditions. We found that memory age effects on medial temporal lobe structures may be modulated by qualitative aspects of memory. Studies using rest as a control task masked process-specific components of the AM neural network. Our findings support a neural distinction between episodic and semantic memory in AM. Finally, emotional events produced a shift in lateralization of the AM network with activation observed in emotion-centered regions and deactivation (or lack of activation) observed in regions associated with cognitive processes. PMID:16806314

  17. Consolidation in older adults depends upon competition between resting-state networks

    PubMed Central

    Jacobs, Heidi I. L.; Dillen, Kim N. H.; Risius, Okka; Göreci, Yasemin; Onur, Oezguer A.; Fink, Gereon R.; Kukolja, Juraj

    2015-01-01

    Memory encoding and retrieval problems are inherent to aging. To date, however, the effect of aging upon the neural correlates of forming memory traces remains poorly understood. Resting-state fMRI connectivity can be used to investigate initial consolidation. We compared within and between network connectivity differences between healthy young and older participants before encoding, after encoding and before retrieval by means of resting-state fMRI. Alterations over time in the between-network connectivity analyses correlated with retrieval performance, whereas within-network connectivity did not: a higher level of negative coupling or competition between the default mode and the executive networks during the after encoding condition was associated with increased retrieval performance in the older adults, but not in the young group. Data suggest that the effective formation of memory traces depends on an age-dependent, dynamic reorganization of the interaction between multiple, large-scale functional networks. Our findings demonstrate that a cross-network based approach can further the understanding of the neural underpinnings of aging-associated memory decline. PMID:25620930

  18. A network approach for modulating memory processes via direct and indirect brain stimulation: Toward a causal approach for the neural basis of memory.

    PubMed

    Kim, Kamin; Ekstrom, Arne D; Tandon, Nitin

    2016-10-01

    Electrical stimulation of the brain is a unique tool to perturb endogenous neural signals, allowing us to evaluate the necessity of given neural processes to cognitive processing. An important issue, gaining increasing interest in the literature, is whether and how stimulation can be employed to selectively improve or disrupt declarative memory processes. Here, we provide a comprehensive review of both invasive and non-invasive stimulation studies aimed at modulating memory performance. The majority of past studies suggest that invasive stimulation of the hippocampus impairs memory performance; similarly, most non-invasive studies show that disrupting frontal or parietal regions also impairs memory performance, suggesting that these regions also play necessary roles in declarative memory. On the other hand, a handful of both invasive and non-invasive studies have also suggested modest improvements in memory performance following stimulation. These studies typically target brain regions connected to the hippocampus or other memory "hubs," which may affect endogenous activity in connected areas like the hippocampus, suggesting that to augment declarative memory, altering the broader endogenous memory network activity is critical. Together, studies reporting memory improvements/impairments are consistent with the idea that a network of distinct brain "hubs" may be crucial for successful memory encoding and retrieval rather than a single primary hub such as the hippocampus. Thus, it is important to consider neurostimulation from the network perspective, rather than from a purely localizationalist viewpoint. We conclude by proposing a novel approach to neurostimulation for declarative memory modulation that aims to facilitate interactions between multiple brain "nodes" underlying memory rather than considering individual brain regions in isolation. Copyright © 2016. Published by Elsevier Inc.

  19. Phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network.

    PubMed

    Kanazawa, Yuji; Nakamura, Kimihiro; Ishii, Toru; Aso, Toshihiko; Yamazaki, Hiroshi; Omori, Koichi

    2017-01-01

Sign language is an essential medium for everyday social interaction for deaf people and plays a critical role in verbal learning. In particular, language development in those people should heavily rely on the verbal short-term memory (STM) via sign language. Most previous studies compared neural activations during signed language processing in deaf signers and those during spoken language processing in hearing speakers. For sign language users, it thus remains unclear how visuospatial inputs are converted into the verbal STM operating in the left-hemisphere language network. Using functional magnetic resonance imaging, the present study investigated neural activation while bilinguals of spoken and signed language were engaged in a sequence memory span task. On each trial, participants viewed a nonsense syllable sequence presented either as written letters or as fingerspelling (4-7 syllables in length) and then held the syllable sequence for 12 s. Behavioral analysis revealed that participants relied on phonological memory while holding verbal information regardless of the type of input modality. At the neural level, this maintenance stage broadly activated the left-hemisphere language network, including the inferior frontal gyrus, supplementary motor area, superior temporal gyrus and inferior parietal lobule, for both letter and fingerspelling conditions. Interestingly, while most participants reported that they relied on phonological memory during maintenance, direct comparisons between letters and fingers revealed strikingly different patterns of neural activation during the same period. Namely, the effortful maintenance of fingerspelling inputs relative to letter inputs activated the left superior parietal lobule and dorsal premotor area, i.e., brain regions known to play a role in visuomotor analysis of hand/arm movements. These findings suggest that the dorsal visuomotor neural system subserves verbal learning via sign language by relaying gestural inputs to the classical left-hemisphere language network.

  20. Analogue spin-orbit torque device for artificial-neural-network-based associative memory operation

    NASA Astrophysics Data System (ADS)

    Borders, William A.; Akima, Hisanao; Fukami, Shunsuke; Moriya, Satoshi; Kurihara, Shouta; Horio, Yoshihiko; Sato, Shigeo; Ohno, Hideo

    2017-01-01

    We demonstrate associative memory operations reminiscent of the brain using nonvolatile spintronics devices. Antiferromagnet-ferromagnet bilayer-based Hall devices, which show analogue-like spin-orbit torque switching under zero magnetic fields and behave as artificial synapses, are used. An artificial neural network is used to associate memorized patterns from their noisy versions. We develop a network consisting of a field-programmable gate array and 36 spin-orbit torque devices. An effect of learning on associative memory operations is successfully confirmed for several 3 × 3-block patterns. A discussion on the present approach for realizing spintronics-based artificial intelligence is given.

  1. Balanced Cortical Microcircuitry for Spatial Working Memory Based on Corrective Feedback Control

    PubMed Central

    2014-01-01

    A hallmark of working memory is the ability to maintain graded representations of both the spatial location and amplitude of a memorized stimulus. Previous work has identified a neural correlate of spatial working memory in the persistent maintenance of spatially specific patterns of neural activity. How such activity is maintained by neocortical circuits remains unknown. Traditional models of working memory maintain analog representations of either the spatial location or the amplitude of a stimulus, but not both. Furthermore, although most previous models require local excitation and lateral inhibition to maintain spatially localized persistent activity stably, the substrate for lateral inhibitory feedback pathways is unclear. Here, we suggest an alternative model for spatial working memory that is capable of maintaining analog representations of both the spatial location and amplitude of a stimulus, and that does not rely on long-range feedback inhibition. The model consists of a functionally columnar network of recurrently connected excitatory and inhibitory neural populations. When excitation and inhibition are balanced in strength but offset in time, drifts in activity trigger spatially specific negative feedback that corrects memory decay. The resulting networks can temporally integrate inputs at any spatial location, are robust against many commonly considered perturbations in network parameters, and, when implemented in a spiking model, generate irregular neural firing characteristic of that observed experimentally during persistent activity. This work suggests balanced excitatory–inhibitory memory circuits implementing corrective negative feedback as a substrate for spatial working memory. PMID:24828633

  2. Synchronization and long-time memory in neural networks with inhibitory hubs and synaptic plasticity

    NASA Astrophysics Data System (ADS)

    Bertolotti, Elena; Burioni, Raffaella; di Volo, Matteo; Vezzani, Alessandro

    2017-01-01

We investigate the dynamical role of inhibitory and highly connected nodes (hubs) in synchronization and input processing of leaky-integrate-and-fire neural networks with short term synaptic plasticity. We take advantage of a heterogeneous mean-field approximation to encode the role of network structure and we tune the fraction of inhibitory neurons fI and their connectivity level to investigate the cooperation between hub features and inhibition. We show that, depending on fI, highly connected inhibitory nodes strongly drive the synchronization properties of the overall network through dynamical transitions from synchronous to asynchronous regimes. Furthermore, a metastable regime with long memory of external inputs emerges for a specific fraction of hub inhibitory neurons, underlining the role of inhibition and connectivity also for input processing in neural networks.

  3. Abnormal Functional Activation and Connectivity in the Working Memory Network in Early-Onset Schizophrenia

    ERIC Educational Resources Information Center

    Kyriakopoulos, Marinos; Dima, Danai; Roiser, Jonathan P.; Corrigall, Richard; Barker, Gareth J.; Frangou, Sophia

    2012-01-01

    Objective: Disruption within the working memory (WM) neural network is considered an integral feature of schizophrenia. The WM network, and the dorsolateral prefrontal cortex (DLPFC) in particular, undergo significant remodeling in late adolescence. Potential interactions between developmental changes in the WM network and disease-related…

  4. Large-Scale Fluorescence Calcium-Imaging Methods for Studies of Long-Term Memory in Behaving Mammals

    PubMed Central

    Jercog, Pablo; Rogerson, Thomas; Schnitzer, Mark J.

    2016-01-01

    During long-term memory formation, cellular and molecular processes reshape how individual neurons respond to specific patterns of synaptic input. It remains poorly understood how such changes impact information processing across networks of mammalian neurons. To observe how networks encode, store, and retrieve information, neuroscientists must track the dynamics of large ensembles of individual cells in behaving animals, over timescales commensurate with long-term memory. Fluorescence Ca2+-imaging techniques can monitor hundreds of neurons in behaving mice, opening exciting avenues for studies of learning and memory at the network level. Genetically encoded Ca2+ indicators allow neurons to be targeted by genetic type or connectivity. Chronic animal preparations permit repeated imaging of neural Ca2+ dynamics over multiple weeks. Together, these capabilities should enable unprecedented analyses of how ensemble neural codes evolve throughout memory processing and provide new insights into how memories are organized in the brain. PMID:27048190

  5. Mittag-Leffler synchronization of delayed fractional-order bidirectional associative memory neural networks with discontinuous activations: state feedback control and impulsive control schemes.

    PubMed

    Ding, Xiaoshuai; Cao, Jinde; Zhao, Xuan; Alsaadi, Fuad E

    2017-08-01

    This paper is concerned with the drive-response synchronization for a class of fractional-order bidirectional associative memory neural networks with time delays, as well as in the presence of discontinuous activation functions. The global existence of solution under the framework of Filippov for such networks is firstly obtained based on the fixed-point theorem for condensing map. Then the state feedback and impulsive controllers are, respectively, designed to ensure the Mittag-Leffler synchronization of these neural networks and two new synchronization criteria are obtained, which are expressed in terms of a fractional comparison principle and Razumikhin techniques. Numerical simulations are presented to validate the proposed methodologies.

  6. Robust stability for stochastic bidirectional associative memory neural networks with time delays

    NASA Astrophysics Data System (ADS)

    Shu, H. S.; Lv, Z. W.; Wei, G. L.

    2008-02-01

In this paper, the asymptotic stability is considered for a class of uncertain stochastic bidirectional associative memory neural networks with time delays and parameter uncertainties. The delays are time-invariant, and the uncertainties are norm-bounded and enter into all network parameters. The aim of this paper is to establish easily verifiable conditions under which the delayed neural network is robustly asymptotically stable in the mean square for all admissible parameter uncertainties. By employing a Lyapunov-Krasovskii functional and conducting the stochastic analysis, a linear matrix inequality (LMI) approach is developed to derive the stability criteria. The proposed criteria can be easily checked by the Matlab LMI toolbox. A numerical example is given to demonstrate the usefulness of the proposed criteria.
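Checking the paper's LMI criteria requires a semidefinite-programming solver, but the flavor of "easily verifiable conditions" can be conveyed with a much cruder classical test: delay-independent stability holds for a delayed network with 1-Lipschitz activations whenever each neuron's self-decay dominates the absolute row sums of the connection matrices (an M-matrix-type condition). This is far weaker than the paper's criteria, and the matrices below are made-up examples.

```python
# Crude sufficient check for delay-independent stability of
#   dx/dt = -D x + A f(x(t)) + B f(x(t - tau)),   |f'| <= 1:
# stable if each decay rate dominates the combined absolute row sums of
# the instantaneous and delayed connection matrices (diagonal dominance).

def delay_independent_stable(D, A, B):
    n = len(D)
    return all(
        D[i] > sum(abs(A[i][j]) + abs(B[i][j]) for j in range(n))
        for i in range(n)
    )

D = [2.0, 3.0]                       # neuron self-decay rates (example)
A = [[0.3, -0.4], [0.2, 0.5]]        # instantaneous connections (example)
B = [[0.5, 0.2], [-0.6, 0.4]]        # delayed connections (example)
print(delay_independent_stable(D, A, B))   # True: each row is dominated
```

LMI-based criteria like the paper's are less conservative than this dominance test, at the cost of needing numerical optimization (e.g., the Matlab LMI toolbox the abstract mentions).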

  7. Visuospatial working memory in very preterm and term born children--impact of age and performance.

    PubMed

    Mürner-Lavanchy, I; Ritter, B C; Spencer-Smith, M M; Perrig, W J; Schroth, G; Steinlin, M; Everts, R

    2014-07-01

Working memory is crucial for meeting the challenges of daily life and performing academic tasks, such as reading or arithmetic. Very preterm born children are at risk of low working memory capacity. The aim of this study was to examine the visuospatial working memory network of school-aged preterm children and to determine the effect of age and performance on the neural working memory network. Working memory was assessed in 41 very preterm born children and 36 term born controls (aged 7-12 years) using functional magnetic resonance imaging (fMRI) and neuropsychological assessment. While preterm children and controls showed equal working memory performance, preterm children showed less involvement of the right middle frontal gyrus, but higher fMRI activation in superior frontal regions than controls. The younger and low-performing preterm children presented an atypical working memory network whereas the older high-performing preterm children recruited a working memory network similar to the controls. Results suggest that younger and low-performing preterm children show signs of less neural efficiency in frontal brain areas. With increasing age and performance, compensational mechanisms seem to occur, so that in preterm children, the typical visuospatial working memory network is established by the age of 12 years. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  8. Nonvolatile Ionic Two-Terminal Memory Device

    NASA Technical Reports Server (NTRS)

    Williams, Roger M.

    1990-01-01

    Conceptual solid-state memory device nonvolatile and erasable and has only two terminals. Proposed device based on two effects: thermal phase transition and reversible intercalation of ions. Transfer of sodium ions between source of ions and electrical switching element increases or decreases electrical conductance of element, turning switch "on" or "off". Used in digital computers and neural-network computers. In neural networks, many small, densely packed switches function as erasable, nonvolatile synaptic elements.

  9. Neural Network Model For Fast Learning And Retrieval

    NASA Astrophysics Data System (ADS)

    Arsenault, Henri H.; Macukow, Bohdan

    1989-05-01

    An approach to learning in a multilayer neural network is presented. The proposed network learns by creating interconnections between the input layer and the intermediate layer. In one of the new storage prescriptions proposed, interconnections are excitatory (positive) only and the weights depend on the stored patterns. In the intermediate layer each mother cell is responsible for one stored pattern. Mutually interconnected neurons in the intermediate layer perform a winner-take-all operation, taking into account correlations between stored vectors. The performance of networks using this interconnection prescription is compared with two previously proposed schemes, one using inhibitory connections at the output and one using all-or-nothing interconnections. The network can be used as a content-addressable memory or as a symbolic substitution system that yields an arbitrarily defined output for any input. The training of a model to perform Boolean logical operations is also described. Computer simulations using the network as an autoassociative content-addressable memory show the model to be efficient. Content-addressable associative memories and neural logic modules can be combined to perform logic operations on highly corrupted data.
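The mother-cell scheme this record describes can be sketched in a few lines: each intermediate-layer unit is dedicated to one stored pattern, its excitatory input weights are the pattern itself, a winner-take-all step selects the best-matching unit, and that unit emits an arbitrarily defined output (the symbolic-substitution use case). This is an illustrative reconstruction with made-up patterns, not the paper's exact storage prescription.

```python
# Sketch of mother-cell content-addressable recall with winner-take-all.
stored = [[1, 0, 1, 1, 0], [0, 1, 1, 0, 1], [1, 1, 0, 0, 1]]
outputs = {0: "A", 1: "B", 2: "C"}   # arbitrary symbolic-substitution targets

def recall(probe):
    # Each mother cell sums excitatory input from active lines it matches
    # (excitatory-only weights equal to the stored pattern).
    scores = [sum(p * x for p, x in zip(pat, probe)) for pat in stored]
    winner = scores.index(max(scores))   # winner-take-all competition
    return outputs[winner]

print(recall([1, 0, 1, 0, 0]))   # corrupted version of pattern 0 -> "A"
```

Mapping each input to a designated output through the winning mother cell is what lets the same network serve either as an autoassociative memory (output = stored pattern) or as a logic/substitution module (output = arbitrary code).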

  10. The neural basis of involuntary episodic memories.

    PubMed

    Hall, Shana A; Rubin, David C; Miles, Amanda; Davis, Simon W; Wing, Erik A; Cabeza, Roberto; Berntsen, Dorthe

    2014-10-01

    Voluntary episodic memories require an intentional memory search, whereas involuntary episodic memories come to mind spontaneously without conscious effort. Cognitive neuroscience has largely focused on voluntary memory, leaving the neural mechanisms of involuntary memory largely unknown. We hypothesized that, because the main difference between voluntary and involuntary memory is the controlled retrieval processes required by the former, there would be greater frontal activity for voluntary than involuntary memories. Conversely, we predicted that other components of the episodic retrieval network would be similarly engaged in the two types of memory. During encoding, all participants heard sounds, half paired with pictures of complex scenes and half presented alone. During retrieval, paired and unpaired sounds were presented, panned to the left or to the right. Participants in the involuntary group were instructed to indicate the spatial location of the sound, whereas participants in the voluntary group were asked to additionally recall the pictures that had been paired with the sounds. All participants reported the incidence of their memories in a postscan session. Consistent with our predictions, voluntary memories elicited greater activity in dorsal frontal regions than involuntary memories, whereas other components of the retrieval network, including medial-temporal, ventral occipitotemporal, and ventral parietal regions were similarly engaged by both types of memories. These results clarify the distinct role of dorsal frontal and ventral occipitotemporal regions in predicting strategic retrieval and recalled information, respectively, and suggest that, although there are neural differences in retrieval, involuntary memories share neural components with established voluntary memory systems.

  11. The Neural Basis of Involuntary Episodic Memories

    PubMed Central

    Hall, Shana A.; Rubin, David C.; Miles, Amanda; Davis, Simon W.; Wing, Erik A.; Cabeza, Roberto; Berntsen, Dorthe

    2014-01-01

    Voluntary episodic memories require an intentional memory search, whereas involuntary episodic memories come to mind spontaneously without conscious effort. Cognitive neuroscience has largely focused on voluntary memory, leaving the neural mechanisms of involuntary memory largely unknown. We hypothesized that because the main difference between voluntary and involuntary memory is the controlled retrieval processes required by the former, there would be greater frontal activity for voluntary than involuntary memories. Conversely, we predicted that other components of the episodic retrieval network would be similarly engaged in the two types of memory. During encoding, all participants heard sounds, half paired with pictures of complex scenes and half presented alone. During retrieval, paired and unpaired sounds were presented panned to the left or to the right. Participants in the involuntary group were instructed to indicate the spatial location of the sound, whereas participants in the voluntary group were asked to additionally recall the pictures that had been paired with the sounds. All participants reported the incidence of their memories in a post-scan session. Consistent with our predictions, voluntary memories elicited greater activity in dorsal frontal regions than involuntary memories, whereas other components of the retrieval network, including medial temporal, ventral occipitotemporal, and ventral parietal regions were similarly engaged by both types of memories. These results clarify the distinct role of dorsal frontal and ventral occipitotemporal regions in predicting strategic retrieval and recalled information, respectively, and suggest that while there are neural differences in retrieval, involuntary memories share neural components with established voluntary memory systems. PMID:24702453

  12. Associative memory and its cerebral correlates in Alzheimer's disease: Evidence for distinct deficits of relational and conjunctive memory

    PubMed Central

    Bastin, Christine; Bahri, Mohamed Ali; Miévis, Frédéric; Lemaire, Christian; Collette, Fabienne; Genon, Sarah; Simon, Jessica; Guillaume, Bénédicte; Diana, Rachel A.; Yonelinas, Andrew P.; Salmon, Eric

    2014-01-01

    This study investigated the impact of Alzheimer's disease (AD) on conjunctive and relational binding in episodic memory. Mild AD patients and controls had to remember item-color associations by imagining color either as a contextual association (relational memory) or as a feature of the item to be encoded (conjunctive memory). Patients' performance in each condition was correlated with cerebral metabolism measured by FDG-PET. The results showed that AD patients had an impaired capacity to remember item-color associations, with deficits in both relational and conjunctive memory. However, performance in the two kinds of associative memory varied independently across patients. Partial least square analyses revealed that poor conjunctive memory was related to hypometabolism in an anterior temporal-posterior fusiform brain network, whereas relational memory correlated with metabolism in regions of the default mode network. These findings support the hypothesis of distinct neural systems specialized in different types of associative memory and point to heterogeneous profiles of memory alteration in Alzheimer's disease as a function of damage to the respective neural networks. PMID:25172390

  13. Associative memory in phasing neuron networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nair, Niketh S; Bochove, Erik J.; Braiman, Yehuda

    2014-01-01

    We studied pattern formation in a network of coupled Hindmarsh-Rose model neurons and introduced a new model for associative memory retrieval using networks of Kuramoto oscillators. Hindmarsh-Rose Neural Networks can exhibit a rich set of collective dynamics that can be controlled by their connectivity. Specifically, we showed an instance of Hebb's rule where spiking was correlated with network topology. Based on this, we presented a simple model of associative memory in coupled phase oscillators.
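The Kuramoto-oscillator route to associative memory mentioned above can be sketched in a few lines: binary patterns are stored in Hebbian couplings, and retrieval corresponds to phase locking, with in-phase versus anti-phase oscillators encoding the two bit values. This is a minimal illustration of the general idea, not the authors' model; the network size, noise level, and integration parameters are all invented.

```python
import numpy as np

def hebb_couplings(patterns):
    # Hebbian rule: K[i, j] is proportional to the pattern correlation xi_i * xi_j
    P = np.array(patterns, dtype=float)
    K = P.T @ P / P.shape[1]
    np.fill_diagonal(K, 0.0)
    return K

def retrieve(K, theta, dt=0.05, steps=2000):
    # Euler-integrate identical-frequency Kuramoto dynamics; only the
    # phase differences (which encode the bits) matter.
    for _ in range(steps):
        diff = theta[None, :] - theta[:, None]        # theta_j - theta_i
        theta = theta + dt * (K * np.sin(diff)).sum(axis=1)
    return np.sign(np.cos(theta - theta[0]))          # read out bits relative to unit 0

rng = np.random.default_rng(0)
xi = np.sign(rng.standard_normal(16))                 # one stored +/-1 pattern
K = hebb_couplings([xi])
theta0 = np.where(xi > 0, 0.0, np.pi) + 0.3 * rng.standard_normal(16)
theta0[[2, 5]] += np.pi                               # corrupt two bits of the cue
recalled = retrieve(K, theta0)
overlap = abs(recalled @ xi) / xi.size                # 1.0 means perfect recall
```

Under the change of variables that absorbs the stored bits into the phases, these dynamics reduce to uniformly coupled Kuramoto oscillators, so the corrupted cue relaxes back to the stored phase pattern.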

  14. Differentiating true and false schematic memories in older adults.

    PubMed

    Webb, Christina E; Dennis, Nancy A

    2018-02-06

While schemas aid memory for schematically related information, the gist induced by the schema can also lead to high rates of false memories, especially in older adults. The neural mechanisms that support and differentiate true and false memories in aging are not well understood. The current study sought to clarify this, using a novel scene paradigm to investigate the role of schemas on true and false memories in older adults. Healthy older adults encoded schematic scenes (e.g., bathroom). At retrieval, participants were tested on their memory for both schematic and non-schematic targets and lures while fMRI data was collected. Results indicate that true memories were supported by the typical retrieval network, and activity in this network was greater for true than false memories. Schema specific retrieval was supported by mPFC, extending this common finding to aging. While no region differentiated false memories compared to correct rejections, results showed that individual differences in false memory rates were associated with variability in neural activity. The findings underscore the importance of elucidating the neural basis of cognition within older adults, as well as the specific contribution of individual differences to the neural basis of memory errors in aging.

  15. Artificial Neural Networks and Instructional Technology.

    ERIC Educational Resources Information Center

    Carlson, Patricia A.

    1991-01-01

    Artificial neural networks (ANN), part of artificial intelligence, are discussed. Such networks are fed sample cases (training sets), learn how to recognize patterns in the sample data, and use this experience in handling new cases. Two cognitive roles for ANNs (intelligent filters and spreading, associative memories) are examined. Prototypes…

  16. Orthogonal Patterns In A Binary Neural Network

    NASA Technical Reports Server (NTRS)

    Baram, Yoram

    1991-01-01

Report presents some recent developments in theory of binary neural networks. Subject matter relevant to associative (content-addressable) memories and to recognition of patterns - both of considerable importance in advancement of robotics and artificial intelligence. When probed by any pattern, network converges to one of stored patterns.

  17. Numerical Analysis of Modeling Based on Improved Elman Neural Network

    PubMed Central

    Jie, Shao

    2014-01-01

    A modeling based on the improved Elman neural network (IENN) is proposed to analyze the nonlinear circuits with the memory effect. The hidden layer neurons are activated by a group of Chebyshev orthogonal basis functions instead of sigmoid functions in this model. The error curves of the sum of squared error (SSE) varying with the number of hidden neurons and the iteration step are studied to determine the number of the hidden layer neurons. Simulation results of the half-bridge class-D power amplifier (CDPA) with two-tone signal and broadband signals as input have shown that the proposed behavioral modeling can reconstruct the system of CDPAs accurately and depict the memory effect of CDPAs well. Compared with Volterra-Laguerre (VL) model, Chebyshev neural network (CNN) model, and basic Elman neural network (BENN) model, the proposed model has better performance. PMID:25054172
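A toy sketch of the core idea above: replace sigmoid hidden units with a Chebyshev orthogonal basis and model a system with a one-step memory effect. This is not the paper's IENN or its CDPA application; the target function, polynomial order, and least-squares readout (in place of the paper's training procedure) are invented for illustration.

```python
import numpy as np

def chebyshev_features(x, order):
    # Chebyshev polynomials T_0..T_order via T_{k+1}(x) = 2x*T_k(x) - T_{k-1}(x);
    # these act as the hidden-layer basis in place of sigmoids.
    T = [np.ones_like(x), x]
    for _ in range(2, order + 1):
        T.append(2 * x * T[-1] - T[-2])
    return np.stack(T[: order + 1], axis=1)

rng = np.random.default_rng(1)
u = rng.uniform(-1, 1, 400)
# Invented target with a one-step "memory effect": cubic in the current input
# plus a linear term in the previous input.
y = 0.6 * u[1:] ** 3 - 0.3 * u[:-1]
# Elman-style context is emulated by concatenating features of the delayed input.
H = np.hstack([chebyshev_features(u[1:], 5), chebyshev_features(u[:-1], 5)])
w, *_ = np.linalg.lstsq(H, y, rcond=None)
sse = float(np.sum((H @ w - y) ** 2))
```

Because x^3 = (T_3 + 3*T_1)/4, the target lies exactly in the span of the basis, so the sum of squared error is numerically zero, which is the kind of SSE-versus-basis-size analysis the paper uses to pick the hidden-layer size.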

  18. Binary synaptic connections based on memory switching in a-Si:H for artificial neural networks

    NASA Technical Reports Server (NTRS)

    Thakoor, A. P.; Lamb, J. L.; Moopenn, A.; Khanna, S. K.

    1987-01-01

    A scheme for nonvolatile associative electronic memory storage with high information storage density is proposed which is based on neural network models and which uses a matrix of two-terminal passive interconnections (synapses). It is noted that the massive parallelism in the architecture would require the ON state of a synaptic connection to be unusually weak (highly resistive). Memory switching using a-Si:H along with ballast resistors patterned from amorphous Ge-metal alloys is investigated for a binary programmable read only memory matrix. The fabrication of a 1600 synapse test array of uniform connection strengths and a-Si:H switching elements is discussed.

  19. Cortical connectivity and memory performance in cognitive decline: A study via graph theory from EEG data.

    PubMed

    Vecchio, F; Miraglia, F; Quaranta, D; Granata, G; Romanello, R; Marra, C; Bramanti, P; Rossini, P M

    2016-03-01

Functional brain abnormalities including memory loss are found to be associated with pathological changes in connectivity and network neural structures. Alzheimer's disease (AD) interferes with memory formation from the molecular level, to synaptic functions and neural networks organization. Here, we determined whether brain connectivity of resting-state networks correlates with memory in patients affected by AD and in subjects with mild cognitive impairment (MCI). One hundred and forty-four subjects were recruited: 70 AD (MMSE Mini Mental State Evaluation 21.4), 50 MCI (MMSE 25.2) and 24 healthy subjects (MMSE 29.8). An undirected and weighted cortical brain network was built to evaluate graph core measures and obtain Small World parameters. eLORETA lagged linear connectivity extracted from electroencephalogram (EEG) signals was used to weight the network. A high statistical correlation between Small World properties and memory performance was found: the higher the Small World characteristic in the EEG gamma frequency band during the resting state, the better the performance in short-term memory as evaluated by digit span tests. Such a Small World pattern might represent a biomarker of working memory impairment in older people in both physiological and pathological conditions.

  20. A Neural Network Architecture For Rapid Model Indexing In Computer Vision Systems

    NASA Astrophysics Data System (ADS)

    Pawlicki, Ted

    1988-03-01

Models of objects stored in memory have been shown to be useful for guiding the processing of computer vision systems. A major consideration in such systems, however, is how stored models are initially accessed and indexed by the system. As the number of stored models increases, the time required to search memory for the correct model becomes high. Parallel, distributed, connectionist neural networks have been shown to have appealing content-addressable memory properties. This paper discusses an architecture for efficient storage and reference of model memories stored as stable patterns of activity in a parallel, distributed, connectionist neural network. The emergent properties of content addressability and resistance to noise are exploited to perform indexing of the appropriate object-centered model from image-centered primitives. The system consists of three network modules, each of which represents information relative to a different frame of reference. The model memory network is a large state space vector where fields in the vector correspond to ordered component objects and relative, object-based spatial relationships between the component objects. The component assertion network represents evidence about the existence of object primitives in the input image. It establishes local frames of reference for object primitives relative to the image-based frame of reference. The spatial relationship constraint network is an intermediate representation which enables the association between the object-based and the image-based frames of reference. This intermediate level represents information about possible object orderings and establishes relative spatial relationships from the image-based information in the component assertion network below. It is also constrained by the lawful object orderings in the model memory network above. The system design is consistent with current psychological theories of recognition by components. It also seems to support Marr's notions of hierarchical indexing (i.e., the specificity, adjunct, and parent indices). It supports the notion that multiple canonical views of an object may have to be stored in memory to enable its efficient identification. The use of variable fields in the state space vectors appears to keep the number of required nodes in the network down to a tractable number while imposing a semantic value on different areas of the state space. This semantic imposition supports an interface between the analogical aspects of neural networks and the propositional paradigms of symbolic processing.

  1. Synchronization of heteroclinic circuits through learning in coupled neural networks

    NASA Astrophysics Data System (ADS)

    Selskii, Anton; Makarov, Valeri A.

    2016-01-01

The synchronization of oscillatory activity in neural networks is usually implemented by coupling the state variables describing neuronal dynamics. Here we study another, but complementary, mechanism based on a learning process with memory. A driver network, acting as a teacher, exhibits winner-less competition (WLC) dynamics, while a driven network, a learner, tunes its internal couplings according to the oscillations observed in the teacher. We show that under appropriate training the learner can "copy" the coupling structure and thus synchronize oscillations with the teacher. The replication of the WLC dynamics occurs for intermediate memory lengths only; consequently, the learner network exhibits a phenomenon of learning resonance.

  2. Efficiently modeling neural networks on massively parallel computers

    NASA Technical Reports Server (NTRS)

    Farber, Robert M.

    1993-01-01

Neural networks are a very useful tool for analyzing and modeling complex real world systems. Applying neural network simulations to real world problems generally involves large amounts of data and massive amounts of computation. To efficiently handle the computational requirements of large problems, we have implemented at Los Alamos a highly efficient neural network compiler for serial computers, vector computers, vector parallel computers, and fine grain SIMD computers such as the CM-2 connection machine. This paper describes the mapping used by the compiler to implement feed-forward backpropagation neural networks for a SIMD (Single Instruction Multiple Data) architecture parallel computer. Thinking Machines Corporation has benchmarked our code at 1.3 billion interconnects per second (approximately 3 gigaflops) on a 64,000 processor CM-2 connection machine (Singer 1990). This mapping is applicable to other SIMD computers and can be implemented on MIMD computers such as the CM-5 connection machine. Our mapping has virtually no communications overhead, with the exception of the communications required for a global summation across the processors (which has a sub-linear runtime growth on the order of O(log(number of processors))). We can efficiently model very large neural networks which have many neurons and interconnects, and our mapping can extend to arbitrarily large networks (within memory limitations) by merging the memory space of separate processors with fast adjacent-processor interprocessor communications. This paper considers the simulation of only feed-forward neural networks, although this method is extendable to recurrent networks.
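The logarithmic cost of the global summation cited above comes from a pairwise (tree) reduction across processors. A serial sketch of that communication pattern, with an invented processor count, is:

```python
def tree_sum(values):
    # Each round pairs up partial sums, halving the number of participants,
    # so P values are reduced in ceil(log2(P)) communication steps.
    vals = list(values)
    rounds = 0
    while len(vals) > 1:
        vals = [vals[i] + vals[i + 1] if i + 1 < len(vals) else vals[i]
                for i in range(0, len(vals), 2)]
        rounds += 1
    return vals[0], rounds

total, rounds = tree_sum(range(64))   # 64 hypothetical processors
```

With 64 partial sums the reduction finishes in 6 rounds, which is the O(log P) growth the mapping's overhead analysis refers to.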

  3. Vector Symbolic Spiking Neural Network Model of Hippocampal Subarea CA1 Novelty Detection Functionality.

    PubMed

    Agerskov, Claus

    2016-04-01

    A neural network model is presented of novelty detection in the CA1 subdomain of the hippocampal formation from the perspective of information flow. This computational model is restricted on several levels by both anatomical information about hippocampal circuitry and behavioral data from studies done in rats. Several studies report that the CA1 area broadcasts a generalized novelty signal in response to changes in the environment. Using the neural engineering framework developed by Eliasmith et al., a spiking neural network architecture is created that is able to compare high-dimensional vectors, symbolizing semantic information, according to the semantic pointer hypothesis. This model then computes the similarity between the vectors, as both direct inputs and a recalled memory from a long-term memory network by performing the dot-product operation in a novelty neural network architecture. The developed CA1 model agrees with available neuroanatomical data, as well as the presented behavioral data, and so it is a biologically realistic model of novelty detection in the hippocampus, which can provide a feasible explanation for experimentally observed dynamics.
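Stripped of its spiking implementation, the comparison the CA1 model performs is a dot product between high-dimensional semantic vectors. The following sketch shows only that vector-level operation, not the neural engineering framework realization; the dimensionality and noise level are invented.

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

def novelty(cue, memory_trace):
    # CA1-style comparison: similarity is the dot product of the two unit
    # vectors; novelty is its complement.
    return 1.0 - float(cue @ memory_trace)

rng = np.random.default_rng(2)
stored = unit(rng.standard_normal(512))                   # pointer recalled from long-term memory
familiar = unit(stored + 0.1 * rng.standard_normal(512))  # degraded re-presentation
novel = unit(rng.standard_normal(512))                    # unrelated input
scores = (novelty(familiar, stored), novelty(novel, stored))
```

In high dimensions unrelated random vectors are nearly orthogonal, so the novel cue scores close to 1 while the degraded familiar cue scores much lower, yielding the generalized novelty signal.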

  4. Associative memory in an analog iterated-map neural network

    NASA Astrophysics Data System (ADS)

    Marcus, C. M.; Waugh, F. R.; Westervelt, R. M.

    1990-03-01

    The behavior of an analog neural network with parallel dynamics is studied analytically and numerically for two associative-memory learning algorithms, the Hebb rule and the pseudoinverse rule. Phase diagrams in the parameter space of analog gain β and storage ratio α are presented. For both learning rules, the networks have large ``recall'' phases in which retrieval states exist and convergence to a fixed point is guaranteed by a global stability criterion. We also demonstrate numerically that using a reduced analog gain increases the probability of recall starting from a random initial state. This phenomenon is comparable to thermal annealing used to escape local minima but has the advantage of being deterministic, and therefore easily implemented in electronic hardware. Similarities and differences between analog neural networks and networks with two-state neurons at finite temperature are also discussed.
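The parallel analog dynamics studied above have the simple iterated-map form x <- tanh(beta * W x), with W given here by the Hebb rule. A minimal numerical sketch of recall inside the retrieval phase (network size, gain, and corruption level invented):

```python
import numpy as np

def hebb(patterns):
    # Hebb rule: W = (1/N) * sum over patterns of outer(xi, xi), no self-coupling
    P = np.array(patterns, dtype=float)
    W = P.T @ P / P.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def iterate(W, x, beta, steps=100):
    # Parallel (synchronous) analog updates with neuron gain beta
    for _ in range(steps):
        x = np.tanh(beta * (W @ x))
    return x

rng = np.random.default_rng(3)
N = 64
patterns = np.sign(rng.standard_normal((3, N)))  # storage ratio alpha = 3/64
W = hebb(patterns)
probe = patterns[0].copy()
probe[:6] *= -1                                  # corrupt six bits of the cue
x = iterate(W, probe, beta=2.0)
overlap = abs(x @ patterns[0]) / N               # near 1 when retrieval succeeds
```

At this low storage ratio the corrupted cue relaxes to an analog fixed point aligned with the stored pattern; its components saturate near tanh's asymptotes rather than at exactly +/-1, which is why the overlap approaches but does not reach 1.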

  5. How the amygdala affects emotional memory by altering brain network properties.

    PubMed

    Hermans, Erno J; Battaglia, Francesco P; Atsak, Piray; de Voogd, Lycia D; Fernández, Guillén; Roozendaal, Benno

    2014-07-01

The amygdala has long been known to play a key role in supporting memory for emotionally arousing experiences. For example, classical fear conditioning depends on neural plasticity within this anterior medial temporal lobe region. Beneficial effects of emotional arousal on memory, however, are not restricted to simple associative learning. Our recollection of emotional experiences often includes rich representations of, e.g., spatiotemporal context, visceral states, and stimulus-response associations. Critically, such memory features are known to bear heavily on regions elsewhere in the brain. These observations led to the modulation account of amygdala function, which postulates that amygdala activation enhances memory consolidation by facilitating neural plasticity and information storage processes in its target regions. Rodent work in past decades has identified the most important brain regions and neurochemical processes involved in these modulatory actions, and neuropsychological and neuroimaging work in humans has produced a large body of convergent data. Importantly, recent methodological developments make it increasingly realistic to monitor neural interactions underlying such modulatory effects as they unfold. For instance, functional connectivity network modeling in humans has demonstrated how information exchanges between the amygdala and specific target regions occur within the context of large-scale neural network interactions. Furthermore, electrophysiological and optogenetic techniques in rodents are beginning to make it possible to quantify and even manipulate such interactions with millisecond precision. In this paper we discuss how these developments will likely lead to an updated view of the amygdala as a critical nexus within large-scale networks supporting different aspects of memory processing for emotionally arousing experiences.

  6. Iterative free-energy optimization for recurrent neural networks (INFERNO).

    PubMed

    Pitti, Alexandre; Gaussier, Philippe; Quoy, Mathias

    2017-01-01

The intra-parietal lobe, coupled with the basal ganglia, forms a working memory that demonstrates strong planning capabilities for generating robust yet flexible neuronal sequences. Neurocomputational models, however, often fail to control long-range neural synchrony in recurrent spiking networks due to spontaneous activity. In a novel framework based on the free-energy principle, we propose to see the problem of spike synchrony as an optimization problem over the neurons' sub-threshold activity for the generation of long neuronal chains. Using stochastic gradient descent, a reinforcement signal (presumably dopaminergic) evaluates the quality of one input vector to move the recurrent neural network toward a desired activity; depending on the error made, this input vector is strengthened to hill-climb the gradient or rejected to search for another solution. This vector can then be learned by an associative memory, as a model of the basal ganglia, to control the recurrent neural network. Experiments on habit learning and on sequence retrieval demonstrate the capability of the dual system to generate very long and precise spatio-temporal sequences, above two hundred iterations. Its features are then applied to the sequential planning of arm movements. In line with neurobiological theories, we discuss its relevance for modeling the cortico-basal working memory to initiate flexible goal-directed neuronal chains of causation, and its relation to novel architectures such as Deep Networks, Neural Turing Machines and the Free-Energy Principle.

  7. Stochastic associative memory

    NASA Astrophysics Data System (ADS)

    Baumann, Erwin W.; Williams, David L.

    1993-08-01

Artificial neural networks capable of learning and recalling stochastic associations between non-deterministic quantities have received relatively little attention to date. One potential application of such stochastic associative networks is the generation of sensory 'expectations' based on arbitrary subsets of sensor inputs to support anticipatory and investigative behavior in sensor-based robots. Another application of this type of associative memory is the prediction of how a scene will look in one spectral band, including noise, based upon its appearance in several other wavebands. This paper describes a semi-supervised neural network architecture composed of self-organizing maps associated through stochastic inter-layer connections. This 'Stochastic Associative Memory' (SAM) can learn and recall non-deterministic associations between multi-dimensional probability density functions. The stochastic nature of the network also enables it to represent noise distributions that are inherent in any true sensing process. The SAM architecture, training process, and initial application to sensor image prediction are described. Relationships to Fuzzy Associative Memory (FAM) are discussed.

  8. Finite-Time Stability for Fractional-Order Bidirectional Associative Memory Neural Networks with Time Delays

    NASA Astrophysics Data System (ADS)

    Xu, Chang-Jin; Li, Pei-Luan; Pang, Yi-Cheng

    2017-02-01

This paper is concerned with fractional-order bidirectional associative memory (BAM) neural networks with time delays. Applying Laplace transform, the generalized Gronwall inequality and estimates of Mittag-Leffler functions, some sufficient conditions which ensure the finite-time stability of fractional-order bidirectional associative memory neural networks with time delays are obtained. Two examples with their simulations are given to illustrate the theoretical findings. Our results are new and complement previously known results.

  9. Segmented-memory recurrent neural networks.

    PubMed

    Chen, Jinmiao; Chaudhari, Narendra S

    2009-08-01

Conventional recurrent neural networks (RNNs) have difficulties in learning long-term dependencies. To tackle this problem, we propose an architecture called segmented-memory recurrent neural network (SMRNN). A symbolic sequence is broken into segments and then presented as inputs to the SMRNN one symbol per cycle. The SMRNN uses separate internal states to store symbol-level context, as well as segment-level context. The symbol-level context is updated for each symbol presented for input. The segment-level context is updated after each segment. The SMRNN is trained using an extended real-time recurrent learning algorithm. We test the performance of SMRNN on the information latching problem, the "two-sequence problem" and the problem of protein secondary structure (PSS) prediction. Our implementation results indicate that SMRNN performs better on long-term dependency problems than conventional RNNs. In addition, we theoretically analyze how the segmented memory of SMRNN helps learning long-term temporal dependencies and study the impact of the segment length.
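The two-timescale state update described above can be sketched as a forward pass: a symbol-level state changes every cycle, while a segment-level state changes only at segment boundaries. This is a rough structural sketch, not the trained SMRNN; the weights are random, the sizes and segment length are invented, and the exact boundary handling is simplified relative to the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

class SMRNN:
    # Minimal forward pass of a segmented-memory RNN: a symbol-level state h
    # is updated every symbol; a segment-level state s is updated only when a
    # segment ends. Weights are untrained, for illustration only.
    def __init__(self, n_in, n_hid, seg_len):
        self.seg_len = seg_len
        self.Wx = 0.3 * rng.standard_normal((n_hid, n_in))
        self.Wh = 0.3 * rng.standard_normal((n_hid, n_hid))
        self.Ws = 0.3 * rng.standard_normal((n_hid, n_hid))

    def run(self, seq):
        h = np.zeros(self.Wh.shape[0])      # symbol-level context
        s = np.zeros(self.Wh.shape[0])      # segment-level context
        for t, x in enumerate(seq, start=1):
            h = np.tanh(self.Wx @ x + self.Wh @ h)
            if t % self.seg_len == 0:       # segment boundary: fold h into s
                s = np.tanh(self.Ws @ s + h)
        return s

net = SMRNN(n_in=8, n_hid=16, seg_len=5)
seq = [rng.standard_normal(8) for _ in range(20)]
out = net.run(seq)                          # 20 symbols -> 4 segment-level updates
```

Because the segment-level state receives only one update per segment rather than one per symbol, its gradient path through a long sequence is shorter, which is the intuition behind the improved long-term dependency learning.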

  10. Binary Associative Memories as a Benchmark for Spiking Neuromorphic Hardware

    PubMed Central

    Stöckel, Andreas; Jenzen, Christoph; Thies, Michael; Rückert, Ulrich

    2017-01-01

Large-scale neuromorphic hardware platforms, specialized computer systems for energy efficient simulation of spiking neural networks, are being developed around the world, for example as part of the European Human Brain Project (HBP). Due to conceptual differences, a universal performance analysis of these systems in terms of runtime, accuracy and energy efficiency is non-trivial, yet indispensable for further hardware and software development. In this paper we describe a scalable benchmark based on a spiking neural network implementation of the binary neural associative memory. We treat neuromorphic hardware and software simulators as black boxes and execute exactly the same network description across all devices. Experiments on the HBP platforms under varying configurations of the associative memory show that the presented method allows one to test the quality of the neuron model implementation and to explain significant deviations from the expected reference output. PMID:28878642
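The binary neural associative memory underlying this benchmark is, in its simplest (Willshaw-style) form, a matrix of binary synapses set by coincident activity and read out with a threshold. The sketch below shows that reference computation only, not the spiking implementation the benchmark deploys; layer size, sparsity, and pattern count are invented.

```python
import numpy as np

n, k = 128, 4                                   # neurons per layer, active bits per pattern
rng = np.random.default_rng(5)

def sparse_pattern():
    v = np.zeros(n, dtype=np.uint8)
    v[rng.choice(n, k, replace=False)] = 1
    return v

def train(pairs):
    # Binary clipped-Hebbian learning: a synapse becomes 1 if its input and
    # output neurons were ever simultaneously active.
    W = np.zeros((n, n), dtype=np.uint8)
    for x, y in pairs:
        W |= np.outer(y, x)
    return W

def recall(W, x):
    # Threshold the dendritic sums at the number of active input bits.
    return (W @ x >= k).astype(np.uint8)

pairs = [(sparse_pattern(), sparse_pattern()) for _ in range(10)]
W = train(pairs)
y_hat = recall(W, pairs[0][0])
```

At this low load every stored output bit reaches the full threshold of k coincidences, so recall reproduces the associated pattern; a hardware simulator running the spiking version of the same network can then be scored against this reference output.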

  11. Cortical networks dynamically emerge with the interplay of slow and fast oscillations for memory of a natural scene.

    PubMed

    Mizuhara, Hiroaki; Sato, Naoyuki; Yamaguchi, Yoko

    2015-05-01

Neural oscillations are crucial for revealing dynamic cortical networks and for serving as a possible mechanism of inter-cortical communication, especially in association with mnemonic function. The interplay of the slow and fast oscillations might dynamically coordinate the mnemonic cortical circuits to rehearse stored items during working memory retention. We recorded simultaneous EEG-fMRI during a working memory task involving a natural scene to verify whether the cortical networks emerge with the neural oscillations for memory of the natural scene. The slow EEG power was enhanced in association with the better accuracy of working memory retention, and accompanied cortical activities in the mnemonic circuits for the natural scene. Fast oscillation showed a phase-amplitude coupling to the slow oscillation, and its power was tightly coupled with the cortical activities for representing the visual images of natural scenes. The mnemonic cortical circuit with the slow neural oscillations would rehearse the distributed natural scene representations with the fast oscillation for working memory retention. The coincidence of the natural scene representations could be obtained by the slow oscillation phase to create a coherent whole of the natural scene in the working memory.

  12. Discrete-time BAM neural networks with variable delays

    NASA Astrophysics Data System (ADS)

    Liu, Xin-Ge; Tang, Mei-Lan; Martin, Ralph; Liu, Xin-Bi

    2007-07-01

This Letter deals with the global exponential stability of discrete-time bidirectional associative memory (BAM) neural networks with variable delays. Using a Lyapunov functional and linear matrix inequality (LMI) techniques, we derive a new delay-dependent exponential stability criterion for BAM neural networks with variable delays. As this criterion has no extra constraints on the variable delay functions, it can be applied to quite general BAM neural networks with a broad range of time delay functions. It is also easy to use in practice. An example is provided to illustrate the theoretical development.

  13. An application of neural network for Structural Health Monitoring of an adaptive wing with an array of FBG sensors

    NASA Astrophysics Data System (ADS)

    Mieloszyk, Magdalena; Krawczuk, Marek; Skarbek, Lukasz; Ostachowicz, Wieslaw

    2011-07-01

This paper presents an application of neural networks to determine the level of activation of shape memory alloy actuators of an adaptive wing. In this concept the shape of the wing can be controlled and altered thanks to the wing design and the use of integrated shape memory alloy actuators. The wing is assumed to be assembled from a number of wing sections whose relative positions can be controlled independently by thermal activation of shape memory actuators. The investigated wing is equipped with an array of Fibre Bragg Grating sensors. The Fibre Bragg Grating sensors, in combination with a neural network, have been used for Structural Health Monitoring of the wing condition. The FBG sensors are a great tool for monitoring the condition of composite structures due to their immunity to electromagnetic fields as well as their small size and weight. They can be mounted onto the surface or embedded into the wing composite material without any significant influence on the wing strength. The paper concentrates on analysis of the determination of the twisting moment produced by an activated shape memory alloy actuator. This has been analysed both numerically, using the finite element method in the commercial code ABAQUS®, and experimentally, using Fibre Bragg Grating sensor measurements. The results of the analysis have then been used by a neural network to determine the twisting moments produced by each shape memory alloy actuator.

  14. Balanced cortical microcircuitry for spatial working memory based on corrective feedback control.

    PubMed

    Lim, Sukbin; Goldman, Mark S

    2014-05-14

A hallmark of working memory is the ability to maintain graded representations of both the spatial location and amplitude of a memorized stimulus. Previous work has identified a neural correlate of spatial working memory in the persistent maintenance of spatially specific patterns of neural activity. How such activity is maintained by neocortical circuits remains unknown. Traditional models of working memory maintain analog representations of either the spatial location or the amplitude of a stimulus, but not both. Furthermore, although most previous models require local excitation and lateral inhibition to maintain spatially localized persistent activity stably, the substrate for lateral inhibitory feedback pathways is unclear. Here, we suggest an alternative model for spatial working memory that is capable of maintaining analog representations of both the spatial location and amplitude of a stimulus, and that does not rely on long-range feedback inhibition. The model consists of a functionally columnar network of recurrently connected excitatory and inhibitory neural populations. When excitation and inhibition are balanced in strength but offset in time, drifts in activity trigger spatially specific negative feedback that corrects memory decay. The resulting networks can temporally integrate inputs at any spatial location, are robust against many commonly considered perturbations in network parameters, and, when implemented in a spiking model, generate irregular neural firing characteristic of that observed experimentally during persistent activity. This work suggests balanced excitatory-inhibitory memory circuits implementing corrective negative feedback as a substrate for spatial working memory.

  15. Hippocampal brain-network coordination during volitional exploratory behavior enhances learning

    PubMed Central

    Voss, Joel L.; Gonsalves, Brian D.; Federmeier, Kara D.; Tranel, Daniel; Cohen, Neal J.

    2010-01-01

    Exploratory behaviors during learning determine what is studied and when, helping to optimize subsequent memory performance. We manipulated how much control subjects had over the position of a moving window through which they studied objects and their locations, in order to elucidate the cognitive and neural determinants of exploratory behaviors. Our behavioral, neuropsychological, and neuroimaging data indicate volitional control benefits memory performance, and is linked to a brain network centered on the hippocampus. Increases in correlated activity between the hippocampus and other areas were associated with specific aspects of memory, suggesting that volitional control optimizes interactions among specialized neural systems via the hippocampus. Memory is therefore an active process intrinsically linked to behavior. Furthermore, brain structures typically seen as passive participants in memory encoding (e.g., the hippocampus) are actually part of an active network that controls behavior dynamically as it unfolds. PMID:21102449

  16. Hippocampal brain-network coordination during volitional exploratory behavior enhances learning.

    PubMed

    Voss, Joel L; Gonsalves, Brian D; Federmeier, Kara D; Tranel, Daniel; Cohen, Neal J

    2011-01-01

    Exploratory behaviors during learning determine what is studied and when, helping to optimize subsequent memory performance. To elucidate the cognitive and neural determinants of exploratory behaviors, we manipulated the control that human subjects had over the position of a moving window through which they studied objects and their locations. Our behavioral, neuropsychological and neuroimaging data indicate that volitional control benefits memory performance and is linked to a brain network that is centered on the hippocampus. Increases in correlated activity between the hippocampus and other areas were associated with specific aspects of memory, which suggests that volitional control optimizes interactions among specialized neural systems through the hippocampus. Memory is therefore an active process that is intrinsically linked to behavior. Furthermore, brain structures that are typically seen as passive participants in memory encoding (for example, the hippocampus) are actually part of an active network that controls behavior dynamically as it unfolds.

  17. Quantum neural networks: Current status and prospects for development

    NASA Astrophysics Data System (ADS)

    Altaisky, M. V.; Kaputkina, N. E.; Krylov, V. A.

    2014-11-01

    The idea of quantum artificial neural networks, first formulated in [34], unites the artificial neural network concept with the quantum computation paradigm. Quantum artificial neural networks were first systematically considered in the PhD thesis of T. Menneer (1998). Based on the works of Menneer and Narayanan [42, 43], Kouda, Matsui, and Nishimura [35, 36], Altaisky [2, 68], Zhou [67], and others, quantum-inspired learning algorithms for neural networks have been developed and are now used in various training programs and computer games [29, 30]. The first practically realizable, scaled hardware implementation of a quantum artificial neural network was obtained by D-Wave Systems, Inc. [33]. It is a quantum Hopfield network implemented on the basis of superconducting quantum interference devices (SQUIDs). In this work we analyze the possibilities and underlying principles of an alternative way to implement quantum neural networks on the basis of quantum dots. The possibility of using quantum neural network algorithms in automated control systems, associative memory devices, and in modeling biological and social networks is examined.

  18. Neural bases of prospective memory: a meta-analysis and the "Attention to Delayed Intention" (AtoDI) model.

    PubMed

    Cona, Giorgia; Scarpazza, Cristina; Sartori, Giuseppe; Moscovitch, Morris; Bisiacchi, Patrizia Silvia

    2015-05-01

    Remembering to realize delayed intentions is a multi-phase process, labelled as prospective memory (PM), and involves a plurality of neural networks. The present study utilized the activation likelihood estimation method of meta-analysis to provide a complete overview of the brain regions that are consistently activated in each PM phase. We formulated the 'Attention to Delayed Intention' (AtoDI) model to explain the neural dissociation found between intention maintenance and retrieval phases. The dorsal frontoparietal network is involved mainly in the maintenance phase and seems to mediate the strategic monitoring processes, such as the allocation of top-down attention both towards external stimuli, to monitor for the occurrence of the PM cues, and to internal memory contents, to maintain the intention active in memory. The ventral frontoparietal network is recruited in the retrieval phase and might subserve the bottom-up attention captured externally by the PM cues and, internally, by the intention stored in memory. Together with other brain regions (i.e., insula and posterior cingulate cortex), the ventral frontoparietal network would support the spontaneous retrieval processes. The functional contribution of the anterior prefrontal cortex is discussed extensively for each PM phase. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Neural associative memories for the integration of language, vision and action in an autonomous agent.

    PubMed

    Markert, H; Kaufmann, U; Kara Kayikci, Z; Palm, G

    2009-03-01

    Language understanding is a long-standing problem in computer science. However, the human brain is capable of processing complex languages with seemingly no difficulty. This paper presents a model for language understanding using biologically plausible neural networks composed of associative memories. The model is able to deal with ambiguities at the single-word and grammatical levels. The language system is embedded into a robot in order to demonstrate correct semantic understanding of the input sentences by letting the robot perform corresponding actions. For that purpose, a simple neural action planning system has been combined with neural networks for visual object recognition and visual attention control mechanisms.

  20. Dynamical synapses enhance neural information processing: gracefulness, accuracy, and mobility.

    PubMed

    Fung, C C Alan; Wong, K Y Michael; Wang, He; Wu, Si

    2012-05-01

    Experimental data have revealed that neuronal connection efficacy exhibits two forms of short-term plasticity: short-term depression (STD) and short-term facilitation (STF). They have time constants residing between fast neural signaling and rapid learning and may serve as substrates for neural systems manipulating temporal information on relevant timescales. This study investigates the impact of STD and STF on the dynamics of continuous attractor neural networks and their potential roles in neural information processing. We find that STD endows the network with slow-decaying plateau behaviors: a network initially stimulated to an active state decays to a silent state very slowly on the timescale of STD rather than on that of neural signaling. This provides a mechanism for neural systems to hold sensory memory easily and shut off persistent activities gracefully. With STF, we find that the network can hold a memory trace of external inputs in the facilitated neuronal interactions, which provides a way to stabilize the network response to noisy inputs, leading to improved accuracy in population decoding. Furthermore, we find that STD increases the mobility of the network states. The increased mobility enhances the tracking performance of the network in response to time-varying stimuli, leading to anticipative neural responses. In general, we find that STD and STF tend to have opposite effects on network dynamics and complementary computational advantages, suggesting that the brain may employ a strategy of weighting them differentially depending on the computational purpose.
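    The two forms of short-term plasticity can be sketched with a Tsodyks–Markram-style per-spike update (a standard phenomenological model, assumed here rather than taken from this paper): a resource variable x depletes on each spike (STD) while the release fraction u transiently grows (STF), and their product sets the synaptic efficacy.

```python
import math

def release_amplitudes(n_spikes, rate_hz, tau_d, tau_f, U):
    """Per-spike synaptic efficacy u*x for a regular presynaptic spike train."""
    dt = 1.0 / rate_hz
    x, u = 1.0, U                # x: available resources (STD), u: release fraction (STF)
    amps = []
    for k in range(n_spikes):
        if k > 0:                # between spikes: resources recover, facilitation decays
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_d)
            u = U + (u - U) * math.exp(-dt / tau_f)
        u = u + U * (1.0 - u)    # facilitation jump on spike arrival
        amps.append(u * x)
        x = x * (1.0 - u)        # resource depletion by release
    return amps

std = release_amplitudes(5, 20.0, tau_d=0.5, tau_f=0.05, U=0.5)  # depression-dominated
stf = release_amplitudes(5, 20.0, tau_d=0.05, tau_f=0.5, U=0.1)  # facilitation-dominated
```

    With depression-dominated parameters, successive efficacies fall; with facilitation-dominated parameters they rise — the two regimes whose opposite effects on attractor dynamics the abstract describes.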

  1. Implicitly Defined Neural Networks for Sequence Labeling

    DTIC Science & Technology

    2017-02-13

    popularity of the Long Short-Term Memory (LSTM) (Hochreiter and Schmidhuber, 1997) and variants such as the Gated Recurrent Unit (GRU) (Cho et al., 2014) ... bidirectional LSTM and other neural network architectures. Neural Networks 18(5):602–610. Sepp Hochreiter and Jürgen Schmidhuber. 1997. Long short-term ... hidden states of the network to be coupled together, allowing potential improvement on problems with complex, long-distance dependencies. Initial

  2. Iconic memory and parietofrontal network: fMRI study using temporal integration.

    PubMed

    Saneyoshi, Ayako; Niimi, Ryosuke; Suetsugu, Tomoko; Kaminaga, Tatsuro; Yokosawa, Kazuhiko

    2011-08-03

    We investigated the neural basis of iconic memory using functional magnetic resonance imaging. The parietofrontal network of selective attention is reportedly relevant to readout from iconic memory. We adopted a temporal integration task that requires iconic memory but not selective attention. The results showed that the task activated the parietofrontal network, confirming that the network is involved in readout from iconic memory. We further tested a condition in which temporal integration was performed by visual short-term memory but not by iconic memory. However, no brain region revealed higher activation for temporal integration by iconic memory than for temporal integration by visual short-term memory. This result suggested that there is no localized brain region specialized for iconic memory per se.

  3. Hopf bifurcation of an (n + 1) -neuron bidirectional associative memory neural network model with delays.

    PubMed

    Xiao, Min; Zheng, Wei Xing; Cao, Jinde

    2013-01-01

    Recent studies on Hopf bifurcations of neural networks with delays are confined to simplified neural network models consisting of only two, three, four, five, or six neurons. It is well known that neural networks are complex and large-scale nonlinear dynamical systems, so the dynamics of the delayed neural networks are very rich and complicated. Although discussing the dynamics of networks with a few neurons may help us to understand large-scale networks, there are inevitably some complicated problems that may be overlooked if simplified networks are carried over to large-scale networks. In this paper, a general delayed bidirectional associative memory neural network model with n + 1 neurons is considered. By analyzing the associated characteristic equation, the local stability of the trivial steady state is examined, and then the existence of the Hopf bifurcation at the trivial steady state is established. By applying the normal form theory and the center manifold reduction, explicit formulae are derived to determine the direction and stability of the bifurcating periodic solution. Furthermore, the paper highlights situations where the Hopf bifurcations are particularly critical, in the sense that the amplitude and the period of oscillations are very sensitive to errors due to tolerances in the implementation of neuron interconnections. It is shown that the sensitivity is crucially dependent on the delay and also significantly influenced by the number of neurons. Numerical simulations are carried out to illustrate the main results.
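    The flavor of a delay-induced Hopf bifurcation can be reproduced in a toy two-neuron loop (a deliberately simplified stand-in for the paper's (n + 1)-neuron BAM model): below a critical delay the rest state is stable; above it, a periodic orbit emerges.

```python
import math

def simulate(tau, a=2.0, b=-2.0, dt=0.01, T=40.0, x0=0.1):
    """Euler simulation of x' = -x + a*tanh(y(t-tau)), y' = -y + b*tanh(x(t-tau))."""
    d = int(round(tau / dt))
    xs = [x0] * (d + 1)           # constant history on [-tau, 0]
    ys = [0.0] * (d + 1)
    for t in range(d, d + int(T / dt)):
        xs.append(xs[t] + dt * (-xs[t] + a * math.tanh(ys[t - d])))
        ys.append(ys[t] + dt * (-ys[t] + b * math.tanh(xs[t - d])))
    return xs

# linearization of this toy loop predicts a critical delay pi/(6*sqrt(3)) ~ 0.30
xs_stable = simulate(tau=0.1)     # below the critical delay: settles to rest
xs_osc = simulate(tau=1.0)        # above it: sustained oscillation
```

    The late-time trace for the short delay collapses to zero, while the long-delay trace keeps changing sign with an amplitude bounded by the tanh saturation — the qualitative signature of the bifurcation the paper analyzes.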

  4. Targeted Memory Reactivation during Sleep Adaptively Promotes the Strengthening or Weakening of Overlapping Memories.

    PubMed

    Oyarzún, Javiera P; Morís, Joaquín; Luque, David; de Diego-Balaguer, Ruth; Fuentemilla, Lluís

    2017-08-09

    System memory consolidation is conceptualized as an active process whereby newly encoded memory representations are strengthened through selective memory reactivation during sleep. However, our learning experience is highly overlapping in content (i.e., shares common elements), and memories of these events are organized in an intricate network of overlapping associated events. It remains to be explored whether and how selective memory reactivation during sleep has an impact on these overlapping memories acquired during awake time. Here, we test in a group of adult women and men the prediction that selective memory reactivation during sleep entails the reactivation of associated events and that this may lead the brain to adaptively regulate whether these associated memories are strengthened or pruned from memory networks on the basis of their relative associative strength with the shared element. Our findings demonstrate the existence of efficient regulatory neural mechanisms governing how complex memory networks are shaped during sleep as a function of their associative memory strength. SIGNIFICANCE STATEMENT Numerous studies have demonstrated that system memory consolidation is an active, selective, and sleep-dependent process in which only subsets of new memories become stabilized through their reactivation. However, the learning experience is highly overlapping in content and thus events are encoded in an intricate network of related memories. It remains to be explored whether and how memory reactivation has an impact on overlapping memories acquired during awake time. Here, we show that sleep memory reactivation promotes strengthening and weakening of overlapping memories based on their associative memory strength. These results suggest the existence of an efficient regulatory neural mechanism that avoids the formation of cluttered memory representation of multiple events and promotes stabilization of complex memory networks. 
Copyright © 2017 the authors 0270-6474/17/377748-11$15.00/0.

  5. Memory loss in Alzheimer's disease

    PubMed Central

    Jahn, Holger

    2013-01-01

    Loss of memory is among the first symptoms reported by patients suffering from Alzheimer's disease (AD) and by their caretakers. Working memory and long-term declarative memory are affected early during the course of the disease. The individual pattern of impaired memory functions correlates with parameters of structural or functional brain integrity. AD pathology interferes with the formation of memories from the molecular level to the framework of neural networks. The investigation of AD memory loss helps to identify the involved neural structures, such as the default mode network, the influence of epigenetic and genetic factors, such as ApoE4 status, and evolutionary aspects of human cognition. Clinically, the analysis of memory assists the definition of AD subtypes, disease grading, and prognostic predictions. Despite new AD criteria that allow the earlier diagnosis of the disease by inclusion of biomarkers derived from cerebrospinal fluid or hippocampal volume analysis, neuropsychological testing remains at the core of AD diagnosis. PMID:24459411

  6. Auto-programmable impulse neural circuits

    NASA Technical Reports Server (NTRS)

    Watula, D.; Meador, J.

    1990-01-01

    Impulse neural networks use pulse trains to communicate neuron activation levels. Impulse neural circuits emulate natural neurons at a more detailed level than that typically employed by contemporary neural network implementation methods. An impulse neural circuit which realizes short term memory dynamics is presented. The operation of that circuit is then characterized in terms of pulse frequency modulated signals. Both fixed and programmable synapse circuits for realizing long term memory are also described. The implementation of a simple and useful unsupervised learning law is then presented. The implementation of a differential Hebbian learning rule for a specific mean-frequency signal interpretation is shown to have a straightforward implementation using digital combinational logic with a variation of a previously developed programmable synapse circuit. This circuit is expected to be exploited for simple and straightforward implementation of future auto-adaptive neural circuits.
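    The differential Hebbian rule mentioned above can be sketched in a few lines (a generic discrete-time version with a decaying presynaptic trace; the paper's pulse-frequency circuit realization is not reproduced here): the weight grows when presynaptic pulses precede a rise in postsynaptic activity and shrinks for the reverse order.

```python
def diff_hebbian(pre, post, eta=0.1, decay=0.5):
    """Differential Hebbian rule: presynaptic trace times postsynaptic rate of change."""
    w, trace, prev = 0.0, 0.0, post[0]
    for p, q in zip(pre, post):
        trace = decay * trace + p        # low-pass filtered presynaptic pulses
        w += eta * trace * (q - prev)    # correlate the trace with d(post)/dt
        prev = q
    return w

w_causal = diff_hebbian([0, 1, 0, 0], [0, 0, 1, 0])      # pre pulse precedes post rise
w_anticausal = diff_hebbian([0, 0, 1, 0], [0, 1, 0, 0])  # post rise precedes pre pulse
```

    The temporal asymmetry is the point: causal ordering yields a net positive weight change, anticausal ordering a net negative one.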

  7. Anti-synchronization control of BAM memristive neural networks with multiple proportional delays and stochastic perturbations

    NASA Astrophysics Data System (ADS)

    Wang, Weiping; Yuan, Manman; Luo, Xiong; Liu, Linlin; Zhang, Yao

    2018-01-01

    Proportional delay is a class of unbounded time-varying delay. This paper is concerned with a class of bidirectional associative memory (BAM) memristive neural networks with multiple proportional delays. First, we propose the model of BAM memristive neural networks with multiple proportional delays and stochastic perturbations. Furthermore, by choosing suitable nonlinear variable transformations, the BAM memristive neural networks with multiple proportional delays can be transformed into BAM memristive neural networks with constant delays. Based on the drive-response system concept, differential inclusions theory and Lyapunov stability theory, some anti-synchronization criteria are obtained. Finally, the effectiveness of the proposed criteria is demonstrated through numerical examples.
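    The variable-transformation step can be made concrete for a scalar equation (a standard change of variables of the kind alluded to here; the specific BAM system is omitted):

```latex
\dot{x}(t) = f\bigl(x(t),\, x(qt)\bigr), \qquad 0 < q < 1, \\
y(s) := x(e^{s}) \;\Longrightarrow\;
\dot{y}(s) = e^{s}\,\dot{x}(e^{s})
           = e^{s}\, f\bigl(y(s),\, y\bigl(s - \ln\tfrac{1}{q}\bigr)\bigr),
```

    so the unbounded proportional delay qt becomes a constant delay ln(1/q), at the cost of a time-varying coefficient e^s.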

  8. Social memory engram in the hippocampus.

    PubMed

    Okuyama, Teruhiro

    2018-04-01

    Social memory is one of the crucial components of episodic memories. Gregarious animals living in societies utilize social memory to exhibit appropriate social behaviors such as aggression, avoidance, cooperative behavior, and even mating behavior. However, the neural mechanisms underlying social memory in the hippocampus remain mysterious. Here, I review evidence from work done in rodents and primates on the brain region(s) and circuits encoding and/or retrieving social memory, as well as the storage site for social memory (i.e., social memory engram neurons). Based on our recent finding that a neural ensemble in the ventral CA1 subregion of the hippocampus possesses a social memory engram, I discuss the neural network for social information processing that encodes social memory, and its evolutionary conservation between rodents and humans. Copyright © 2017 Elsevier Ireland Ltd and Japan Neuroscience Society. All rights reserved.

  9. Associative memory and its cerebral correlates in Alzheimer׳s disease: evidence for distinct deficits of relational and conjunctive memory.

    PubMed

    Bastin, Christine; Bahri, Mohamed Ali; Miévis, Frédéric; Lemaire, Christian; Collette, Fabienne; Genon, Sarah; Simon, Jessica; Guillaume, Bénédicte; Diana, Rachel A; Yonelinas, Andrew P; Salmon, Eric

    2014-10-01

    This study investigated the impact of Alzheimer׳s disease (AD) on conjunctive and relational binding in episodic memory. Mild AD patients and controls had to remember item-color associations by imagining color either as a contextual association (relational memory) or as a feature of the item to be encoded (conjunctive memory). Patients׳ performance in each condition was correlated with cerebral metabolism measured by FDG-PET. The results showed that AD patients had an impaired capacity to remember item-color associations, with deficits in both relational and conjunctive memory. However, performance in the two kinds of associative memory varied independently across patients. Partial Least Square analyses revealed that poor conjunctive memory was related to hypometabolism in an anterior temporal-posterior fusiform brain network, whereas relational memory correlated with metabolism in regions of the default mode network. These findings support the hypothesis of distinct neural systems specialized in different types of associative memory and point to heterogeneous profiles of memory alteration in Alzheimer׳s disease as a function of damage to the respective neural networks. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Common and unique gray matter correlates of episodic memory dysfunction in frontotemporal dementia and Alzheimer's disease.

    PubMed

    Irish, Muireann; Piguet, Olivier; Hodges, John R; Hornberger, Michael

    2014-04-01

    Conflicting evidence exists regarding the integrity of episodic memory in the behavioral variant of frontotemporal dementia (bvFTD). Recent converging evidence suggests that episodic memory in progressive cases of bvFTD is compromised to the same extent as in Alzheimer's disease (AD). The underlying neural substrates of these episodic memory deficits, however, likely differ contingent on dementia type. In this study we sought to elucidate the neural substrates of episodic memory performance, across recall and recognition tasks, in both patient groups using voxel-based morphometry (VBM) analyses. We predicted that episodic memory dysfunction would be apparent in both patient groups but would relate to divergent patterns of neural atrophy specific to each dementia type. We assessed episodic memory, across verbal and visual domains, in 19 bvFTD, 18 AD patients, and 19 age- and education-matched controls. Behaviorally, patient groups were indistinguishable for immediate and delayed recall, across verbal and visual domains. Whole-brain VBM analyses revealed regions commonly implicated in episodic retrieval across groups, namely the right temporal pole, right frontal lobe, left paracingulate gyrus, and right anterior hippocampus. Divergent neural networks specific to each group were also identified. Whereas a widespread network including posterior regions such as the posterior cingulate cortex, parietal and occipital cortices was exclusively implicated in AD, the frontal and anterior temporal lobes underpinned the episodic memory deficits in bvFTD. Our results point to distinct neural changes underlying episodic memory decline specific to each dementia syndrome. Copyright © 2013 Wiley Periodicals, Inc.

  11. Framewise phoneme classification with bidirectional LSTM and other neural network architectures.

    PubMed

    Graves, Alex; Schmidhuber, Jürgen

    2005-01-01

    In this paper, we present bidirectional Long Short Term Memory (LSTM) networks, and a modified, full gradient version of the LSTM learning algorithm. We evaluate Bidirectional LSTM (BLSTM) and several other network architectures on the benchmark task of framewise phoneme classification, using the TIMIT database. Our main findings are that bidirectional networks outperform unidirectional ones, and Long Short Term Memory (LSTM) is much faster and also more accurate than both standard Recurrent Neural Nets (RNNs) and time-windowed Multilayer Perceptrons (MLPs). Our results support the view that contextual information is crucial to speech processing, and suggest that BLSTM is an effective architecture with which to exploit it.
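    A minimal pure-Python sketch of the bidirectional idea (toy dimensions and random weights; not the paper's full-gradient training setup) shows why framewise outputs gain access to future context: the backward pass at frame 0 has already consumed the whole sequence.

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def make_lstm(n_in, n_hid, seed):
    """Random gate weights over the concatenated [input, previous hidden] vector."""
    rng = random.Random(seed)
    def mat():
        return [[rng.uniform(-0.5, 0.5) for _ in range(n_in + n_hid)]
                for _ in range(n_hid)]
    return {g: mat() for g in "ifog"}    # input, forget, output, candidate gates

def lstm_run(W, xs, n_hid):
    h, c, out = [0.0] * n_hid, [0.0] * n_hid, []
    for x in xs:
        z = list(x) + h
        def gate(name, f):
            return [f(sum(w * v for w, v in zip(row, z))) for row in W[name]]
        i, fg, o = gate("i", sigmoid), gate("f", sigmoid), gate("o", sigmoid)
        g = gate("g", math.tanh)
        c = [fj * cj + ij * gj for fj, cj, ij, gj in zip(fg, c, i, g)]
        h = [oj * math.tanh(cj) for oj, cj in zip(o, c)]
        out.append(h)
    return out

def bidir_run(Wf, Wb, xs, n_hid):
    """Framewise outputs that see both past (forward) and future (backward) context."""
    fwd = lstm_run(Wf, xs, n_hid)
    bwd = lstm_run(Wb, xs[::-1], n_hid)[::-1]
    return [hf + hb for hf, hb in zip(fwd, bwd)]

n_in, n_hid = 3, 4
Wf, Wb = make_lstm(n_in, n_hid, seed=0), make_lstm(n_in, n_hid, seed=1)
rng = random.Random(2)
xs1 = [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(5)]
xs2 = [list(f) for f in xs1]
xs2[-1][0] += 1.0        # change only the final frame

fwd1, fwd2 = lstm_run(Wf, xs1, n_hid), lstm_run(Wf, xs2, n_hid)
bi1, bi2 = bidir_run(Wf, Wb, xs1, n_hid), bidir_run(Wf, Wb, xs2, n_hid)
```

    Changing the last frame leaves the unidirectional output at frame 0 untouched, but changes the bidirectional output there — exactly the contextual information the framewise phoneme classifier benefits from.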

  12. Sequential associative memory with nonuniformity of the layer sizes.

    PubMed

    Teramae, Jun-Nosuke; Fukai, Tomoki

    2007-01-01

    Sequence retrieval has a fundamental importance in information processing by the brain, and has extensively been studied in neural network models. Most previous sequential associative memory models embed sequences of memory patterns of nearly equal sizes. It was recently shown that local cortical networks display many diverse yet repeatable precise temporal sequences of neuronal activities, termed "neuronal avalanches." Interestingly, these avalanches displayed size and lifetime distributions that obey power laws. Inspired by these experimental findings, here we consider an associative memory model of binary neurons that stores sequences of memory patterns with highly variable sizes. Our analysis includes the case where the statistics of these size variations obey the above-mentioned power laws. We study the retrieval dynamics of such memory systems by analytically deriving the equations that govern the time evolution of macroscopic order parameters. We calculate the critical sequence length beyond which the network cannot retrieve memory sequences correctly. As an application of the analysis, we show how such variability in sequential memory patterns degrades the power-law lifetime distribution of retrieved neural activities.
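    Sequence retrieval with asymmetric Hebbian couplings can be sketched for equal-sized binary patterns (the textbook construction; the paper's analysis of variable-sized, power-law-distributed patterns goes well beyond this toy): the weights map each stored pattern onto its successor, so synchronous updates replay the sequence.

```python
import random

random.seed(7)
N, P = 200, 3
xi = [[random.choice([-1, 1]) for _ in range(N)] for _ in range(P)]

# asymmetric Hebbian weights: each pattern points to its successor
W = [[sum(xi[m + 1][i] * xi[m][j] for m in range(P - 1)) / N for j in range(N)]
     for i in range(N)]

def step(s):
    """Synchronous sign update."""
    return [1 if sum(W[i][j] * s[j] for j in range(N)) >= 0 else -1 for i in range(N)]

def overlap(a, b):
    return sum(u * v for u, v in zip(a, b)) / N

s1 = step(xi[0])                 # one update moves the state to pattern 1 ...
s2 = step(s1)                    # ... and the next to pattern 2
m1, m2 = overlap(s1, xi[1]), overlap(s2, xi[2])
```

    With only two stored transitions in a 200-neuron network, the crosstalk term is far too small to flip any unit, so each update lands essentially exactly on the next pattern.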

  13. Clique-Based Neural Associative Memories with Local Coding and Precoding.

    PubMed

    Mofrad, Asieh Abolpour; Parker, Matthew G; Ferdosi, Zahra; Tadayon, Mohammad H

    2016-08-01

    Techniques from coding theory can improve the efficiency of neuro-inspired and neural associative memories by imposing structural constraints on the network. In this letter, the approach is to embed coding techniques into neural associative memory in order to increase performance in the presence of partial erasures. The motivation comes from recent work by Gripon, Berrou, and coauthors, which revisited Willshaw networks and presented a neural network with interacting neurons partitioned into clusters. The model introduced stores patterns as small-size cliques that can be retrieved in spite of partial errors. We focus on improving the success of retrieval by applying two techniques: performing a local coding in each cluster and then applying a precoding step. We use a slightly different decoding scheme, which is appropriate for partial erasures and converges faster. Although the ideas of local coding and precoding are not new, the way we apply them is different. Simulations show an increase in the pattern retrieval capacity for both techniques. Moreover, we use self-dual additive codes over field [Formula: see text], which have very interesting properties and a simple-graph representation.
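    The clustered clique storage and winner-take-all retrieval of a Gripon–Berrou-style network can be sketched at toy scale (hardcoded messages and one erased cluster; the letter's local coding and precoding stages are omitted): each message picks one fanal per cluster, storage adds the clique's edges, and retrieval scores candidate fanals by how many known fanals they connect to.

```python
from itertools import combinations

C, L = 4, 8                      # C clusters of L fanals (neurons) each
messages = [(0, 1, 2, 3), (4, 5, 6, 7), (2, 1, 6, 3)]  # one fanal index per cluster

edges = set()
for msg in messages:             # store each message as a clique over its fanals
    nodes = [(c, f) for c, f in enumerate(msg)]
    for a, b in combinations(nodes, 2):
        edges.add(frozenset((a, b)))

def retrieve_cluster(partial, erased):
    """Recover the fanal of one erased cluster by clique score (winner-take-all)."""
    known = [(c, f) for c, f in enumerate(partial) if c != erased]
    scores = [sum(frozenset(((erased, cand), kn)) in edges for kn in known)
              for cand in range(L)]
    return scores.index(max(scores))

recovered = retrieve_cluster(messages[0], erased=0)
```

    The true fanal is connected to all three surviving fanals of its clique, so it wins even though a third stored message shares two of them.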

  14. Automatic disease diagnosis using optimised weightless neural networks for low-power wearable devices

    PubMed Central

    Edla, Damodar Reddy; Kuppili, Venkatanareshbabu; Dharavath, Ramesh; Beechu, Nareshkumar Reddy

    2017-01-01

    Low-power wearable devices for disease diagnosis can be used anytime and anywhere. They are non-invasive and pain-free, improving quality of life. However, these devices are resource constrained in terms of memory and processing capability. The memory constraint allows these devices to store only a limited number of patterns, and the processing constraint leads to delayed responses. It is a challenging task to design a robust classification system under the above constraints with high accuracy. In this Letter, to resolve this problem, a novel architecture for weightless neural networks (WNNs) is proposed. It uses variable-sized random access memories to optimise memory usage and a modified binary TRIE data structure to reduce test time. In addition, a bio-inspired genetic algorithm has been employed to improve the accuracy. The proposed architecture is evaluated on various disease datasets using its software and hardware realisations. The experimental results show that the proposed architecture achieves better performance in terms of accuracy, memory saving and test time as compared to standard WNNs. It also outperforms conventional neural-network-based classifiers in terms of accuracy. The proposed architecture can thus serve as a powerful component of low-power wearable devices, addressing their memory, accuracy and timing constraints. PMID:28868148
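    The RAM-discriminator principle behind WNNs can be sketched in a few lines (a minimal WiSARD-style classifier with hardcoded 8-bit inputs and 2-bit tuples; the Letter's variable-sized RAMs, TRIE structure and genetic optimisation are omitted): training writes tuple addresses into per-position memories, and classification counts address hits.

```python
def tuples(bits, n=2):
    """Split a bit vector into fixed-position n-tuples (RAM addresses)."""
    return [tuple(bits[i:i + n]) for i in range(0, len(bits), n)]

class Discriminator:
    """One RAM (here: a set of seen addresses) per tuple position."""
    def __init__(self, n_rams):
        self.rams = [set() for _ in range(n_rams)]
    def train(self, bits):
        for ram, addr in zip(self.rams, tuples(bits)):
            ram.add(addr)
    def score(self, bits):
        return sum(addr in ram for ram, addr in zip(self.rams, tuples(bits)))

disc_a, disc_b = Discriminator(4), Discriminator(4)
disc_a.train([1, 0, 1, 0, 1, 0, 1, 0])   # class-A prototype
disc_b.train([0, 1, 0, 1, 0, 1, 0, 1])   # class-B prototype

noisy = [1, 0, 1, 0, 1, 0, 1, 1]         # class-A pattern with one flipped bit
```

    The noisy probe still hits three of A's four RAMs and none of B's, so winner-take-all over discriminator scores classifies it correctly despite the corruption — with training that is a single write pass, which is what makes WNNs attractive on constrained hardware.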

  15. Image watermarking capacity analysis based on Hopfield neural network

    NASA Astrophysics Data System (ADS)

    Zhang, Fan; Zhang, Hongbin

    2004-11-01

    In watermarking schemes, watermarking can be viewed as a communication problem. Almost all previous work on image watermarking capacity is based on information theory, using the Shannon formula to calculate the capacity of watermarking. In this paper, we present a blind watermarking algorithm using a Hopfield neural network, and analyze watermarking capacity based on the neural network. In our watermarking algorithm, watermarking capacity is determined by the attraction basin of the associative memory.
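    The attraction-basin notion behind this capacity analysis can be illustrated with a plain Hopfield associative memory (a generic demo, not the paper's watermarking scheme): probes corrupted within the basin are pulled back to the stored attractor, and the basin size bounds how much distortion the stored information tolerates.

```python
import random

random.seed(1)
N = 120
patterns = [[random.choice([-1, 1]) for _ in range(N)] for _ in range(2)]

# symmetric Hebbian weights with zero diagonal
W = [[0.0 if i == j else sum(p[i] * p[j] for p in patterns) / N for j in range(N)]
     for i in range(N)]

def recall(s, steps=5):
    """Synchronous sign updates until (practically) converged."""
    for _ in range(steps):
        s = [1 if sum(W[i][j] * s[j] for j in range(N)) >= 0 else -1 for i in range(N)]
    return s

probe = list(patterns[0])
for i in random.sample(range(N), 12):   # flip 10% of the bits
    probe[i] = -probe[i]
overlap = sum(a * b for a, b in zip(recall(probe), patterns[0])) / N
```

    Well below capacity (2 patterns in 120 neurons), a 10% corruption sits deep inside the basin and the network restores the pattern almost perfectly.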

  16. Multiview fusion for activity recognition using deep neural networks

    NASA Astrophysics Data System (ADS)

    Kavi, Rahul; Kulathumani, Vinod; Rohit, Fnu; Kecojevic, Vlad

    2016-07-01

    Convolutional neural networks (ConvNets) coupled with long short term memory (LSTM) networks have been recently shown to be effective for video classification as they combine the automatic feature extraction capabilities of a neural network with additional memory in the temporal domain. This paper shows how multiview fusion can be applied to such a ConvNet LSTM architecture. Two different fusion techniques are presented. The system is first evaluated in the context of a driver activity recognition system using data collected in a multicamera driving simulator. These results show significant improvement in accuracy with multiview fusion and also show that deep learning performs better than a traditional approach using spatiotemporal features even without requiring any background subtraction. The system is also validated on another publicly available multiview action recognition dataset that has 12 action classes and 8 camera views.

  17. Convergence dynamics and pseudo almost periodicity of a class of nonautonomous RFDEs with applications

    NASA Astrophysics Data System (ADS)

    Fan, Meng; Ye, Dan

    2005-09-01

    This paper studies the dynamics of a system of retarded functional differential equations (i.e., RFDEs), which generalize the Hopfield neural network models, the bidirectional associative memory neural networks, the hybrid network models of the cellular neural network type, and some population growth models. Sufficient criteria are established for the globally exponential stability and the existence and uniqueness of a pseudo almost periodic solution. The approaches are based on constructing suitable Lyapunov functionals and the well-known Banach contraction mapping principle. The paper ends with some applications of the main results to neural network models and population growth models, and numerical simulations.

  18. Cognitive memory.

    PubMed

    Widrow, Bernard; Aragon, Juan Carlos

    2013-05-01

    Regarding the workings of the human mind, memory and pattern recognition seem to be intertwined. You generally do not have one without the other. Taking inspiration from life experience, a new form of computer memory has been devised. Certain conjectures about human memory are keys to the central idea. The design of a practical and useful "cognitive" memory system is contemplated, a memory system that may also serve as a model for many aspects of human memory. The new memory does not function like a computer memory where specific data is stored in specific numbered registers and retrieval is done by reading the contents of the specified memory register, or done by matching key words as with a document search. Incoming sensory data would be stored at the next available empty memory location, and indeed could be stored redundantly at several empty locations. The stored sensory data would neither have key words nor would it be located in known or specified memory locations. Sensory inputs concerning a single object or subject are stored together as patterns in a single "file folder" or "memory folder". When the contents of the folder are retrieved, sights, sounds, tactile feel, smell, etc., are obtained all at the same time. Retrieval would be initiated by a query or a prompt signal from a current set of sensory inputs or patterns. A search through the memory would be made to locate stored data that correlates with or relates to the prompt input. The search would be done by a retrieval system whose first stage makes use of autoassociative artificial neural networks and whose second stage relies on exhaustive search. Applications of cognitive memory systems have been made to visual aircraft identification, aircraft navigation, and human facial recognition. Concerning human memory, reasons are given why it is unlikely that long-term memory is stored in the synapses of the brain's neural networks. 
Reasons are given suggesting that long-term memory is stored in DNA or RNA. Neural networks are an important component of the human memory system, and their purpose is for information retrieval, not for information storage. The brain's neural networks are analog devices, subject to drift and unplanned change. Only with constant training is reliable action possible. Good training time is during sleep and while awake and making use of one's memory. A cognitive memory is a learning system. Learning involves storage of patterns or data in a cognitive memory. The learning process for cognitive memory is unsupervised, i.e. autonomous. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Neural networks for data compression and invariant image recognition

    NASA Technical Reports Server (NTRS)

    Gardner, Sheldon

    1989-01-01

    An approach to invariant image recognition (I2R), based upon a model of biological vision in the mammalian visual system (MVS), is described. The complete I2R model incorporates several biologically inspired features: exponential mapping of retinal images, Gabor spatial filtering, and a neural network associative memory. In the I2R model, exponentially mapped retinal images are filtered by a hierarchical set of Gabor spatial filters (GSF) which provide compression of the information contained within a pixel-based image. A neural network associative memory (AM) is used to process the GSF coded images. We describe a 1-D shape function method for coding of scale and rotationally invariant shape information. This method reduces image shape information to a periodic waveform suitable for coding as an input vector to a neural network AM. The shape function method is suitable for near term applications on conventional computing architectures equipped with VLSI FFT chips to provide a rapid image search capability.
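    The 1-D shape function idea can be sketched directly (a minimal version of my own: a centroid-distance signature plus DFT magnitudes; the exponential retinal mapping, Gabor filtering and associative-memory stages are omitted): rotation leaves centroid distances unchanged, and taking DFT magnitudes removes the dependence on the starting boundary point, yielding a periodic waveform suitable as an AM input vector.

```python
import cmath, math

def shape_signature(points):
    """Centroid-distance signature of an ordered boundary, normalized by its maximum."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    d = [math.hypot(x - cx, y - cy) for x, y in points]
    m = max(d)
    return [v / m for v in d]

def dft_mags(sig):
    """DFT magnitudes of the signature: invariant to circular shifts (start point)."""
    n = len(sig)
    return [abs(sum(sig[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n)))
            for j in range(n)]

boundary = [(2, 0), (1, 1), (0, 2), (-1, 1), (-2, 0), (-1, -1), (0, -2), (1, -1)]
theta = 0.7
rotated = [(x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta)) for x, y in boundary]
shifted = boundary[3:] + boundary[:3]   # same boundary, different starting point
```

    Rotating the shape leaves the signature identical (distances to the centroid are rotation-invariant), and re-indexing the boundary only circularly shifts it, which the DFT magnitudes absorb.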

  20. Delay-Dependent Stability Criterion for Bidirectional Associative Memory Neural Networks with Interval Time-Varying Delays

    NASA Astrophysics Data System (ADS)

    Park, Ju H.; Kwon, O. M.

    In the letter, the global asymptotic stability of bidirectional associative memory (BAM) neural networks with delays is investigated. The delay is assumed to be time-varying and to belong to a given interval. A novel delay-dependent stability criterion is derived based on the Lyapunov method and expressed in terms of a linear matrix inequality (LMI), which can be solved easily by various optimization algorithms. Two numerical examples illustrate the effectiveness of the new result.
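
    For context, the BAM being stabilized is, at its core, a two-layer associative memory; the letter analyzes a continuous version with interval time-varying delays, but the underlying associative mechanism can be sketched in its classic discrete Kosko form (sizes illustrative):

```python
import numpy as np

def bam_train(X, Y):
    """Kosko-style correlation rule: W = sum_i x_i y_i^T (patterns +/-1)."""
    return sum(np.outer(x, y) for x, y in zip(X, Y)).astype(float)

def bam_recall(W, x, steps=5):
    """Bounce activity between the two layers until it settles."""
    sgn = lambda v: np.where(v >= 0, 1, -1)
    y = sgn(W.T @ x)
    for _ in range(steps):
        x = sgn(W @ y)
        y = sgn(W.T @ x)
    return x, y

rng = np.random.default_rng(1)
X = rng.choice([-1, 1], size=(2, 32))   # layer-1 patterns
Y = rng.choice([-1, 1], size=(2, 16))   # paired layer-2 patterns
W = bam_train(X, Y)
probe = X[0].copy()
probe[:3] *= -1                         # corrupt 3 of 32 bits
x_out, y_out = bam_recall(W, probe)     # recovers both sides of the pair
```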

  1. Multi-Temporal Land Cover Classification with Long Short-Term Memory Neural Networks

    NASA Astrophysics Data System (ADS)

    Rußwurm, M.; Körner, M.

    2017-05-01

    Land cover classification (LCC) is a central and wide field of research in earth observation and has already put forth a variety of classification techniques. Many approaches are based on classification techniques that consider observations at single points in time. However, some land cover classes, such as crops, change their spectral characteristics due to environmental influences and thus cannot be monitored effectively with classical mono-temporal approaches. Nevertheless, these temporal observations should be utilized to benefit the classification process. Building on extensive research on modeling temporal dynamics by spectro-temporal profiles using vegetation indices, we propose a deep learning approach that utilizes these temporal characteristics for classification tasks. In this work, we show how long short-term memory (LSTM) neural networks can be employed for crop identification purposes with SENTINEL 2A observations from large study areas and label information provided by local authorities. We compare these temporal neural network models, i.e., LSTM and recurrent neural network (RNN) models, with a classical non-temporal convolutional neural network (CNN) model and an additional support vector machine (SVM) baseline. With our rather straightforward LSTM variant, we exceeded state-of-the-art classification performance, thus opening promising potential for further research.
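
    A minimal sketch of the LSTM gating that lets such models exploit a pixel's multi-temporal profile (illustrative only; the paper's networks are full deep models trained on SENTINEL 2A data, and all dimensions here are made up):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM step: input/forget/output gates and candidate from [h, x]."""
    z = W @ np.concatenate([h, x]) + b
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c = f * c + i * g          # forget old content, write new content
    h = o * np.tanh(c)         # expose a gated view of the cell state
    return h, c

rng = np.random.default_rng(0)
n_in, n_hid, T = 4, 8, 12      # e.g. 4 spectral bands, 12 acquisition dates
W = rng.normal(0.0, 0.3, size=(4 * n_hid, n_hid + n_in))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for x_t in rng.normal(size=(T, n_in)):  # one pixel's multi-temporal profile
    h, c = lstm_step(x_t, h, c, W, b)   # final h would feed a classifier
```

    The forget gate is what lets the cell carry crop phenology across acquisition dates instead of judging each date in isolation, which is the advantage over mono-temporal models.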

  2. Parsing recursive sentences with a connectionist model including a neural stack and synaptic gating.

    PubMed

    Fedor, Anna; Ittzés, Péter; Szathmáry, Eörs

    2011-02-21

    It is supposed that humans are genetically predisposed to be able to recognize sequences of context-free grammars with centre-embedded recursion while other primates are restricted to the recognition of finite state grammars with tail-recursion. Our aim was to construct a minimalist neural network that is able to parse artificial sentences of both grammars in an efficient way without using the biologically unrealistic backpropagation algorithm. The core of this network is a neural stack-like memory where the push and pop operations are regulated by synaptic gating on the connections between the layers of the stack. The network correctly categorizes novel sentences of both grammars after training. We suggest that the introduction of the neural stack memory will turn out to be substantial for any biological 'hierarchical processor' and the minimalist design of the model suggests a quest for similar, realistic neural architectures. Copyright © 2010 Elsevier Ltd. All rights reserved.
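
    The gated push/pop mechanism can be caricatured in a few lines: scalar gate signals open either the downward (push) or upward (pop) connections between the layers of the stack. A toy sketch, not the authors' biologically constrained model:

```python
import numpy as np

class NeuralStack:
    """Layered memory whose push/pop operations are opened by scalar gate
    signals, caricaturing gated connections between stack layers."""
    def __init__(self, depth, width):
        self.mem = np.zeros((depth, width))

    def step(self, push_gate, pop_gate, item=None):
        if push_gate > 0.5:                       # downward connections open
            self.mem[1:] = self.mem[:-1].copy()   # every layer shifts down
            self.mem[0] = item
        elif pop_gate > 0.5:                      # upward connections open
            top = self.mem[0].copy()
            self.mem[:-1] = self.mem[1:].copy()
            self.mem[-1] = 0.0
            return top
        return self.mem[0].copy()

stack = NeuralStack(depth=8, width=3)
a, b = np.array([1.0, 0, 0]), np.array([0, 1.0, 0])
stack.step(1.0, 0.0, a)        # push at an opening constituent
stack.step(1.0, 0.0, b)        # deeper centre-embedding: push again
first = stack.step(0.0, 1.0)   # innermost constituent pops first
second = stack.step(0.0, 1.0)  # then the outer one
```

    The last-in-first-out order is exactly what centre-embedded recursion requires: the innermost constituent is resolved first.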

  3. Global exponential stability of BAM neural networks with time-varying delays and diffusion terms

    NASA Astrophysics Data System (ADS)

    Wan, Li; Zhou, Qinghua

    2007-11-01

    The stability of bidirectional associative memory (BAM) neural networks with time-varying delays and diffusion terms is considered. By using the method of variation of parameters and inequality techniques, delay-independent sufficient conditions are established that guarantee the uniqueness and global exponential stability of the equilibrium solution of such networks.

  4. Brain-Based Devices for Neuromorphic Computer Systems

    DTIC Science & Technology

    2013-07-01

    Excerpts: "… and Deco, G. (2012). Effective Visual Working Memory Capacity: An Emergent Effect from the Neural Dynamics in an Attractor Network. PLoS ONE 7, e42719 …"; "… models, apply them to a recognition task, and to demonstrate a working memory. In the course of this work a new analytical method for spiking data was …". From the table of contents: 3.4 Spiking Neural Model Simulation of Working Memory; 3.5 A Novel Method for Analysis.

  5. Thinking about thinking: Neural mechanisms and effects on memory.

    PubMed

    Bonhage, Corinna; Weber, Friederike; Exner, Cornelia; Kanske, Philipp

    2016-02-15

    It is a well-established finding that memory encoding is impaired if an external secondary task (e.g. tone discrimination) is performed simultaneously. Yet, while studying we are also often engaged in internal secondary tasks such as planning, ruminating, or daydreaming. It remains unclear whether such a secondary internal task has similar effects on memory and what the neural mechanisms underlying such an influence are. We therefore measured participants' blood oxygenation level dependent responses while they learned word-pairs and simultaneously performed different types of secondary tasks (i.e., internal, external, and control). Memory performance decreased in both internal and external secondary tasks compared to the easy control condition. However, while the external task reduced activity in memory-encoding related regions (hippocampus), the internal task increased neural activity in brain regions associated with self-reflection (anterior medial prefrontal cortex), as well as in regions associated with performance monitoring and the perception of salience (anterior insula, dorsal anterior cingulate cortex). Resting-state functional connectivity analyses confirmed that anterior medial prefrontal cortex and anterior insula/dorsal anterior cingulate cortex are part of the default mode network and salience network, respectively. In sum, a secondary internal task impairs memory performance just as a secondary external task, but operates through different neural mechanisms. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. A review on the neural bases of episodic odor memory: from laboratory-based to autobiographical approaches

    PubMed Central

    Saive, Anne-Lise; Royet, Jean-Pierre; Plailly, Jane

    2014-01-01

    Odors are powerful cues that trigger episodic memories. However, in light of the amount of behavioral data describing the characteristics of episodic odor memory, the paucity of information available on the neural substrates of this function is startling. Furthermore, the diversity of experimental paradigms complicates the identification of a generic episodic odor memory network. We conduct a systematic review of the literature depicting the current state of the neural correlates of episodic odor memory in healthy humans by placing a focus on the experimental approaches. Functional neuroimaging data are introduced by a brief characterization of the memory processes investigated. We present and discuss laboratory-based approaches, such as odor recognition and odor associative memory, and autobiographical approaches, such as the evaluation of odor familiarity and odor-evoked autobiographical memory. We then suggest the development of new laboratory-ecological approaches allowing for the controlled encoding and retrieval of specific multidimensional events that could open up new prospects for the comprehension of episodic odor memory and its neural underpinnings. While large conceptual differences distinguish experimental approaches, the overview of the functional neuroimaging findings suggests relatively stable neural correlates of episodic odor memory. PMID:25071494

  7. Neural-activity mapping of memory-based dominance in the crow: neural networks integrating individual discrimination and social behaviour control.

    PubMed

    Nishizawa, K; Izawa, E-I; Watanabe, S

    2011-12-01

    Large-billed crows (Corvus macrorhynchos), highly social birds, form stable dominance relationships based on the memory of win/loss outcomes of first encounters and on individual discrimination. This socio-cognitive behaviour predicts the existence of neural mechanisms for the integration of social behaviour control and individual discrimination. This study aimed to elucidate the neural substrates of memory-based dominance in crows. First, the formation of dominance relationships was confirmed between males in a dyadic encounter paradigm. Next, we examined whether neural activities in 22 focal nuclei of the pallium and subpallium were correlated with social behaviour and stimulus familiarity after exposure to dominant/subordinate familiar individuals and unfamiliar conspecifics. Neural activity was determined by measuring the expression level of the immediate-early-gene (IEG) protein Zenk. Crows displayed aggressive and/or submissive behaviour towards opponents less frequently but more discriminatively in subsequent encounters, suggesting stable dominance based on memory of win/loss outcomes of the first encounters and on individual discrimination. Neural correlates of aggressive and submissive behaviour were found in the limbic subpallium, including the septum, bed nucleus of the striae terminalis (BST), and nucleus taeniae of the amygdala (TnA); correlates of the familiarity factor were also found in BST and TnA. By contrast, correlates of social behaviour were scarce in the pallium, whereas correlates of familiarity with the exposed individuals were identified in the hippocampus, medial meso-/nidopallium, and ventro-caudal nidopallium. Given the anatomical connections and neural response patterns of the focal nuclei, neural networks connecting the pallium and limbic subpallium via the hippocampus could be involved in the integration of individual discrimination and social behaviour control in memory-based dominance in the crow. Copyright © 2011 IBRO. Published by Elsevier Ltd. All rights reserved.

  8. Memory and pattern storage in neural networks with activity dependent synapses

    NASA Astrophysics Data System (ADS)

    Mejias, J. F.; Torres, J. J.

    2009-01-01

    We present recently obtained results on the influence of the interplay between several activity-dependent synaptic mechanisms, such as short-term depression and facilitation, on the maximum memory storage capacity in an attractor neural network [1]. In contrast with synaptic depression, which drastically reduces the capacity of the network to store and retrieve activity patterns [2], synaptic facilitation is able to enhance the memory capacity in different situations. In particular, we find that a convenient balance between depression and facilitation can enhance the memory capacity, reaching maximal values similar to those obtained with static synapses, that is, without activity-dependent processes. We also argue, employing simple arguments, that this level of balance is compatible with experimental data recorded from some cortical areas, where depression and facilitation may play an important role for both memory-oriented tasks and information processing. We conclude that depressing synapses with a certain level of facilitation make it possible to recover the good retrieval properties of networks with static synapses while maintaining the nonlinear properties of dynamic synapses, which are convenient for information processing and coding.
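
    The depression/facilitation interplay discussed above is commonly modeled with Tsodyks-Markram-style dynamics, in which each presynaptic spike depletes a resource variable x (depression) and raises a utilization variable u (facilitation), the transmitted efficacy being u·x. A sketch with illustrative parameter values (not those of Ref. [1]):

```python
import numpy as np

def tm_synapse(spike_times, U=0.2, tau_rec=0.5, tau_fac=1.0):
    """Each spike transmits with efficacy u*x, depletes resources x
    (depression) and increases utilization u (facilitation)."""
    x, u, t_prev = 1.0, U, None
    eff = []
    for t in spike_times:
        if t_prev is not None:
            dt = t - t_prev
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_rec)  # resources recover
            u = U + (u - U) * np.exp(-dt / tau_fac)      # facilitation decays
        u = u + U * (1.0 - u)       # this spike facilitates
        eff.append(u * x)           # transmitted efficacy
        x = x * (1.0 - u)           # and depletes resources
        t_prev = t
    return np.array(eff)

train = np.arange(0.0, 1.0, 0.05)            # 20 Hz spike train, times in s
facilitating = tm_synapse(train, U=0.05)     # low U: efficacy grows at first
depressing = tm_synapse(train, U=0.5, tau_fac=0.05)  # high U: efficacy drops
```

    With small U the second response exceeds the first (facilitation-dominated), while with large U and fast facilitation decay the response collapses (depression-dominated), illustrating the balance the abstract refers to.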

  9. Towards representation of a perceptual color manifold using associative memory for color constancy.

    PubMed

    Seow, Ming-Jung; Asari, Vijayan K

    2009-01-01

    In this paper, we propose the concept of a manifold of color perception through the empirical observation that the center-surround properties of images in a perceptually similar environment define a manifold in a high-dimensional space. Such a manifold representation can be learned using a novel recurrent neural network based learning algorithm. Unlike the conventional recurrent neural network model, in which memory is stored at attractive fixed points at discrete locations in the state space, the dynamics of the proposed learning algorithm represent memory as a nonlinear line of attraction. The region of convergence around the nonlinear line is defined by the statistical characteristics of the training data. This learned manifold can then be used as a basis for color correction of images whose color perception differs from the learned color perception. Experimental results show that the proposed recurrent neural network learning algorithm successfully color-balances the lighting variations in images captured in different environments.

  10. Using imagination to understand the neural basis of episodic memory

    PubMed Central

    Hassabis, Demis; Kumaran, Dharshan; Maguire, Eleanor A.

    2008-01-01

    Functional MRI (fMRI) studies investigating the neural basis of episodic memory recall, and the related task of thinking about plausible personal future events, have revealed a consistent network of associated brain regions. Surprisingly little, however, is understood about the contributions individual brain areas make to the overall recollective experience. In order to examine this, we employed a novel fMRI paradigm where subjects had to imagine fictitious experiences. In contrast to future thinking, this results in experiences that are not explicitly temporal in nature or as reliant on self-processing. By using previously imagined fictitious experiences as a comparison for episodic memories, we identified the neural basis of a key process engaged in common, namely scene construction, involving the generation, maintenance and visualisation of complex spatial contexts. This was associated with activations in a distributed network, including hippocampus, parahippocampal gyrus, and retrosplenial cortex. Importantly, we disambiguated these common effects from episodic memory-specific responses in anterior medial prefrontal cortex, posterior cingulate cortex and precuneus. These latter regions may support self-schema and familiarity processes, and contribute to the brain's ability to distinguish real from imaginary memories. We conclude that scene construction constitutes a common process underlying episodic memory and imagination of fictitious experiences, and suggest it may partially account for the similar brain networks implicated in navigation, episodic future thinking, and the default mode. We suggest that further brain regions are co-opted into this core network in a task-specific manner to support functions such as episodic memory that may have additional requirements. PMID:18160644

  11. Using imagination to understand the neural basis of episodic memory.

    PubMed

    Hassabis, Demis; Kumaran, Dharshan; Maguire, Eleanor A

    2007-12-26

    Functional MRI (fMRI) studies investigating the neural basis of episodic memory recall, and the related task of thinking about plausible personal future events, have revealed a consistent network of associated brain regions. Surprisingly little, however, is understood about the contributions individual brain areas make to the overall recollective experience. To examine this, we used a novel fMRI paradigm in which subjects had to imagine fictitious experiences. In contrast to future thinking, this results in experiences that are not explicitly temporal in nature or as reliant on self-processing. By using previously imagined fictitious experiences as a comparison for episodic memories, we identified the neural basis of a key process engaged in common, namely scene construction, involving the generation, maintenance and visualization of complex spatial contexts. This was associated with activations in a distributed network, including hippocampus, parahippocampal gyrus, and retrosplenial cortex. Importantly, we disambiguated these common effects from episodic memory-specific responses in anterior medial prefrontal cortex, posterior cingulate cortex and precuneus. These latter regions may support self-schema and familiarity processes, and contribute to the brain's ability to distinguish real from imaginary memories. We conclude that scene construction constitutes a common process underlying episodic memory and imagination of fictitious experiences, and suggest it may partially account for the similar brain networks implicated in navigation, episodic future thinking, and the default mode. We suggest that additional brain regions are co-opted into this core network in a task-specific manner to support functions such as episodic memory that may have additional requirements.

  12. Cholinergic modulation of cognitive processing: insights drawn from computational models

    PubMed Central

    Newman, Ehren L.; Gupta, Kishan; Climer, Jason R.; Monaghan, Caitlin K.; Hasselmo, Michael E.

    2012-01-01

    Acetylcholine plays an important role in cognitive function, as shown by pharmacological manipulations that impact working memory, attention, episodic memory, and spatial memory function. Acetylcholine also shows striking modulatory influences on the cellular physiology of hippocampal and cortical neurons. Modeling of neural circuits provides a framework for understanding how the cognitive functions may arise from the influence of acetylcholine on neural and network dynamics. We review the influences of cholinergic manipulations on behavioral performance in working memory, attention, episodic memory, and spatial memory tasks, the physiological effects of acetylcholine on neural and circuit dynamics, and the computational models that provide insight into the functional relationships between the physiology and behavior. Specifically, we discuss the important role of acetylcholine in governing mechanisms of active maintenance in working memory tasks and in regulating network dynamics important for effective processing of stimuli in attention and episodic memory tasks. We also propose that theta rhythm plays a crucial role as an intermediary between the physiological influences of acetylcholine and behavior in episodic and spatial memory tasks. We conclude with a synthesis of the existing modeling work and highlight future directions that are likely to be rewarding given the existing state of the literature for both empiricists and modelers. PMID:22707936

  13. Dynamic Photorefractive Memory and its Application for Opto-Electronic Neural Networks.

    NASA Astrophysics Data System (ADS)

    Sasaki, Hironori

    This dissertation describes the analysis of photorefractive crystal dynamics and its application to opto-electronic neural network systems. The realization of the dynamic photorefractive memory is investigated in terms of the following aspects: fast memory update, uniform grating multiplexing schedules, and the prevention of the partial erasure of existing gratings. The fast memory update is realized by the selective erasure process, which superimposes a new grating on the original one with an appropriate phase shift. The dynamics of the selective erasure process are analyzed using the first-order photorefractive material equations and experimentally confirmed. The effects of beam coupling and fringe bending on the selective erasure dynamics are also analyzed by numerically solving a combination of coupled wave equations and the photorefractive material equation. An incremental recording technique is proposed as a uniform grating multiplexing schedule and compared with the conventional scheduled recording technique in terms of phase distribution in the presence of an external dc electric field, as well as the image gray-scale dependence. The theoretical analysis and experimental results prove the superiority of incremental recording over scheduled recording. A novel recirculating information memory architecture is proposed and experimentally demonstrated to prevent partial degradation of the existing gratings when the memory is accessed. Gratings are circulated through a memory feedback loop based on the incremental recording dynamics, demonstrating robust read/write/erase capabilities. The dynamic photorefractive memory is applied to opto-electronic neural network systems. A module architecture based on the page-oriented dynamic photorefractive memory is proposed. This module architecture can implement two complementary interconnection organizations, fan-in and fan-out. 
The module system scalability and the learning capabilities are theoretically investigated using the photorefractive dynamics described in previous chapters of the dissertation. The implementation of the feed-forward image compression network with 900 input and 9 output neurons with 6-bit interconnection accuracy is experimentally demonstrated. Learning of the Perceptron network that determines sex based on input face images of 900 pixels is also successfully demonstrated.

  14. Network, cellular, and molecular mechanisms underlying long-term memory formation.

    PubMed

    Carasatorre, Mariana; Ramírez-Amaya, Víctor

    2013-01-01

    The neural network stores information through activity-dependent synaptic plasticity occurring in populations of neurons. Persistent forms of synaptic plasticity may account for long-term memory storage, and the most salient forms are changes in the structure of synapses. Theory proposes that encoding should use a sparse code, and evidence suggests that this can be achieved through offline reactivation or by sparse initial recruitment of the network units. This idea implies that in some cases the neurons that underwent structural synaptic plasticity might be a subpopulation of those originally recruited; however, it is not yet clear whether all the neurons recruited during acquisition are the ones that underwent persistent forms of synaptic plasticity and are responsible for memory retrieval. To determine which neural units underlie long-term memory storage, we need to characterize the persistent forms of synaptic plasticity occurring in these neural ensembles, and the best hints so far are the molecular signals underlying structural modifications of the synapses. Structural synaptic plasticity can be achieved by the activity of various signal transduction pathways, including NMDA-CaMKII and ACh-MAPK. These pathways converge on the Rho family of GTPases and the consequent ERK 1/2 activation, which regulates multiple cellular functions such as protein translation, protein trafficking, and gene transcription. The most detailed explanation may come from models that allow us to determine the contribution of each piece of this fascinating puzzle that is the neuron and the neural network.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Y.C.; Doolen, G.; Chen, H.H.

    A high-order correlation tensor formalism for neural networks is described. The model can simulate autoassociative, heteroassociative, as well as multiassociative memory. For the autoassociative model, simulation results show a drastic increase in memory capacity and speed over that of standard Hopfield-like correlation matrix methods. The possibility of using multiassociative memory for a learning universal inference network is also discussed. 9 refs., 5 figs.
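
    The high-order formalism generalizes the Hopfield outer-product rule to correlation tensors; for example, a third-order autoassociative model stores T_ijk = Σ_p p_i p_j p_k and updates s_i = sign(Σ_jk T_ijk s_j s_k). A small sketch of that special case (sizes illustrative, not from the report):

```python
import numpy as np

def train_tensor(patterns):
    """Third-order correlation tensor T_ijk = sum over patterns of p_i p_j p_k."""
    return sum(np.einsum('i,j,k->ijk', p, p, p) for p in patterns).astype(float)

def recall(T, probe, steps=5):
    """Update s_i = sign(sum_jk T_ijk s_j s_k)."""
    s = probe.astype(float)
    for _ in range(steps):
        s = np.where(np.einsum('ijk,j,k->i', T, s, s) >= 0, 1.0, -1.0)
    return s

rng = np.random.default_rng(2)
stored = rng.choice([-1, 1], size=(3, 24)).astype(float)
T = train_tensor(stored)
noisy = stored[1].copy()
noisy[:3] *= -1                 # corrupt 3 of 24 bits
recovered = recall(T, noisy)
```

    The signal term here grows with the square of the overlap rather than linearly, which is the intuition behind the capacity increase reported for higher-order correlations.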

  16. Slave to the Rhythm: Experimental Tests of a Model for Verbal Short-Term Memory and Long-Term Sequence Learning

    ERIC Educational Resources Information Center

    Hitch, Graham J.; Flude, Brenda; Burgess, Neil

    2009-01-01

    Three experiments tested predictions of a neural network model of phonological short-term memory that assumes separate representations for order and item information, order being coded via a context-timing signal [Burgess, N., & Hitch, G. J. (1999). Memory for serial order: A network model of the phonological loop and its timing. "Psychological…

  17. Non-equilibrium physics of neural networks for learning, memory and decision making: landscape and flux perspectives

    NASA Astrophysics Data System (ADS)

    Wang, Jin

    Cognitive behaviors are determined by underlying neural networks. Many brain functions, such as learning and memory, can be described by attractor dynamics. We developed a theoretical framework for global dynamics by quantifying the landscape associated with the steady state probability distributions and steady state curl flux, measuring the degree of non-equilibrium through detailed balance breaking. We found the dynamics and oscillations in human brains responsible for cognitive processes and physiological rhythm regulations are determined not only by the landscape gradient but also by the flux. We found that the flux is closely related to the degrees of the asymmetric connections in neural networks and is the origin of the neural oscillations. The neural oscillation landscape shows a closed-ring attractor topology. The landscape gradient attracts the network down to the ring. The flux is responsible for coherent oscillations on the ring. We suggest the flux may provide the driving force for associations among memories. Both landscape and flux determine the kinetic paths and speed of decision making. The kinetics and global stability of decision making are explored by quantifying the landscape topography through the barrier heights and the mean first passage time. The theoretical predictions are in agreement with experimental observations: more errors occur under time pressure. We quantitatively explored two mechanisms of the speed-accuracy tradeoff with speed emphasis and further uncovered the tradeoffs among speed, accuracy, and energy cost. Our results show an optimal balance among speed, accuracy, and the energy cost in decision making. We uncovered possible mechanisms of changes of mind and how mind changes improve performance in decision processes. Our landscape approach can help facilitate an understanding of the underlying physical mechanisms of cognitive processes and identify the key elements in neural networks.

  18. A Spiking Working Memory Model Based on Hebbian Short-Term Potentiation.

    PubMed

    Fiebig, Florian; Lansner, Anders

    2017-01-04

    A dominant theory of working memory (WM), referred to as the persistent activity hypothesis, holds that recurrently connected neural networks, presumably located in the prefrontal cortex, encode and maintain WM memory items through sustained elevated activity. Reexamination of experimental data has shown that prefrontal cortex activity in single units during delay periods is much more variable than predicted by such a theory and associated computational models. Alternative models of WM maintenance based on synaptic plasticity, such as short-term nonassociative (non-Hebbian) synaptic facilitation, have been suggested but cannot account for encoding of novel associations. Here we test the hypothesis that a recently identified fast-expressing form of Hebbian synaptic plasticity (associative short-term potentiation) is a possible mechanism for WM encoding and maintenance. Our simulations using a spiking neural network model of cortex reproduce a range of cognitive memory effects in the classical multi-item WM task of encoding and immediate free recall of word lists. Memory reactivation in the model occurs in discrete oscillatory bursts rather than as sustained activity. We relate dynamic network activity as well as key synaptic characteristics to electrophysiological measurements. Our findings support the hypothesis that fast Hebbian short-term potentiation is a key WM mechanism. Working memory (WM) is a key component of cognition. Hypotheses about the neural mechanism behind WM are currently under revision. Reflecting recent findings of fast Hebbian synaptic plasticity in cortex, we test whether a cortical spiking neural network model with such a mechanism can learn a multi-item WM task (word list learning). 
We show that our model can reproduce human cognitive phenomena and achieve comparable memory performance in both free and cued recall while being simultaneously compatible with experimental data on structure, connectivity, and neurophysiology of the underlying cortical tissue. These findings are directly relevant to the ongoing paradigm shift in the WM field. Copyright © 2017 Fiebig and Lansner.

  19. A Spiking Working Memory Model Based on Hebbian Short-Term Potentiation

    PubMed Central

    Fiebig, Florian

    2017-01-01

    A dominant theory of working memory (WM), referred to as the persistent activity hypothesis, holds that recurrently connected neural networks, presumably located in the prefrontal cortex, encode and maintain WM memory items through sustained elevated activity. Reexamination of experimental data has shown that prefrontal cortex activity in single units during delay periods is much more variable than predicted by such a theory and associated computational models. Alternative models of WM maintenance based on synaptic plasticity, such as short-term nonassociative (non-Hebbian) synaptic facilitation, have been suggested but cannot account for encoding of novel associations. Here we test the hypothesis that a recently identified fast-expressing form of Hebbian synaptic plasticity (associative short-term potentiation) is a possible mechanism for WM encoding and maintenance. Our simulations using a spiking neural network model of cortex reproduce a range of cognitive memory effects in the classical multi-item WM task of encoding and immediate free recall of word lists. Memory reactivation in the model occurs in discrete oscillatory bursts rather than as sustained activity. We relate dynamic network activity as well as key synaptic characteristics to electrophysiological measurements. Our findings support the hypothesis that fast Hebbian short-term potentiation is a key WM mechanism. SIGNIFICANCE STATEMENT Working memory (WM) is a key component of cognition. Hypotheses about the neural mechanism behind WM are currently under revision. Reflecting recent findings of fast Hebbian synaptic plasticity in cortex, we test whether a cortical spiking neural network model with such a mechanism can learn a multi-item WM task (word list learning). 
We show that our model can reproduce human cognitive phenomena and achieve comparable memory performance in both free and cued recall while being simultaneously compatible with experimental data on structure, connectivity, and neurophysiology of the underlying cortical tissue. These findings are directly relevant to the ongoing paradigm shift in the WM field. PMID:28053032

  20. Predicting local field potentials with recurrent neural networks.

    PubMed

    Kim, Louis; Harer, Jacob; Rangamani, Akshay; Moran, James; Parks, Philip D; Widge, Alik; Eskandar, Emad; Dougherty, Darin; Chin, Sang Peter

    2016-08-01

    We present a recurrent neural network using LSTM (Long Short-Term Memory) units that is capable of modeling and predicting local field potentials (LFPs). We train and test the network on real data recorded from epilepsy patients. We construct networks that predict multi-channel LFPs 1, 10, and 100 milliseconds forward in time. Our results show that prediction using LSTM outperforms regression when predicting 10 and 100 milliseconds forward in time.

  1. Study of the Gray Scale, Polychromatic, Distortion Invariant Neural Networks Using the Ipa Model.

    NASA Astrophysics Data System (ADS)

    Uang, Chii-Maw

    Research in the optical neural network field is primarily motivated by the fact that humans recognize objects better than conventional digital computers do, and by the massively parallel nature inherent to optics. This research represents a continuous effort during the past several years in the exploitation of neurocomputing for pattern recognition. Based on the interpattern association (IPA) model and the Hamming net model, many new systems and applications are introduced. A gray-level discrete associative memory based on object decomposition/composition is proposed for recognizing gray-level patterns. This technique extends the processing ability from binary mode to gray-level mode, and thus the information capacity is increased. Two polychromatic optical neural networks using color liquid crystal television (LCTV) panels for color pattern recognition are introduced. By introducing a color encoding technique in conjunction with the interpattern associative algorithm, a color associative memory is realized. Based on the color decomposition and composition technique, a color exemplar-based Hamming net is built for color image classification. A shift-invariant neural network is presented through use of the translation-invariant property of the modulus of the Fourier transform and the hetero-associative interpattern association (IPA) memory. To extract the main features, a quadrantal sampling method is used to sample the data, which then replace the training patterns; the concept of hetero-associative memory is used to recall distorted objects. A shift- and rotation-invariant neural network using an interpattern hetero-association (IHA) model is presented. To preserve the shift and rotation invariant properties, a set of binarized-encoded circular harmonic expansion (CHE) functions in the Fourier domain is used as the training set. 
We use the shift and symmetric properties of the modulus of the Fourier spectrum to avoid the problem of centering the CHE functions. Almost all neural networks have the positive and negative weights, which increases the difficulty of optical implementation. A method to construct a unipolar IPA IWM is discussed. By searching the redundant interconnection links, an effective way that removes all negative links is discussed.
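
    As a rough illustration of the exemplar-based Hamming net mentioned above, here is a minimal sketch with toy bipolar patterns; numpy's argmax stands in for the MAXNET winner-take-all stage, and the patterns themselves are assumptions, not the thesis's color data:

```python
import numpy as np

def hamming_net_classify(exemplars, pattern):
    """Score each stored exemplar by the number of matching bits
    (the Hamming net's feedforward layer), then pick the winner
    (the MAXNET winner-take-all stage, done here with argmax)."""
    E = np.asarray(exemplars)          # shape (n_classes, n_bits), bipolar +/-1
    x = np.asarray(pattern)
    scores = (E @ x + E.shape[1]) / 2  # matches = (dot + n_bits) / 2 for bipolar codes
    return int(np.argmax(scores)), scores

# Two bipolar exemplars and a noisy copy of the first
exemplars = [[1, 1, -1, -1], [-1, -1, 1, 1]]
winner, scores = hamming_net_classify(exemplars, [1, -1, -1, -1])
```

    Despite one flipped bit, the probe still scores highest against the first exemplar.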

  2. Global exponential stability of BAM neural networks with time-varying delays: The discrete-time case

    NASA Astrophysics Data System (ADS)

    Raja, R.; Marshal Anthoni, S.

    2011-02-01

    This paper deals with the problem of stability analysis for a class of discrete-time bidirectional associative memory (BAM) neural networks with time-varying delays. By employing a Lyapunov functional and the linear matrix inequality (LMI) approach, new sufficient conditions are proposed for the global exponential stability of discrete-time BAM neural networks. The proposed LMI-based results can be easily checked using the LMI control toolbox. Moreover, an example is provided to demonstrate the effectiveness of the proposed method.
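
    The kind of delayed discrete-time BAM dynamics whose stability such conditions certify can be sketched numerically. The following toy simulation (two-neuron layers, a constant rather than time-varying delay, and arbitrary small weights are all assumptions) merely illustrates exponential decay of the state norm; it is not the paper's LMI test:

```python
import numpy as np

def simulate_bam(steps=60, tau=2):
    """Toy discrete-time BAM with x and y layers and a constant delay tau.
    Small leakage and weight magnitudes keep the origin globally
    exponentially stable, so the state norm should shrink geometrically."""
    rng = np.random.default_rng(0)
    A, B = 0.3, 0.3                      # leakage terms, |A|, |B| < 1
    W1 = rng.uniform(-0.2, 0.2, (2, 2))  # cross-layer connection weights
    W2 = rng.uniform(-0.2, 0.2, (2, 2))
    f = np.tanh                          # Lipschitz activation function
    x_hist = [rng.uniform(-1, 1, 2) for _ in range(tau + 1)]
    y_hist = [rng.uniform(-1, 1, 2) for _ in range(tau + 1)]
    norms = []
    for _ in range(steps):
        x_new = A * x_hist[-1] + W1 @ f(y_hist[-1 - tau])
        y_new = B * y_hist[-1] + W2 @ f(x_hist[-1 - tau])
        x_hist.append(x_new)
        y_hist.append(y_new)
        norms.append(np.linalg.norm(np.concatenate([x_new, y_new])))
    return norms

norms = simulate_bam()
```

    With these contraction-friendly parameters the norm collapses toward zero within a few dozen steps.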

  3. Inhibition delay increases neural network capacity through Stirling transform.

    PubMed

    Nogaret, Alain; King, Alastair

    2018-03-01

    Inhibitory neural networks are found to encode high volumes of information through delayed inhibition. We show that inhibition delay increases storage capacity through a Stirling transform of the minimum capacity which stabilizes locally coherent oscillations. We obtain both the exact and asymptotic formulas for the total number of dynamic attractors. Our results predict a (ln2)^{-N}-fold increase in capacity for an N-neuron network and demonstrate high-density associative memories which host a maximum number of oscillations in analog neural devices.
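
    The Stirling transform invoked here maps a sequence a_k to b_n = sum_k S(n, k) a_k, where S(n, k) are Stirling numbers of the second kind. A small sketch of the transform itself (the all-ones input sequence is an illustration, not the paper's minimum-capacity sequence):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(n, k):
    """Stirling numbers of the second kind via the standard recurrence
    S(n, k) = k*S(n-1, k) + S(n-1, k-1)."""
    if n == k == 0:
        return 1
    if n == 0 or k == 0:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

def stirling_transform(a):
    """b_n = sum_k S(n, k) * a_k, the transform applied to the
    capacity sequence in the paper (illustrative input here)."""
    return [sum(stirling2(n, k) * a[k] for k in range(len(a)))
            for n in range(len(a))]

# The transform of the all-ones sequence yields the Bell numbers.
bell = stirling_transform([1] * 6)
```

    The Bell-number output (1, 1, 2, 5, 15, 52, ...) is a standard sanity check for the recurrence.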

  4. Inhibition delay increases neural network capacity through Stirling transform

    NASA Astrophysics Data System (ADS)

    Nogaret, Alain; King, Alastair

    2018-03-01

    Inhibitory neural networks are found to encode high volumes of information through delayed inhibition. We show that inhibition delay increases storage capacity through a Stirling transform of the minimum capacity which stabilizes locally coherent oscillations. We obtain both the exact and asymptotic formulas for the total number of dynamic attractors. Our results predict a (ln 2)^{-N}-fold increase in capacity for an N-neuron network and demonstrate high-density associative memories which host a maximum number of oscillations in analog neural devices.

  5. Impact of leakage delay on bifurcation in high-order fractional BAM neural networks.

    PubMed

    Huang, Chengdai; Cao, Jinde

    2018-02-01

    The effects of leakage delay on the dynamics of integer-order neural networks have lately received considerable attention. It has been confirmed that fractional neural networks more appropriately capture the dynamical properties of neural networks, but results on fractional neural networks with leakage delay are relatively few. This paper concentrates on the issue of bifurcation for high-order fractional bidirectional associative memory (BAM) neural networks involving leakage delay, making a first attempt to tackle the stability and bifurcation of such networks with time delay in the leakage terms. The conditions for the appearance of bifurcation in the proposed systems with leakage delay are first established by adopting the time delay as a bifurcation parameter. Then, bifurcation criteria for the system without leakage delay are acquired. Comparative analysis reveals that the stability of the proposed high-order fractional neural networks is critically weakened by leakage delay, which therefore cannot be overlooked. Numerical examples are exhibited to attest to the efficiency of the theoretical results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Musical and verbal semantic memory: two distinct neural networks?

    PubMed

    Groussard, M; Viader, F; Hubert, V; Landeau, B; Abbas, A; Desgranges, B; Eustache, F; Platel, H

    2010-02-01

    Semantic memory has been investigated in numerous neuroimaging and clinical studies, most of which have used verbal or visual, but only very seldom, musical material. Clinical studies have suggested that there is a relative neural independence between verbal and musical semantic memory. In the present study, "musical semantic memory" is defined as memory for "well-known" melodies without any knowledge of the spatial or temporal circumstances of learning, while "verbal semantic memory" corresponds to general knowledge about concepts, again without any knowledge of the spatial or temporal circumstances of learning. Our aim was to compare the neural substrates of musical and verbal semantic memory by administering the same type of task in each modality. We used high-resolution PET H(2)O(15) to observe 11 young subjects performing two main tasks: (1) a musical semantic memory task, where the subjects heard the first part of familiar melodies and had to decide whether the second part they heard matched the first, and (2) a verbal semantic memory task with the same design, but where the material consisted of well-known expressions or proverbs. The musical semantic memory condition activated the superior temporal area and inferior and middle frontal areas in the left hemisphere and the inferior frontal area in the right hemisphere. The verbal semantic memory condition activated the middle temporal region in the left hemisphere and the cerebellum in the right hemisphere. We found that the verbal and musical semantic processes activated a common network extending throughout the left temporal neocortex. In addition, there was a material-dependent topographical preference within this network, with predominantly anterior activation during musical tasks and predominantly posterior activation during semantic verbal tasks. Copyright (c) 2009 Elsevier Inc. All rights reserved.

  7. Gender differences in the functional neuroanatomy of emotional episodic autobiographical memory.

    PubMed

    Piefke, Martina; Weiss, Peter H; Markowitsch, Hans J; Fink, Gereon R

    2005-04-01

    Autobiographical memory is based on interactions between episodic memory contents, associated emotions, and a sense of self-continuity along the time axis of one's life. The functional neuroanatomy subserving autobiographical memory is known to include prefrontal, medial and lateral temporal, as well as retrosplenial brain areas; however, whether gender differences exist in neural correlates of autobiographical memory remains to be clarified. We reanalyzed data from a previous functional magnetic resonance imaging (fMRI) experiment to investigate gender-related differences in the neural bases of autobiographical memories with differential remoteness and emotional valence. On the behavioral level, there were no significant gender differences in memory performance or emotional intensity of memories. Activations common to males and females during autobiographical memory retrieval were observed in a bilateral network of brain areas comprising medial and lateral temporal regions, including hippocampal and parahippocampal structures, posterior cingulate, as well as prefrontal cortex. In males (relative to females), all types of autobiographical memories investigated were associated with differential activation of the left parahippocampal gyrus. By contrast, right dorsolateral prefrontal cortex was activated differentially by females. In addition, the right insula was activated differentially in females during remote and negative memory retrieval. The data show gender-related differential neural activations within the network subserving autobiographical memory in both genders. We suggest that the differential activations may reflect gender-specific cognitive strategies during access to autobiographical memories that do not necessarily affect the behavioral level of memory performance and emotionality. (c) 2005 Wiley-Liss, Inc.

  8. From network heterogeneities to familiarity detection and hippocampal memory management

    PubMed Central

    Wang, Jane X.; Poe, Gina; Zochowski, Michal

    2009-01-01

    Hippocampal-neocortical interactions are key to the rapid formation of novel associative memories in the hippocampus and consolidation to long term storage sites in the neocortex. We investigated the role of network correlates during information processing in hippocampal-cortical networks. We found that changes in the intrinsic network dynamics due to the formation of structural network heterogeneities alone act as a dynamical and regulatory mechanism for stimulus novelty and familiarity detection, thereby controlling memory management in the context of memory consolidation. This network dynamic, coupled with an anatomically established feedback between the hippocampus and the neocortex, recovered heretofore unexplained properties of neural activity patterns during memory management tasks which we observed during sleep in multiunit recordings from behaving animals. Our simple dynamical mechanism shows an experimentally matched progressive shift of memory activation from the hippocampus to the neocortex and thus provides the means to achieve an autonomous off-line progression of memory consolidation. PMID:18999453

  9. Constructive autoassociative neural network for facial recognition.

    PubMed

    Fernandes, Bruno J T; Cavalcanti, George D C; Ren, Tsang I

    2014-01-01

    Autoassociative artificial neural networks have been used in many different computer vision applications. However, it is difficult to define the most suitable neural network architecture because this definition is based on previous knowledge and depends on the problem domain. To address this problem, we propose a constructive autoassociative neural network called CANet (Constructive Autoassociative Neural Network). CANet integrates the concepts of receptive fields and autoassociative memory in a dynamic architecture that changes the configuration of the receptive fields by adding new neurons in the hidden layer, while a pruning algorithm removes neurons from the output layer. Neurons in the CANet output layer present lateral inhibitory connections that improve the recognition rate. Experiments in face recognition and facial expression recognition show that the CANet outperforms other methods presented in the literature.

  10. Deep recurrent neural network reveals a hierarchy of process memory during dynamic natural vision.

    PubMed

    Shi, Junxing; Wen, Haiguang; Zhang, Yizhen; Han, Kuan; Liu, Zhongming

    2018-05-01

    The human visual cortex extracts both spatial and temporal visual features to support perception and guide behavior. Deep convolutional neural networks (CNNs) provide a computational framework to model cortical representation and organization for spatial visual processing, but are unable to explain how the brain processes temporal information. To overcome this limitation, we extended a CNN by adding recurrent connections to different layers of the CNN to allow spatial representations to be remembered and accumulated over time. The extended model, or the recurrent neural network (RNN), embodied a hierarchical and distributed model of process memory as an integral part of visual processing. Unlike the CNN, the RNN learned spatiotemporal features from videos to enable action recognition. The RNN better predicted cortical responses to natural movie stimuli than the CNN, at all visual areas, especially those along the dorsal stream. As a fully observable model of visual processing, the RNN also revealed a cortical hierarchy of temporal receptive windows, dynamics of process memory, and spatiotemporal representations. These results support the hypothesis of process memory, and demonstrate the potential of using the RNN for in-depth computational understanding of dynamic natural vision. © 2018 Wiley Periodicals, Inc.

  11. NDRAM: nonlinear dynamic recurrent associative memory for learning bipolar and nonbipolar correlated patterns.

    PubMed

    Chartier, Sylvain; Proulx, Robert

    2005-11-01

    This paper presents a new unsupervised attractor neural network, which, contrary to optimal linear associative memory models, is able to develop nonbipolar attractors as well as bipolar attractors. Moreover, the model is able to develop less spurious attractors and has a better recall performance under random noise than any other Hopfield type neural network. Those performances are obtained by a simple Hebbian/anti-Hebbian online learning rule that directly incorporates feedback from a specific nonlinear transmission rule. Several computer simulations show the model's distinguishing properties.
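
    NDRAM's online Hebbian/anti-Hebbian rule refines the classical Hopfield-type baseline it is compared against; the following is a minimal sketch of that baseline only (outer-product learning plus sign-activation recall on toy bipolar patterns), not NDRAM itself:

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian outer-product learning: W = (1/n) * sum of p p^T,
    with the self-connections zeroed out."""
    P = np.asarray(patterns, dtype=float)
    n = P.shape[1]
    W = (P.T @ P) / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=10):
    """Iterate the sign-activation dynamics toward a fixed point (attractor)."""
    x = np.asarray(x, dtype=float)
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1.0   # break ties consistently
    return x

patterns = [[1, 1, 1, -1, -1, -1], [1, -1, 1, -1, 1, -1]]
W = train_hebbian(patterns)
noisy = [1, 1, 1, -1, -1, 1]      # last bit flipped from the first pattern
restored = recall(W, noisy)
```

    One flipped bit is corrected in a single update, and the stored pattern is a fixed point of the dynamics.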

  12. Feedforward, high density, programmable read only neural network based memory system

    NASA Technical Reports Server (NTRS)

    Daud, Taher; Moopenn, Alex; Lamb, James; Thakoor, Anil; Khanna, Satish

    1988-01-01

    Neural network-inspired, nonvolatile, programmable associative memory using thin-film technology is demonstrated. The details of the architecture, which uses programmable resistive connection matrices in synaptic arrays and current summing and thresholding amplifiers as neurons, are described. Several synapse configurations for a high-density array of a binary connection matrix are also described. Test circuits are evaluated for operational feasibility and to demonstrate the speed of the read operation. The results are discussed to highlight the potential for a read data rate exceeding 10 megabits/sec.

  13. Neural Correlates Associated with Successful Working Memory Performance in Older Adults as Revealed by Spatial ICA

    PubMed Central

    Saliasi, Emi; Geerligs, Linda; Lorist, Monicque M.; Maurits, Natasha M.

    2014-01-01

    To investigate which neural correlates are associated with successful working memory performance, fMRI was recorded in healthy younger and older adults during performance on an n-back task with varying task demands. To identify functional networks supporting working memory processes, we used independent component analysis (ICA) decomposition of the fMRI data. Compared to younger adults, older adults showed a larger neural (BOLD) response in the more complex (2-back) than in the baseline (0-back) task condition, in the ventral lateral prefrontal cortex (VLPFC) and in the right fronto-parietal network (FPN). Our results indicated that a higher BOLD response in the VLPFC was associated with increased performance accuracy in older adults, in both the baseline and the more complex task condition. This ‘BOLD-performance’ relationship suggests that the neural correlates linked with successful performance in the older adults are not uniquely related to specific working memory processes present in the complex but not in the baseline task condition. Furthermore, the selective presence of this relationship in older but not in younger adults suggests that increased neural activity in the VLPFC serves a compensatory role in the aging brain which benefits task performance in the elderly. PMID:24911016

  14. Application of neural networks to group technology

    NASA Astrophysics Data System (ADS)

    Caudell, Thomas P.; Smith, Scott D. G.; Johnson, G. C.; Wunsch, Donald C., II

    1991-08-01

    Adaptive resonance theory (ART) neural networks are being developed for application to the industrial engineering problem of group technology--the reuse of engineering designs. Two- and three-dimensional representations of engineering designs are input to ART-1 neural networks to produce groups or families of similar parts. These representations, in their basic form, amount to bit maps of the part, and can become very large when the part is represented in high resolution. This paper describes an enhancement to an algorithmic form of ART-1 that allows it to operate directly on compressed input representations and to generate compressed memory templates. The performance of this compressed algorithm is compared to that of the regular algorithm on real engineering designs, and significant savings in memory storage, as well as a speed-up in execution, are observed. In addition, a 'neural database' system under development is described. This system demonstrates the feasibility of training an ART-1 network to first cluster designs into families, and then to recall the family when presented a similar design. This application is of large practical value to industry, making it possible to avoid duplication of design efforts.
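
    A minimal sketch of the uncompressed ART-1 clustering loop described above (choice function, vigilance test, fast learning). The toy binary "designs", vigilance value, and choice parameter are assumptions; the paper's compressed-representation variant is not shown:

```python
import numpy as np

def art1_cluster(inputs, vigilance=0.7, beta=1.0):
    """Minimal ART-1: each input either resonates with an existing
    category template (and prunes it via fast learning) or commits
    a new category."""
    templates = []
    labels = []
    for x in np.asarray(inputs, dtype=bool):
        placed = False
        # Rank committed categories by the choice function |T & x| / (beta + |T|).
        order = sorted(range(len(templates)),
                       key=lambda j: -np.sum(templates[j] & x)
                                     / (beta + templates[j].sum()))
        for j in order:
            match = np.sum(templates[j] & x) / x.sum()
            if match >= vigilance:                 # vigilance (resonance) test
                templates[j] = templates[j] & x    # fast learning: AND with input
                labels.append(j)
                placed = True
                break
        if not placed:                             # mismatch reset: new category
            templates.append(x.copy())
            labels.append(len(templates) - 1)
    return labels, templates

designs = [[1, 1, 1, 0, 0], [1, 1, 0, 0, 0], [0, 0, 0, 1, 1]]
labels, templates = art1_cluster(designs)
```

    The first two overlapping "designs" land in one family and the disjoint third commits a new one, mirroring the part-family grouping described above.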

  15. Neural coding in graphs of bidirectional associative memories.

    PubMed

    Bouchain, A David; Palm, Günther

    2012-01-24

    In the last years we have developed large neural network models for the realization of complex cognitive tasks in a neural network architecture that resembles the network of the cerebral cortex. We have used networks of several cortical modules that contain two populations of neurons (one excitatory, one inhibitory). The excitatory populations in these so-called "cortical networks" are organized as a graph of Bidirectional Associative Memories (BAMs), where edges of the graph correspond to BAMs connecting two neural modules and nodes of the graph correspond to excitatory populations with associative feedback connections (and inhibitory interneurons). The neural code in each of these modules consists essentially of the firing pattern of the excitatory population, where mainly it is the subset of active neurons that codes the contents to be represented. The overall activity can be used to distinguish different properties of the patterns that are represented which we need to distinguish and control when performing complex tasks like language understanding with these cortical networks. The most important pattern properties or situations are: exactly fitting or matching input, incomplete information or partially matching pattern, superposition of several patterns, conflicting information, and new information that is to be learned. We show simple simulations of these situations in one area or module and discuss how to distinguish these situations based on the overall internal activation of the module. This article is part of a Special Issue entitled "Neural Coding". Copyright © 2011 Elsevier B.V. All rights reserved.
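
    A single edge of such a graph corresponds to one Bidirectional Associative Memory. As a minimal sketch of how one Kosko-style BAM matrix recalls a stored pair from a noisy probe (the bipolar toy pairs and synchronous sign updates are assumptions, not the cortical-network model itself):

```python
import numpy as np

def bam_weights(pairs):
    """Kosko-style BAM weight matrix: sum of outer products x y^T
    over the stored pattern pairs."""
    return sum(np.outer(np.asarray(x), np.asarray(y)) for x, y in pairs)

def bam_recall(W, x, steps=5):
    """Bounce activity between the two layers (x -> y -> x -> ...)
    until the pair stabilizes; zeros are broken toward +1."""
    x = np.asarray(x, dtype=float)
    for _ in range(steps):
        y = np.sign(x @ W)
        y[y == 0] = 1
        x = np.sign(W @ y)
        x[x == 0] = 1
    return x, y

pairs = [([1, 1, 1, -1, -1, -1], [1, -1, 1]),
         ([1, -1, -1, 1, -1, 1], [-1, 1, 1])]
W = bam_weights(pairs)
x_out, y_out = bam_recall(W, [-1, 1, 1, -1, -1, -1])  # first x, one bit flipped
```

    The noisy probe settles onto the first stored (x, y) pair, illustrating hetero-associative recall across one edge of the graph.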

  16. The Cognitive, Perceptual, and Neural Bases of Skilled Performance

    DTIC Science & Technology

    1988-09-01

    shunting, masking field, bidirectional associative memory, Volterra-Lotka, Gilpin-Ayala, and Eigen-Schuster models. The Cohen-Grossberg model thus...field, bidirectional associative memory, Volterra-Lotka, Gilpin-Ayala, and Eigen-Schuster models. A Liapunov functional method is described for...storage by neural networks: A general model and global Liapunov method. In E.L. Schwartz (Ed.), Computational neuroscience. Cambridge, MA: MIT Press

  17. Using Long-Short-Term-Memory Recurrent Neural Networks to Predict Aviation Engine Vibrations

    NASA Astrophysics Data System (ADS)

    ElSaid, AbdElRahman Ahmed

    This thesis examines building viable Recurrent Neural Networks (RNN) using Long Short Term Memory (LSTM) neurons to predict aircraft engine vibrations. The different networks are trained on a large database of flight data records, obtained from an airline, containing flights that suffered from excessive vibration. RNNs can provide a more generalizable and robust method for prediction over analytical calculations of engine vibration, as analytical calculations must be solved iteratively based on specific empirical engine parameters, and this database contains multiple types of engines. Further, LSTM RNNs provide a "memory" of the contribution of previous time series data which can further improve predictions of future vibration values. LSTM RNNs were used over traditional RNNs, as the latter suffer from vanishing/exploding gradients when trained with back propagation. The study managed to predict vibration values for 1, 5, 10, and 20 seconds in the future, with 2.84%, 3.3%, 5.51%, and 10.19% mean absolute error, respectively. These neural networks provide a promising means for the future development of warning systems so that suitable actions can be taken before the occurrence of excess vibration to avoid unfavorable situations during flight.
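
    The "memory" an LSTM neuron carries comes from its gated cell state. A minimal numpy sketch of one LSTM step applied to a toy series (the small random weights and synthetic inputs are assumptions, not the thesis's trained network or flight data):

```python
import numpy as np

def lstm_step(x, h, c, params):
    """One LSTM step: forget/input/output gates plus a candidate cell
    update, the mechanism that accumulates a memory of earlier samples."""
    Wf, Wi, Wo, Wc, bf, bi, bo, bc = params
    z = np.concatenate([h, x])
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    f = sigmoid(Wf @ z + bf)              # forget gate: what to keep of c
    i = sigmoid(Wi @ z + bi)              # input gate: what to write
    o = sigmoid(Wo @ z + bo)              # output gate: what to expose
    c_new = f * c + i * np.tanh(Wc @ z + bc)
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(1)
hidden, inp = 4, 3
params = [rng.normal(0, 0.1, (hidden, hidden + inp)) for _ in range(4)] + \
         [np.zeros(hidden) for _ in range(4)]
h = c = np.zeros(hidden)
for t in range(20):                        # feed a toy vibration-like series
    x = np.array([np.sin(t / 3), np.cos(t / 5), 0.1 * t / 20])
    h, c = lstm_step(x, h, c, params)
```

    Because the cell state is updated multiplicatively through the forget gate rather than repeatedly squashed, gradients through it degrade far more slowly than in a plain RNN.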

  18. [Memory engram of brain circuit].

    PubMed

    Kojima, Hiroto; Sakaguchi, Tetsuya; Ikegaya, Yuji

    2015-05-01

    How are memories stored in the brain and retrieved on demand? This is a frequently asked question. Indeed, we acquire new memories daily and remember old ones. However, how we can memorize one-time experiences is yet to be investigated. Here, we review possible mechanisms by which memories are maintained in neural networks.

  19. Emergent latent symbol systems in recurrent neural networks

    NASA Astrophysics Data System (ADS)

    Monner, Derek; Reggia, James A.

    2012-12-01

    Fodor and Pylyshyn [(1988). Connectionism and cognitive architecture: A critical analysis. Cognition, 28(1-2), 3-71] famously argued that neural networks cannot behave systematically short of implementing a combinatorial symbol system. A recent response from Frank et al. [(2009). Connectionist semantic systematicity. Cognition, 110(3), 358-379] claimed to have trained a neural network to behave systematically without implementing a symbol system and without any in-built predisposition towards combinatorial representations. We believe systems like theirs may in fact implement a symbol system on a deeper and more interesting level: one where the symbols are latent - not visible at the level of network structure. In order to illustrate this possibility, we demonstrate our own recurrent neural network that learns to understand sentence-level language in terms of a scene. We demonstrate our model's learned understanding by testing it on novel sentences and scenes. By paring down our model into an architecturally minimal version, we demonstrate how it supports combinatorial computation over distributed representations by using the associative memory operations of Vector Symbolic Architectures. Knowledge of the model's memory scheme gives us tools to explain its errors and construct superior future models. We show how the model designs and manipulates a latent symbol system in which the combinatorial symbols are patterns of activation distributed across the layers of a neural network, instantiating a hybrid of classical symbolic and connectionist representations that combines advantages of both.

  20. Exponential H(infinity) synchronization of general discrete-time chaotic neural networks with or without time delays.

    PubMed

    Qi, Donglian; Liu, Meiqin; Qiu, Meikang; Zhang, Senlin

    2010-08-01

    This brief studies exponential H(infinity) synchronization of a class of general discrete-time chaotic neural networks with external disturbance. On the basis of the drive-response concept and H(infinity) control theory, and using Lyapunov-Krasovskii (or Lyapunov) functionals, state feedback controllers are established to not only guarantee exponentially stable synchronization between two general chaotic neural networks with or without time delays, but also reduce the effect of external disturbance on the synchronization error to a minimal H(infinity) norm constraint. The proposed controllers can be obtained by solving the convex optimization problems represented by linear matrix inequalities. Most discrete-time chaotic systems with or without time delays, such as Hopfield neural networks, cellular neural networks, bidirectional associative memory networks, recurrent multilayer perceptrons, Cohen-Grossberg neural networks, Chua's circuits, etc., can be transformed into this general chaotic neural network, so that H(infinity) synchronization controllers can be designed in a unified way. Finally, some illustrative examples with their simulations have been utilized to demonstrate the effectiveness of the proposed methods.

  1. A quantum-implementable neural network model

    NASA Astrophysics Data System (ADS)

    Chen, Jialin; Wang, Lingli; Charbon, Edoardo

    2017-10-01

    A quantum-implementable neural network, namely the quantum probability neural network (QPNN) model, is proposed in this paper. QPNN can use quantum parallelism to trace all possible network states to improve the result. Due to its unique quantum nature, this model is robust to several types of quantum noise under certain conditions, and can be efficiently implemented by the qubus quantum computer. Another advantage is that QPNN can be used as memory to retrieve the most relevant data and even to generate new data. The MATLAB experimental results of Iris data classification and MNIST handwriting recognition show that far fewer neuron resources are required in QPNN to obtain a good result than in a classical feedforward neural network. The proposed QPNN model indicates that quantum effects are useful for real-life classification tasks.

  2. Optimal exponential synchronization of general chaotic delayed neural networks: an LMI approach.

    PubMed

    Liu, Meiqin

    2009-09-01

    This paper investigates the optimal exponential synchronization problem of general chaotic neural networks with or without time delays by virtue of Lyapunov-Krasovskii stability theory and the linear matrix inequality (LMI) technique. This general model, which is the interconnection of a linear delayed dynamic system and a bounded static nonlinear operator, covers several well-known neural networks, such as Hopfield neural networks, cellular neural networks (CNNs), bidirectional associative memory (BAM) networks, and recurrent multilayer perceptrons (RMLPs) with or without delays. Using the drive-response concept, time-delay feedback controllers are designed to synchronize two identical chaotic neural networks as quickly as possible. The control design equations are shown to be a generalized eigenvalue problem (GEVP) which can be easily solved by various convex optimization algorithms to determine the optimal control law and the optimal exponential synchronization rate. Detailed comparisons with existing results are made and numerical simulations are carried out to demonstrate the effectiveness of the established synchronization laws.

  3. Selective attention on representations in working memory: cognitive and neural mechanisms.

    PubMed

    Ku, Yixuan

    2018-01-01

    Selective attention and working memory are interdependent core cognitive functions. It is critical to allocate attention to selected targets during capacity-limited working memory processes in order to fulfill goal-directed behavior. Research on both topics has grown exponentially in recent years, and selective attention and working memory are considered to share similar underlying neural mechanisms. Different types of attention orientation in working memory are introduced by distinctive cues; approaches using retrospective cues have recently gained prominence, as they manipulate the representation held in memory rather than the perceptual representation. The cognitive and neural mechanisms of retro-cue effects are reviewed, as well as potential molecular mechanisms. The frontal-parietal network involved in both attention and working memory is also the neural candidate for attention orientation during working memory. Neural oscillations in the gamma and alpha/beta bands may be employed for feedforward and feedback information transfer, respectively, between the sensory cortices and the association cortices. Dopamine and serotonin systems might interact with each other, subserving the communication between memory and attention. In conclusion, representations that attention shifts towards are strengthened, while representations that attention moves away from are degraded. Studies on attention orientation during working memory indicate the flexibility of working memory processes and suggest ways to overcome working memory's limited capacity.

  4. Selective attention on representations in working memory: cognitive and neural mechanisms

    PubMed Central

    2018-01-01

    Selective attention and working memory are interdependent core cognitive functions. It is critical to allocate attention to selected targets during capacity-limited working memory processes in order to fulfill goal-directed behavior. Research on both topics has grown exponentially in recent years, and selective attention and working memory are considered to share similar underlying neural mechanisms. Different types of attention orientation in working memory are introduced by distinctive cues; approaches using retrospective cues have recently gained prominence, as they manipulate the representation held in memory rather than the perceptual representation. The cognitive and neural mechanisms of retro-cue effects are reviewed, as well as potential molecular mechanisms. The frontal-parietal network involved in both attention and working memory is also the neural candidate for attention orientation during working memory. Neural oscillations in the gamma and alpha/beta bands may be employed for feedforward and feedback information transfer, respectively, between the sensory cortices and the association cortices. Dopamine and serotonin systems might interact with each other, subserving the communication between memory and attention. In conclusion, representations that attention shifts towards are strengthened, while representations that attention moves away from are degraded. Studies on attention orientation during working memory indicate the flexibility of working memory processes and suggest ways to overcome working memory's limited capacity. PMID:29629245

  5. Systematic construction and control of stereo nerve vision network in intelligent manufacturing

    NASA Astrophysics Data System (ADS)

    Liu, Hua; Wang, Helong; Guo, Chunjie; Ding, Quanxin; Zhou, Liwei

    2017-10-01

    A systematic method of constructing stereo vision with a neural network is proposed, together with its operation and control mechanism in actual use. The method makes effective use of the neural network's learning and memory functions after training with samples. Moreover, the neural network can learn the nonlinear relationship between the stereoscopic vision system and the interior and exterior orientation elements. Several technical considerations merit attention, including the limiting constraints, the choice of the critical group, operating speed, and operability. The results support our theoretical forecast.

  6. Open quantum generalisation of Hopfield neural networks

    NASA Astrophysics Data System (ADS)

    Rotondo, P.; Marcuzzi, M.; Garrahan, J. P.; Lesanovsky, I.; Müller, M.

    2018-03-01

    We propose a new framework to understand how quantum effects may impact on the dynamics of neural networks. We implement the dynamics of neural networks in terms of Markovian open quantum systems, which allows us to treat thermal and quantum coherent effects on the same footing. In particular, we propose an open quantum generalisation of the Hopfield neural network, the simplest toy model of associative memory. We determine its phase diagram and show that quantum fluctuations give rise to a qualitatively new non-equilibrium phase. This novel phase is characterised by limit cycles corresponding to high-dimensional stationary manifolds that may be regarded as a generalisation of storage patterns to the quantum domain.

  7. Jordan recurrent neural network versus IHACRES in modelling daily streamflows

    NASA Astrophysics Data System (ADS)

    Carcano, Elena Carla; Bartolini, Paolo; Muselli, Marco; Piroddi, Luigi

    2008-12-01

    A study of possible scenarios for modelling streamflow data from daily time series, using artificial neural networks (ANNs), is presented. Particular emphasis is devoted to the reconstruction of drought periods, where water resource management and control are most critical. This paper considers two connectionist models: a feedforward multilayer perceptron (MLP) and a Jordan recurrent neural network (JNN), comparing network performance on real-world data from two small catchments (192 and 69 km² in size) with irregular and torrential regimes. Several network configurations are tested to ensure a good combination of input features (rainfall and previous streamflow data) that capture the variability of the physical processes at work. Tapped delay line (TDL) and memory effect techniques are introduced to recognize and reproduce temporal dependence. Results show poor agreement when using TDL only, but a remarkable improvement can be obtained with JNN and its memory effect procedures, which reproduce the system memory over a catchment more effectively. Furthermore, the IHACRES conceptual model, which relies on both rainfall and temperature input data, is introduced for comparative study. The results suggest that when good input data are unavailable, metric models perform better than conceptual ones and that, in general, it is difficult to justify substantial conceptualization of complex processes.
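
    The memory effect that distinguishes a JNN from a plain MLP comes from context units fed back from the output layer. A minimal sketch of one forward step, with untrained random weights (the layer sizes, names, and context decay factor here are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

def jordan_step(x, context, Wxh, Wch, Who, alpha=0.5):
    """One step of a Jordan network: the hidden layer sees the current
    input plus a context layer fed back from the previous output."""
    h = np.tanh(Wxh @ x + Wch @ context)
    y = Who @ h
    context = alpha * context + y  # self-recurrent context carries memory forward
    return y, context

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 8, 1
Wxh = rng.normal(0, 0.5, (n_hid, n_in))
Wch = rng.normal(0, 0.5, (n_hid, n_out))
Who = rng.normal(0, 0.5, (n_out, n_hid))

context = np.zeros(n_out)
for t in range(5):                 # feed a short input sequence
    x = rng.normal(size=n_in)      # e.g. rainfall plus lagged streamflow
    y, context = jordan_step(x, context, Wxh, Wch, Who)
print(y.shape, context.shape)
```

    Because the context decays rather than being overwritten, past outputs keep influencing the hidden layer, which is how the network can reproduce catchment memory beyond what a fixed tapped delay line captures.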

  8. Generalized memory associativity in a network model for the neuroses

    NASA Astrophysics Data System (ADS)

    Wedemann, Roseli S.; Donangelo, Raul; de Carvalho, Luís A. V.

    2009-03-01

    We review concepts introduced in earlier work, where a neural network mechanism describes some mental processes in neurotic pathology and psychoanalytic working-through, as associative memory functioning, according to the findings of Freud. We developed a complex network model, where modules corresponding to sensorial and symbolic memories interact, representing unconscious and conscious mental processes. The model illustrates Freud's idea that consciousness is related to symbolic and linguistic memory activity in the brain. We have introduced a generalization of the Boltzmann machine to model memory associativity. Model behavior is illustrated with simulations and some of its properties are analyzed with methods from statistical mechanics.

  9. Levels of processing and language modality specificity in working memory.

    PubMed

    Rudner, Mary; Karlsson, Thomas; Gunnarsson, Johan; Rönnberg, Jerker

    2013-03-01

    Neural networks underpinning working memory demonstrate sign language specific components possibly related to differences in temporary storage mechanisms. A processing approach to memory systems suggests that the organisation of memory storage is related to type of memory processing as well. In the present study, we investigated for the first time semantic, phonological and orthographic processing in working memory for sign- and speech-based language. During fMRI we administered a picture-based 2-back working memory task with Semantic, Phonological, Orthographic and Baseline conditions to 11 deaf signers and 20 hearing non-signers. Behavioural data showed poorer and slower performance for both groups in Phonological and Orthographic conditions than in the Semantic condition, in line with depth-of-processing theory. An exclusive masking procedure revealed distinct sign-specific neural networks supporting working memory components at all three levels of processing. The overall pattern of sign-specific activations may reflect a relative intermodality difference in the relationship between phonology and semantics influencing working memory storage and processing. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Neural dynamics based on the recognition of neural fingerprints

    PubMed Central

    Carrillo-Medina, José Luis; Latorre, Roberto

    2015-01-01

    Experimental evidence has revealed the existence of characteristic spiking features in different neural signals, e.g., individual neural signatures identifying the emitter or functional signatures characterizing specific tasks. These neural fingerprints may play a critical role in neural information processing, since they allow receptors to discriminate or contextualize incoming stimuli. This could be a powerful strategy for neural systems that greatly enhances the encoding and processing capacity of these networks. Nevertheless, the study of information processing based on the identification of specific neural fingerprints has attracted little attention. In this work, we study (i) the emerging collective dynamics of a network of neurons that communicate with each other by exchange of neural fingerprints and (ii) the influence of the network topology on the self-organizing properties within the network. Complex collective dynamics emerge in the network in the presence of stimuli. Predefined inputs, i.e., specific neural fingerprints, are detected and encoded into coexisting patterns of activity that propagate throughout the network with different spatial organization. The patterns evoked by a stimulus can survive after the stimulation is over, which provides memory mechanisms to the network. The results presented in this paper suggest that neural information processing based on neural fingerprints can be a plausible, flexible, and powerful strategy. PMID:25852531

  11. Functional magnetic resonance imaging study of external source memory and its relation to cognitive insight in non-clinical subjects.

    PubMed

    Buchy, Lisa; Hawco, Colin; Bodnar, Michael; Izadi, Sarah; Dell'Elce, Jennifer; Messina, Katrina; Lepage, Martin

    2014-09-01

    Previous research has linked cognitive insight (a measure of self-reflectiveness and self-certainty) in psychosis with neurocognitive and neuroanatomical disturbances in the fronto-hippocampal neural network. The authors' goal was to use functional magnetic resonance imaging (fMRI) to investigate the neural correlates of cognitive insight during an external source memory paradigm in non-clinical subjects. At encoding, 24 non-clinical subjects travelled through a virtual city where they came across 20 separate people, each paired with a unique object in a distinct location. fMRI data were then acquired while participants viewed images of the city, and completed source recognition memory judgments of where and with whom objects were seen, which is known to involve prefrontal cortex. Cognitive insight was assessed with the Beck Cognitive Insight Scale. External source memory was associated with neural activity in a widespread network consisting of frontal cortex, including ventrolateral prefrontal cortex (VLPFC), temporal and occipital cortices. Activation in VLPFC correlated with higher self-reflectiveness and activation in midbrain correlated with lower self-certainty during source memory attributions. Neither self-reflectiveness nor self-certainty significantly correlated with source memory accuracy. By means of virtual reality and in the context of an external source memory paradigm, the study identified a preliminary functional neural basis for cognitive insight in the VLPFC in healthy people that accords with our fronto-hippocampal theoretical model as well as recent neuroimaging data in people with psychosis. The results may facilitate the understanding of the role of neural mechanisms in psychotic disorders associated with cognitive insight distortions. © 2014 The Authors. Psychiatry and Clinical Neurosciences © 2014 Japanese Society of Psychiatry and Neurology.

  12. Neural correlates of working memory training in HIV patients: study protocol for a randomized controlled trial.

    PubMed

    Chang, L; Løhaugen, G C; Douet, V; Miller, E N; Skranes, J; Ernst, T

    2016-02-02

    Potent combined antiretroviral therapy decreased the incidence and severity of HIV-associated neurocognitive disorders (HAND); however, no specific effective pharmacotherapy exists for HAND. Patients with HIV commonly have deficits in working memory and attention, which may negatively impact many other cognitive domains, leading to HAND. Since HAND may lead to loss of independence in activities of daily living and negative emotional well-being, and incur a high economic burden, effective treatments for HAND are urgently needed. This study aims to determine whether adaptive working memory training might improve cognitive functions and neural network efficiency and possibly decrease neuroinflammation. This study also aims to assess whether subjects with the LMX1A-rs4657412 TT(AA) genotype show greater training effects from working memory training than TC(AG) or CC(GG)-carriers. 60 HIV-infected and 60 seronegative control participants will be randomized to a double-blind active-controlled study, using adaptive versus non-adaptive Cogmed Working Memory Training® (CWMT), 20-25 sessions over 5-8 weeks. Each subject will be assessed with near- and far-transfer cognitive tasks, self-reported mood and executive function questionnaires, and blood-oxygenation level-dependent functional MRI during working memory (n-back) and visual attention (ball tracking) tasks, at baseline, 1-month, and 6-months after CWMT. Furthermore, genotyping for LMX1A-rs4657412 will be performed to identify whether subjects with the TT(AA)-genotype show greater gain or neural efficiency after CWMT than those with other genotypes. Lastly, cerebrospinal fluid will be obtained before and after CWMT to explore changes in levels of inflammatory proteins (cytokines and chemokines) and monoamines. Improving working memory in HIV patients, using CWMT, might slow the progression or delay the onset of HAND. 
Observation of decreased brain activation or normalized neural networks, using fMRI, after CWMT would lead to a better understanding of how neural networks are modulated by CWMT. Moreover, validating the greater training gain in subjects with the LMX1A-TT(AA) genotype could lead to a personalized approach for future working memory training studies. Demonstrating and understanding the neural correlates of the efficacy of CWMT in HIV patients could lead to a safe adjunctive therapy for HAND, and possibly other brain disorders. ClinicalTrials.gov, NCT02602418.

  13. Spike timing analysis in neural networks with unsupervised synaptic plasticity

    NASA Astrophysics Data System (ADS)

    Mizusaki, B. E. P.; Agnes, E. J.; Brunnet, L. G.; Erichsen, R., Jr.

    2013-01-01

    The synaptic plasticity rules that sculpt a neural network architecture are key elements for understanding cortical processing, as they may explain the emergence of stable, functional activity while avoiding runaway excitation. For an associative memory framework, they should be built in such a way as to enable the network to reproduce a robust spatio-temporal trajectory in response to an external stimulus. Still, how these rules may be implemented in recurrent networks, and how they relate to the networks' capacity for pattern recognition, remains unclear. We studied the effects of three phenomenological unsupervised rules in sparsely connected recurrent networks for associative memory: spike-timing-dependent plasticity, short-term plasticity, and a homeostatic scaling rule. The stability of the system is monitored during the learning process as the mean firing rate converges to a value determined by the homeostatic scaling. Afterwards, it is possible to measure the recovery efficiency of the activity following each initial stimulus. This is evaluated by a measure of the correlation between spike firing times, and we analysed the full memory separation capacity and the limitations of this system.
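
    The pairwise spike-timing-dependent plasticity kernel that such rules are typically built on can be sketched as follows (parameter values are illustrative, not those used in the paper):

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pairwise STDP kernel; dt = t_post - t_pre in ms.
    Pre-before-post pairings (dt > 0) potentiate the synapse,
    post-before-pre pairings (dt < 0) depress it, with
    exponentially decaying influence as the spikes move apart."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    elif dt < 0:
        return -a_minus * math.exp(dt / tau)
    return 0.0

print(stdp_dw(10.0) > 0)    # causal pairing -> LTP
print(stdp_dw(-10.0) < 0)   # anti-causal pairing -> LTD
```

    Making depression slightly stronger than potentiation (a_minus > a_plus), as above, is one common way such rules avoid runaway excitation; the paper additionally relies on short-term plasticity and homeostatic scaling for stability.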

  14. A generalized LSTM-like training algorithm for second-order recurrent neural networks

    PubMed Central

    Monner, Derek; Reggia, James A.

    2011-01-01

    The Long Short-Term Memory (LSTM) is a second-order recurrent neural network architecture that excels at storing sequential short-term memories and retrieving them many time-steps later. LSTM’s original training algorithm provides the important properties of spatial and temporal locality, which are missing from other training approaches, at the cost of limiting its applicability to a small set of network architectures. Here we introduce the Generalized Long Short-Term Memory (LSTM-g) training algorithm, which provides LSTM-like locality while being applicable without modification to a much wider range of second-order network architectures. With LSTM-g, all units have an identical set of operating instructions for both activation and learning, subject only to the configuration of their local environment in the network; this is in contrast to the original LSTM training algorithm, where each type of unit has its own activation and training instructions. When applied to LSTM architectures with peephole connections, LSTM-g takes advantage of an additional source of back-propagated error which can enable better performance than the original algorithm. Enabled by the broad architectural applicability of LSTM-g, we demonstrate that training recurrent networks engineered for specific tasks can produce better results than single-layer networks. We conclude that LSTM-g has the potential to both improve the performance and broaden the applicability of spatially and temporally local gradient-based training algorithms for recurrent neural networks. PMID:21803542
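
    The forward pass of an LSTM cell with the peephole connections mentioned above can be sketched as follows (a minimal illustration with random weights; the parameter names and shapes are assumptions, and no training, let alone LSTM-g itself, is shown):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, p):
    """One forward step of an LSTM cell with peephole connections:
    the gates inspect the cell state c directly (the 'peepholes')."""
    i = sigmoid(p["Wi"] @ x + p["Ui"] @ h + p["pi"] * c + p["bi"])  # input gate
    f = sigmoid(p["Wf"] @ x + p["Uf"] @ h + p["pf"] * c + p["bf"])  # forget gate
    c = f * c + i * np.tanh(p["Wc"] @ x + p["Uc"] @ h + p["bc"])    # cell update
    o = sigmoid(p["Wo"] @ x + p["Uo"] @ h + p["po"] * c + p["bo"])  # output gate peeks at new c
    h = o * np.tanh(c)
    return h, c

rng = np.random.default_rng(0)
n_x, n_h = 4, 3
p = {k: rng.normal(0, 0.1, (n_h, n_x)) for k in ("Wi", "Wf", "Wc", "Wo")}
p.update({k: rng.normal(0, 0.1, (n_h, n_h)) for k in ("Ui", "Uf", "Uc", "Uo")})
p.update({k: rng.normal(0, 0.1, n_h) for k in ("pi", "pf", "po", "bi", "bf", "bc", "bo")})

h = c = np.zeros(n_h)
for t in range(6):                 # carry state across a short sequence
    h, c = lstm_step(rng.normal(size=n_x), h, c, p)
print(h.shape, c.shape)
```

    The peephole terms are the "additional source of back-propagated error" the abstract refers to: because the gates read c, gradient can flow to them through the cell state.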

  15. Using an Extended Kalman Filter Learning Algorithm for Feed-Forward Neural Networks to Describe Tracer Correlations

    NASA Technical Reports Server (NTRS)

    Lary, David J.; Mussa, Yussuf

    2004-01-01

    In this study a new extended Kalman filter (EKF) learning algorithm for feed-forward neural networks (FFN) is used. With the EKF approach, the training of the FFN can be seen as state estimation for a non-linear stationary process. The EKF method gives excellent convergence performances provided that there is enough computer core memory and that the machine precision is high. Neural networks are ideally suited to describe the spatial and temporal dependence of tracer-tracer correlations. The neural network performs well even in regions where the correlations are less compact and normally a family of correlation curves would be required. For example, the CH4-N2O correlation can be well described using a neural network trained with the latitude, pressure, time of year, and CH4 volume mixing ratio (v.m.r.). The neural network was able to reproduce the CH4-N2O correlation with a correlation coefficient between simulated and training values of 0.9997. The neural network Fortran code used is available for download.
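
    The idea of EKF training, treating the weight vector as the state of a stationary process and each training pair as a measurement, can be sketched on a toy one-neuron model (a hedged illustration with a numerical Jacobian; the paper's network, data, and exact formulation are not reproduced):

```python
import numpy as np

def ekf_train(h, w, X, Y, P0=1.0, R=0.01, Q=1e-6, eps=1e-6):
    """EKF weight estimation: state transition is the identity
    (w_k = w_{k-1}), and h(w, x) is the scalar network output."""
    n = w.size
    P = np.eye(n) * P0                      # weight covariance
    for x, y in zip(X, Y):
        # numerical Jacobian of the output w.r.t. the weights
        H = np.zeros((1, n))
        for i in range(n):
            dw = np.zeros(n); dw[i] = eps
            H[0, i] = (h(w + dw, x) - h(w - dw, x)) / (2 * eps)
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T / S                     # Kalman gain
        w = w + (K * (y - h(w, x))).ravel() # measurement update
        P = P - K @ H @ P + np.eye(n) * Q
    return w

# Fit y = tanh(w0*x + w1) to noise-free data generated with w = (2.0, -0.5)
h = lambda w, x: np.tanh(w[0] * x + w[1])
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, 200)
Y = np.tanh(2.0 * X - 0.5)
w = ekf_train(h, np.zeros(2), X, Y)
print("fitted weights:", np.round(w, 2))
```

    For real networks the Jacobian is obtained by backpropagation rather than finite differences, and the covariance P is the reason the abstract notes that enough core memory and high machine precision are required: P grows quadratically with the number of weights.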

  16. A phenomenological memristor model for short-term/long-term memory

    NASA Astrophysics Data System (ADS)

    Chen, Ling; Li, Chuandong; Huang, Tingwen; Ahmad, Hafiz Gulfam; Chen, Yiran

    2014-08-01

    The memristor is considered to be a natural electrical synapse because of its distinct memory property and nanoscale size. In recent years, more and more behaviors similar to those of biological synapses have been observed in memristors, e.g., short-term memory (STM) and long-term memory (LTM). Traditional mathematical models are unable to capture these newly emerging behaviors. In this article, an updated phenomenological model based on the Hewlett-Packard (HP) Labs model is proposed to capture such behaviors. The new dynamical memristor model, with an improved ion diffusion term, can emulate synapse behavior with a forgetting effect and exhibit the transformation between STM and LTM. Further, this model can be used to build new types of neural networks with forgetting ability, like biological systems, and this is verified by our experiment with a Hopfield neural network.
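
    The forgetting effect can be illustrated with a toy state equation in which a drift term writes the memristor state and a diffusion-like decay term erases it (the form dw/dt = alpha*v - w/tau and all constants here are illustrative assumptions, not the model proposed in the article):

```python
import numpy as np

def simulate(v, dt=1e-3, alpha=5.0, tau=0.5, w0=0.0):
    """Euler-integrate dw/dt = alpha*v(t) - w/tau: the input voltage
    drives the state up (writing), the decay term forgets it (STM)."""
    w = w0
    trace = []
    for vt in v:
        w += dt * (alpha * vt - w / tau)
        w = min(max(w, 0.0), 1.0)   # state variable bounded in [0, 1]
        trace.append(w)
    return np.array(trace)

# One 0.2 s voltage pulse followed by 0.8 s of rest
v = np.concatenate([np.full(200, 1.0), np.zeros(800)])
trace = simulate(v)
print(trace[199] > trace[-1])  # state decays once stimulation ends -> True
```

    In the article's model the decay is additionally state-dependent, so repeated stimulation pushes the device into a slowly-forgetting regime, which is the STM-to-LTM transformation; this sketch shows only the STM side.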

  17. An Interactive Simulation Program for Exploring Computational Models of Auto-Associative Memory.

    PubMed

    Fink, Christian G

    2017-01-01

    While neuroscience students typically learn about activity-dependent plasticity early in their education, they often struggle to conceptually connect modification at the synaptic scale with network-level neuronal dynamics, not to mention with their own everyday experience of recalling a memory. We have developed an interactive simulation program (based on the Hopfield model of auto-associative memory) that enables the user to visualize the connections generated by any pattern of neural activity, as well as to simulate the network dynamics resulting from such connectivity. An accompanying set of student exercises introduces the concepts of pattern completion, pattern separation, and sparse versus distributed neural representations. Results from a conceptual assessment administered before and after students worked through these exercises indicate that the simulation program is a useful pedagogical tool for illustrating fundamental concepts of computational models of memory.
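
    The auto-associative dynamics behind such a simulation can be reproduced in a few lines of Hopfield-model code (a minimal sketch of pattern completion, not the program described in the abstract):

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian outer-product learning over +/-1 patterns; no self-connections."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / n

def recall(W, state, steps=10):
    """Synchronous updates until a fixed point (pattern completion)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Store one pattern, then complete a corrupted cue
p = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train_hopfield(p[None, :])
cue = p.copy(); cue[:2] *= -1                # flip two bits
print(np.array_equal(recall(W, cue), p))     # -> True
```

    Running the recall from partial cues demonstrates pattern completion; storing several overlapping patterns in the same weights is the natural starting point for the pattern-separation and sparse-versus-distributed exercises the abstract mentions.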

  18. Fault Tolerant Characteristics of Artificial Neural Network Electronic Hardware

    NASA Technical Reports Server (NTRS)

    Zee, Frank

    1995-01-01

    The fault tolerant characteristics of analog-VLSI artificial neural network (with 32 neurons and 532 synapses) chips are studied by exposing them to high energy electrons, high energy protons, and gamma ionizing radiations under biased and unbiased conditions. The biased chips became nonfunctional after receiving a cumulative dose of less than 20 krads, while the unbiased chips only started to show degradation with a cumulative dose of over 100 krads. As the total radiation dose increased, all the components demonstrated graceful degradation. The analog sigmoidal function of the neuron became steeper (increase in gain), current leakage from the synapses progressively shifted the sigmoidal curve, and the digital memory of the synapses and the memory addressing circuits began to gradually fail. From these radiation experiments, we can learn how to modify certain designs of the neural network electronic hardware without using radiation-hardening techniques to increase its reliability and fault tolerance.

  19. Adaptive online inverse control of a shape memory alloy wire actuator using a dynamic neural network

    NASA Astrophysics Data System (ADS)

    Mai, Huanhuan; Song, Gangbing; Liao, Xiaofeng

    2013-01-01

    Shape memory alloy (SMA) actuators exhibit severe hysteresis, a nonlinear behavior, which complicates control strategies and limits their applications. This paper presents a new approach to controlling an SMA actuator through an adaptive inverse model based controller that consists of a dynamic neural network (DNN) identifier, a copy dynamic neural network (CDNN) feedforward term and a proportional (P) feedback action. Unlike fixed hysteresis models used in most inverse controllers, the proposed one uses a DNN to identify online the relationship between the applied voltage to the actuator and the displacement (the inverse model). Even without a priori knowledge of the SMA hysteresis and without pre-training, the proposed controller can precisely control the SMA wire actuator in various tracking tasks by identifying online the inverse model of the SMA actuator. Experiments were conducted, and experimental results demonstrated real-time modeling capabilities of DNN and the performance of the adaptive inverse controller.

  20. Neural Signatures of Controlled and Automatic Retrieval Processes in Memory-based Decision-making.

    PubMed

    Khader, Patrick H; Pachur, Thorsten; Weber, Lilian A E; Jost, Kerstin

    2016-01-01

    Decision-making often requires retrieval from memory. Drawing on the neural ACT-R theory [Anderson, J. R., Fincham, J. M., Qin, Y., & Stocco, A. A central circuit of the mind. Trends in Cognitive Sciences, 12, 136-143, 2008] and other neural models of memory, we delineated the neural signatures of two fundamental retrieval aspects during decision-making: automatic and controlled activation of memory representations. To disentangle these processes, we combined a paradigm developed to examine neural correlates of selective and sequential memory retrieval in decision-making with a manipulation of associative fan (i.e., the decision options were associated with one, two, or three attributes). The results show that both the automatic activation of all attributes associated with a decision option and the controlled sequential retrieval of specific attributes can be traced in material-specific brain areas. Moreover, the two facets of memory retrieval were associated with distinct activation patterns within the frontoparietal network: The dorsolateral prefrontal cortex was found to reflect increasing retrieval effort during both automatic and controlled activation of attributes. In contrast, the superior parietal cortex only responded to controlled retrieval, arguably reflecting the sequential updating of attribute information in working memory. This dissociation in activation pattern is consistent with ACT-R and constitutes an important step toward a neural model of the retrieval dynamics involved in memory-based decision-making.

  1. Robust stability of interval bidirectional associative memory neural network with time delays.

    PubMed

    Liao, Xiaofeng; Wong, Kwok-wo

    2004-04-01

    In this paper, the conventional bidirectional associative memory (BAM) neural network with signal transmission delay is intervalized in order to study the bounded effect of deviations in network parameters and external perturbations. The resultant model is referred to as a novel interval dynamic BAM (IDBAM) model. By combining a number of different Lyapunov functionals with the Razumikhin technique, some sufficient conditions for the existence of a unique equilibrium and for robust stability are derived. These results are fairly general and easily verified. We then extend our investigation to the time-varying delay case and derive some robust stability criteria for BAM with perturbations of time-varying delays. Moreover, our approach to the analysis allows us to consider several different types of activation functions, including piecewise linear sigmoids with bounded activations as well as the usual C1-smooth sigmoids. We believe that these results are of significant value in the design and application of BAM neural networks.
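
    For readers unfamiliar with the underlying model, a discrete BAM without delays recalls stored pattern pairs by bouncing activity between its two layers (a minimal sketch; the paper's interval model and stability analysis are not reproduced):

```python
import numpy as np

def bam_weights(X, Y):
    """Hebbian correlation matrix summed over bipolar pattern pairs."""
    return sum(np.outer(x, y) for x, y in zip(X, Y))

def bam_recall(W, x, steps=10):
    """Alternate x -> y -> x updates until the pair stabilizes."""
    y = np.sign(W.T @ x)
    for _ in range(steps):
        x_new = np.sign(W @ y)
        y_new = np.sign(W.T @ x_new)
        if np.array_equal(y_new, y):
            break
        x, y = x_new, y_new
    return x_new, y_new

X = [np.array([1, -1, 1, -1, 1, -1]), np.array([1, 1, 1, -1, -1, -1])]
Y = [np.array([1, -1, 1, -1]),        np.array([1, 1, -1, -1])]
W = bam_weights(X, Y)
x_out, y_out = bam_recall(W, X[0])
print(np.array_equal(y_out, Y[0]))   # first stored pair recalled -> True
```

    The stable pairs of this iteration are the equilibria whose existence and robustness, under interval parameter uncertainty and transmission delays, the paper establishes.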

  2. Ghosts in the Machine II: Neural Correlates of Memory Interference from the Previous Trial.

    PubMed

    Papadimitriou, Charalampos; White, Robert L; Snyder, Lawrence H

    2017-04-01

    Previous memoranda interfere with working memory. For example, spatial memories are biased toward locations memorized on the previous trial. We predicted, based on attractor network models of memory, that activity in the frontal eye fields (FEFs) encoding a previous target location can persist into the subsequent trial and that this ghost will then bias the readout of the current target. Contrary to this prediction, we find that FEF memory representations appear biased away from (not toward) the previous target location. The behavioral and neural data can be reconciled by a model in which receptive fields of memory neurons converge toward remembered locations, much as receptive fields converge toward attended locations. Convergence increases the resources available to encode the relevant memoranda and decreases overall error in the network, but the residual convergence from the previous trial can give rise to an attractive behavioral bias on the next trial. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  3. Stability and Hopf bifurcation in a simplified BAM neural network with two time delays.

    PubMed

    Cao, Jinde; Xiao, Min

    2007-03-01

    Various local periodic solutions may represent different classes of storage patterns or memory patterns, and arise from the different equilibrium points of neural networks (NNs) by applying Hopf bifurcation technique. In this paper, a bidirectional associative memory NN with four neurons and multiple delays is considered. By applying the normal form theory and the center manifold theorem, analysis of its linear stability and Hopf bifurcation is performed. An algorithm is worked out for determining the direction and stability of the bifurcated periodic solutions. Numerical simulation results supporting the theoretical analysis are also given.

  4. Intellectual system for images restoration

    NASA Astrophysics Data System (ADS)

    Mardare, Igor

    2005-02-01

    Intelligent systems based on artificial neural networks and associative memory can effectively solve problems of image recognition and restoration. Within analytical technologies, however, no single approach to such intellectual problems dominates: the best choice of technology depends on the nature of the problem, the features of the objects, the volume of information available about each object, the number of object classes, and so on. It is therefore necessary to determine the capabilities, preconditions, and fields of application of neural networks and associative memory for the problem of image restoration, and to exploit their complementary benefits for the further development of intelligent systems.

  5. INDIRECT INTELLIGENT SLIDING MODE CONTROL OF A SHAPE MEMORY ALLOY ACTUATED FLEXIBLE BEAM USING HYSTERETIC RECURRENT NEURAL NETWORKS.

    PubMed

    Hannen, Jennifer C; Crews, John H; Buckner, Gregory D

    2012-08-01

    This paper introduces an indirect intelligent sliding mode controller (IISMC) for shape memory alloy (SMA) actuators, specifically a flexible beam deflected by a single offset SMA tendon. The controller manipulates applied voltage, which alters SMA tendon temperature to track reference bending angles. A hysteretic recurrent neural network (HRNN) captures the nonlinear, hysteretic relationship between SMA temperature and bending angle. The variable structure control strategy provides robustness to model uncertainties and parameter variations, while effectively compensating for system nonlinearities, achieving superior tracking compared to an optimized PI controller.

  6. The Effects of Age, Memory Performance, and Callosal Integrity on the Neural Correlates of Successful Associative Encoding

    PubMed Central

    Wang, Tracy H.; Minton, Brian; Muftuler, L. Tugan; Rugg, Michael D.

    2011-01-01

    This functional magnetic resonance imaging study investigated the relationship between the neural correlates of associative memory encoding, callosal integrity, and memory performance in older adults. Thirty-six older and 18 young subjects were scanned while making relational judgments on word pairs. Neural correlates of successful encoding (subsequent memory effects) were identified by contrasting the activity elicited by study pairs that were correctly identified as having been studied together with the activity elicited by pairs wrongly judged to have come from different study trials. Subsequent memory effects common to the 2 age groups were identified in several regions, including left inferior frontal gyrus and bilateral hippocampus. Negative effects (greater activity for forgotten than for remembered items) in default network regions in young subjects were reversed in the older group, and the amount of reversal correlated negatively with memory performance. Additionally, older subjects' subsequent memory effects in right frontal cortex correlated positively with anterior callosal integrity and negatively with memory performance. It is suggested that recruitment of right frontal cortex during verbal memory encoding may reflect the engagement of processes that compensate only partially for age-related neural degradation. PMID:21282317

  7. A theory for how sensorimotor skills are learned and retained in noisy and nonstationary neural circuits

    PubMed Central

    Ajemian, Robert; D’Ausilio, Alessandro; Moorman, Helene; Bizzi, Emilio

    2013-01-01

    During the process of skill learning, synaptic connections in our brains are modified to form motor memories of learned sensorimotor acts. The more plastic the adult brain is, the easier it is to learn new skills or adapt to neurological injury. However, if the brain is too plastic and the pattern of synaptic connectivity is constantly changing, new memories will overwrite old memories, and learning becomes unstable. This trade-off is known as the stability–plasticity dilemma. Here a theory of sensorimotor learning and memory is developed whereby synaptic strengths are perpetually fluctuating without causing instability in motor memory recall, as long as the underlying neural networks are sufficiently noisy and massively redundant. The theory implies two distinct stages of learning—preasymptotic and postasymptotic—because once the error drops to a level comparable to that of the noise-induced error, further error reduction requires altered network dynamics. A key behavioral prediction derived from this analysis is tested in a visuomotor adaptation experiment, and the resultant learning curves are modeled with a nonstationary neural network. Next, the theory is used to model two-photon microscopy data that show, in animals, high rates of dendritic spine turnover, even in the absence of overt behavioral learning. Finally, the theory predicts enhanced task selectivity in the responses of individual motor cortical neurons as the level of task expertise increases. From these considerations, a unique interpretation of sensorimotor memory is proposed—memories are defined not by fixed patterns of synaptic weights but, rather, by nonstationary synaptic patterns that fluctuate coherently. PMID:24324147

  8. Sparse distributed memory and related models

    NASA Technical Reports Server (NTRS)

    Kanerva, Pentti

    1992-01-01

    Described here is sparse distributed memory (SDM) as a neural-net associative memory. It is characterized by two weight matrices and by a large internal dimension - the number of hidden units is much larger than the number of input or output units. The first matrix, A, is fixed and possibly random, and the second matrix, C, is modifiable. The SDM is compared and contrasted to (1) computer memory, (2) correlation-matrix memory, (3) feed-forward artificial neural network, (4) cortex of the cerebellum, (5) Marr and Albus models of the cerebellum, and (6) Albus' cerebellar model arithmetic computer (CMAC). Several variations of the basic SDM design are discussed: the selected-coordinate and hyperplane designs of Jaeckel, the pseudorandom associative neural memory of Hassoun, and SDM with real-valued input variables by Prager and Fallside. SDM research conducted mainly at the Research Institute for Advanced Computer Science (RIACS) in 1986-1991 is highlighted.
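
    The basic read/write cycle with the fixed address matrix A and modifiable counter matrix C can be sketched as follows (the word length, number of hard locations, and activation radius are illustrative choices, chosen so that a small fraction of locations activate per access):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, R = 256, 1000, 111           # word length, hard locations, Hamming radius

A = rng.integers(0, 2, (M, N))     # fixed, random address matrix
C = np.zeros((M, N), dtype=int)    # modifiable counters

def active(addr):
    """Hard locations within Hamming radius R of the address."""
    return np.sum(A != addr, axis=1) <= R

def write(addr, data):
    """Increment counters for 1-bits, decrement for 0-bits, at all
    activated locations (the word is stored distributed over them)."""
    C[active(addr)] += 2 * data - 1

def read(addr):
    """Sum counters over activated locations and threshold at zero."""
    return (C[active(addr)].sum(axis=0) > 0).astype(int)

word = rng.integers(0, 2, N)
write(word, word)                  # autoassociative storage
print(np.array_equal(read(word), word))
```

    Because each word is spread over every activated location and reads pool those counters, a cue within the critical distance of the write address typically still recovers the stored word exactly, which is the noise tolerance that makes SDM an associative memory.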

  9. Crew exploration vehicle (CEV) attitude control using a neural-immunology/memory network

    NASA Astrophysics Data System (ADS)

    Weng, Liguo; Xia, Min; Wang, Wei; Liu, Qingshan

    2015-01-01

    This paper addresses the problem of the crew exploration vehicle (CEV) attitude control. CEVs are NASA's next-generation human spaceflight vehicles, and they use reaction control system (RCS) jet engines for attitude adjustment, which calls for control algorithms for firing the small propulsion engines mounted on vehicles. In this work, the resultant CEV dynamics combines both actuation and attitude dynamics. Therefore, it is highly nonlinear and even coupled with significant uncertainties. To cope with this situation, a neural-immunology/memory network is proposed. It is inspired by the human memory and immune systems. The control network does not rely on precise system dynamics information. Furthermore, the overall control scheme has a simple structure and demands much less computation as compared with most existing methods, making it attractive for real-time implementation. The effectiveness of this approach is also verified via simulation.

  10. Distributed memory approaches for robotic neural controllers

    NASA Technical Reports Server (NTRS)

    Jorgensen, Charles C.

    1990-01-01

    The suitability of two varieties of distributed memory neural networks as trainable controllers for a simulated robotics task is explored. The task requires that two cameras observe an arbitrary target point in space. Coordinates of the target on the camera image planes are passed to a neural controller, which must learn to solve the inverse kinematics of a manipulator with one revolute and two prismatic joints. Two new network designs are evaluated. The first, radial basis sparse distributed memory (RBSDM), approximates functional mappings as sums of multivariate Gaussians centered around previously learned patterns. The second involves variations of adaptive vector quantizers or self-organizing maps. In these networks, random N-dimensional points are given local connectivities; they are then exposed to training patterns and readjust their locations based on a nearest-neighbor rule. Both approaches are tested on their ability to interpolate manipulator joint coordinates for simulated arm movement while simultaneously performing stereo fusion of the camera data. Comparisons are made with classical k-nearest-neighbor pattern recognition techniques.

  11. Neural basis for dynamic updating of object representation in visual working memory.

    PubMed

    Takahama, Sachiko; Miyauchi, Satoru; Saiki, Jun

    2010-02-15

    In the real world, objects have multiple features and change dynamically. Thus, object representations must support dynamic updating and feature binding. Previous studies have investigated the neural activity underlying dynamic updating or feature binding alone, but not both simultaneously. We investigated the neural basis of feature-bound object representation in a dynamically updating situation using a multiple-object permanence tracking task, which required observers to simultaneously process both the maintenance and the dynamic updating of feature-bound objects. Using an event-related design, we separated activities during memory maintenance and change detection. In the search for regions showing selective activation during dynamic updating of feature-bound objects, we identified a network during memory maintenance comprising the inferior precentral sulcus, superior parietal lobule, and middle frontal gyrus. In the change detection period, various prefrontal regions, including the anterior prefrontal cortex, were activated. In updating the representation of dynamically moving objects, the inferior precentral sulcus closely cooperates with the so-called "frontoparietal network", and subregions of the frontoparietal network can be decomposed into those sensitive to spatial updating and those sensitive to feature binding. The anterior prefrontal cortex identifies changes in object representation by comparing memory and perceptual representations rather than maintaining object representations per se, as previously suggested. Copyright 2009 Elsevier Inc. All rights reserved.

  12. Proceedings of the Government Neural Network Applications Workshop Held at Wright-Patterson AFB, Ohio on August 24-26, 1992. Volume 1

    DTIC Science & Technology

    1992-08-01

    history trace of input u(t). (b) A common network structure makes use of the feedforward tapped delay line. For this structure the memory depth D...theories and analyses that will be used worldwide for a long time to come. The reason for this contribution has generally been the government's need to...that emulate the neural reasoning behavior of biological neural systems (e.g. the human brain). As such, they are loosely based on biological neural

  13. Hadoop neural network for parallel and distributed feature selection.

    PubMed

    Hodge, Victoria J; O'Keefe, Simon; Austin, Jim

    2016-06-01

    In this paper, we introduce a theoretical basis for a Hadoop-based neural network for parallel and distributed feature selection in Big Data sets. It is underpinned by an associative memory (binary) neural network which is highly amenable to parallel and distributed processing and fits with the Hadoop paradigm. There are many feature selectors described in the literature which all have various strengths and weaknesses. We present the implementation details of five feature selection algorithms constructed using our artificial neural network framework embedded in Hadoop YARN. Hadoop allows parallel and distributed processing. Each feature selector can be divided into subtasks and the subtasks can then be processed in parallel. Multiple feature selectors can also be processed simultaneously (in parallel), allowing multiple feature selectors to be compared. We identify commonalities among the five feature selectors. All can be processed in the framework using a single representation, and the overall processing can also be greatly reduced by only processing the common aspects of the feature selectors once and propagating these aspects across all five feature selectors as necessary. This allows the best feature selector and the actual features to select to be identified for large and high-dimensional data sets through exploiting the efficiency and flexibility of embedding the binary associative-memory neural network in Hadoop. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Emergence of low noise frustrated states in E/I balanced neural networks.

    PubMed

    Recio, I; Torres, J J

    2016-12-01

    We study emerging phenomena in binary neural networks where, with probability c, synaptic intensities are chosen according to a Hebbian prescription, and with probability (1-c) there is an extra random contribution to the synaptic weights. This new term, randomly drawn from a bimodal Gaussian distribution, balances the synaptic population in the network to an 80%-20% E/I population ratio, mimicking the balance observed in the mammalian cortex. For some regions of the relevant parameters, our system exhibits standard memory attractors (at low temperature) and non-memory attractors (at high temperature). However, as c decreases and the level of the underlying noise also decreases below a certain temperature T_t, a kind of memory-frustrated state, which resembles spin-glass behavior, sharply emerges. Contrary to what occurs in Hopfield-like neural networks, the frustrated state appears here even in the limit of the loading parameter α→0. Moreover, we observed that the frustrated state in fact corresponds to two states of non-vanishing activity uncorrelated with stored memories, associated, respectively, with a high-activity or Up state and a low-activity or Down state. Using a linear stability analysis, we found regions in the space of relevant parameters with locally stable steady states and demonstrated that frustrated states coexist with memory attractors below T_t. Multistability between memory and frustrated states is thus present for relatively small c, and metastability of memory attractors can emerge as c decreases even more. We studied our system using standard mean-field techniques and Monte Carlo simulations, obtaining perfect agreement between theory and simulations. Our study can be useful for explaining the role of synapse heterogeneity in the emergence of stable Up and Down states not associated with memory attractors, and for exploring the conditions that induce transitions among them, as in sleep-wake transitions. Copyright © 2016 Elsevier Ltd. All rights reserved.
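
    One way to read the synaptic prescription above is as a per-synapse mixture of a Hebbian term and a balanced bimodal Gaussian term. The sketch below follows that reading; the mixture interpretation, mode means, and variances are illustrative assumptions, not the authors' exact parameters:

    ```python
    import random

    random.seed(1)
    N, P, c = 200, 3, 0.8   # neurons, stored patterns, Hebbian fraction (illustrative)

    # Binary (+/-1) memory patterns for the Hebbian term
    patterns = [[random.choice([-1, 1]) for _ in range(N)] for _ in range(P)]

    def bimodal_gauss():
        """Bimodal Gaussian: an excitatory mode (80%) and an inhibitory mode (20%),
        with means chosen so the population is balanced: 0.8*1.0 + 0.2*(-4.0) = 0."""
        if random.random() < 0.8:
            return random.gauss(1.0, 0.1)
        return random.gauss(-4.0, 0.1)

    w = [[0.0] * N for _ in range(N)]
    for i in range(N):
        for j in range(N):
            if i == j:
                continue
            if random.random() < c:
                # Hebbian prescription over the stored patterns
                w[i][j] = sum(p[i] * p[j] for p in patterns) / N
            else:
                # extra random contribution from the balanced bimodal distribution
                w[i][j] = bimodal_gauss()

    n_syn = N * (N - 1)
    mean_w = sum(w[i][j] for i in range(N) for j in range(N)) / n_syn
    ```

    The mean synaptic weight stays near zero, reflecting the E/I balance that the bimodal term is constructed to enforce.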

  15. Readout from iconic memory and selective spatial attention involve similar neural processes.

    PubMed

    Ruff, Christian C; Kristjánsson, Arni; Driver, Jon

    2007-10-01

    Iconic memory and spatial attention are often considered separately, but they may have functional similarities. Here we provide functional magnetic resonance imaging evidence for some common underlying neural effects. Subjects judged three visual stimuli in one hemifield of a bilateral array comprising six stimuli. The relevant hemifield for partial report was indicated by an auditory cue, administered either before the visual array (precue, spatial attention) or shortly after the array (postcue, iconic memory). Pre- and postcues led to similar activity modulations in lateral occipital cortex contralateral to the cued side. This finding indicates that readout from iconic memory can have some neural effects similar to those of spatial attention. We also found common bilateral activation of a fronto-parietal network for postcue and precue trials. These neuroimaging data suggest that some common neural mechanisms underlie selective spatial attention and readout from iconic memory. Some differences were also found; compared with precues, postcues led to higher activity in the right middle frontal gyrus.

  16. Readout From Iconic Memory and Selective Spatial Attention Involve Similar Neural Processes

    PubMed Central

    Ruff, Christian C; Kristjánsson, Árni; Driver, Jon

    2007-01-01

    Iconic memory and spatial attention are often considered separately, but they may have functional similarities. Here we provide functional magnetic resonance imaging evidence for some common underlying neural effects. Subjects judged three visual stimuli in one hemifield of a bilateral array comprising six stimuli. The relevant hemifield for partial report was indicated by an auditory cue, administered either before the visual array (precue, spatial attention) or shortly after the array (postcue, iconic memory). Pre- and postcues led to similar activity modulations in lateral occipital cortex contralateral to the cued side. This finding indicates that readout from iconic memory can have some neural effects similar to those of spatial attention. We also found common bilateral activation of a fronto-parietal network for postcue and precue trials. These neuroimaging data suggest that some common neural mechanisms underlie selective spatial attention and readout from iconic memory. Some differences were also found; compared with precues, postcues led to higher activity in the right middle frontal gyrus. PMID:17894608

  17. New Passivity Criteria for Fuzzy BAM Neural Networks with Markovian Jumping Parameters and Time-Varying Delays

    NASA Astrophysics Data System (ADS)

    Vadivel, P.; Sakthivel, R.; Mathiyalagan, K.; Thangaraj, P.

    2013-02-01

    This paper addresses the passivity analysis problem for a class of fuzzy bidirectional associative memory (BAM) neural networks with Markovian jumping parameters and time-varying delays. A set of sufficient conditions for the passivity of the considered fuzzy BAM neural network model is derived in terms of linear matrix inequalities (LMIs) by using the delay-fractioning technique together with the Lyapunov function approach. In addition, uncertainties are inevitable in neural networks because of modeling errors and external disturbances. The result is therefore extended to robust passivity criteria for uncertain fuzzy BAM neural networks with time-varying delays. These criteria are expressed as LMIs, which can be efficiently solved via standard numerical software. Two numerical examples are provided to demonstrate the effectiveness of the obtained results.

  18. The role of trauma-related distractors on neural systems for working memory and emotion processing in posttraumatic stress disorder

    PubMed Central

    Morey, Rajendra A.; Dolcos, Florin; Petty, Christopher M.; Cooper, Debra A.; Hayes, Jasmeet Pannu; LaBar, Kevin S.; McCarthy, Gregory

    2009-01-01

    The relevance of emotional stimuli to threat and survival confers a privileged role in their processing. In PTSD, the ability of trauma-related information to divert attention is especially pronounced. Information unrelated to the trauma may also be highly distracting when it shares perceptual features with trauma material. Our goal was to study how trauma-related environmental cues modulate working memory networks in PTSD. We examined neural activity in participants performing a visual working memory task while distracted by task-irrelevant trauma and non-trauma material. Recent post-9/11 veterans were divided into a PTSD group (n = 22) and a trauma-exposed control group (n = 20) based on the Davidson trauma scale. Using fMRI, we measured hemodynamic change in response to emotional (trauma-related) and neutral distraction presented during the active maintenance period of a delayed-response working memory task. The goal was to examine differences in functional networks associated with working memory (dorsolateral prefrontal cortex and lateral parietal cortex) and emotion processing (amygdala, ventrolateral prefrontal cortex, and fusiform gyrus). The PTSD group showed markedly different neural activity compared to the trauma-exposed control group in response to task-irrelevant visual distractors. Enhanced activity in ventral emotion processing regions was associated with trauma distractors in the PTSD group, whereas activity in brain regions associated with working memory and attention regions was disrupted by distractor stimuli independent of trauma content. Neural evidence for the impact of distraction on working memory is consistent with PTSD symptoms of hypervigilance and general distractibility during goal-directed cognitive processing. PMID:19091328

  19. Load matters: neural correlates of verbal working memory in children with autism spectrum disorder.

    PubMed

    Vogan, Vanessa M; Francis, Kaitlyn E; Morgan, Benjamin R; Smith, Mary Lou; Taylor, Margot J

    2018-06-01

    Autism spectrum disorder (ASD) is a pervasive neurodevelopmental disorder characterised by diminished social reciprocity and communication skills and the presence of stereotyped and restricted behaviours. Executive functioning deficits, such as working memory, are associated with core ASD symptoms. Working memory allows for temporary storage and manipulation of information and relies heavily on frontal-parietal networks of the brain. There are few reports on the neural correlates of working memory in youth with ASD. The current study identified the neural systems underlying verbal working memory capacity in youth with and without ASD using functional magnetic resonance imaging (fMRI). Fifty-seven youth, 27 with ASD and 30 sex- and age-matched typically developing (TD) controls (9-16 years), completed a one-back letter matching task (LMT) with four levels of difficulty (i.e., cognitive load) while fMRI data were recorded. Linear trend analyses were conducted to examine brain regions that were recruited as a function of increasing cognitive load. We found similar behavioural performance on the LMT in terms of reaction times, but in the two higher load conditions the ASD youth had lower accuracy than the TD group. Neural patterns of activation differed significantly between the TD and ASD groups. In TD youth, areas classically used for working memory, including lateral and medial frontal as well as superior parietal brain regions, increased in activation with increasing task difficulty, while areas related to the default mode network (DMN) showed decreasing activation (i.e., deactivation). The youth with ASD did not appear to use this opposing cognitive processing system; they showed little recruitment of frontal and parietal regions across load levels but did show similar modulation of the DMN. 
In a working memory task, where the load was manipulated without changing executive demands, TD youth showed increasing recruitment with increasing load of the classic fronto-parietal brain areas and decreasing involvement in default mode regions. In contrast, although they modulated the default mode network, youth with ASD did not show the modulation of increasing brain activation with increasing load, suggesting that they may be unable to manage increasing verbal information. Impaired verbal working memory in ASD would interfere with the youths' success academically and socially. Thus, determining the nature of atypical neural processing could help establish or monitor working memory interventions for ASD.

  20. Deep neural network for traffic sign recognition systems: An analysis of spatial transformers and stochastic optimisation methods.

    PubMed

    Arcos-García, Álvaro; Álvarez-García, Juan A; Soria-Morillo, Luis M

    2018-03-01

    This paper presents a Deep Learning approach for traffic sign recognition systems. Several classification experiments are conducted over publicly available traffic sign datasets from Germany and Belgium using a Deep Neural Network which comprises Convolutional layers and Spatial Transformer Networks. Such trials are built to measure the impact of diverse factors with the end goal of designing a Convolutional Neural Network that can improve the state-of-the-art of traffic sign classification task. First, different adaptive and non-adaptive stochastic gradient descent optimisation algorithms such as SGD, SGD-Nesterov, RMSprop and Adam are evaluated. Subsequently, multiple combinations of Spatial Transformer Networks placed at distinct positions within the main neural network are analysed. The recognition rate of the proposed Convolutional Neural Network reports an accuracy of 99.71% in the German Traffic Sign Recognition Benchmark, outperforming previous state-of-the-art methods and also being more efficient in terms of memory requirements. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Recurrent Network models of sequence generation and memory

    PubMed Central

    Rajan, Kanaka; Harvey, Christopher D; Tank, David W

    2016-01-01

    Sequential activation of neurons is a common feature of network activity during a variety of behaviors, including working memory and decision making. Previous network models for sequences and memory emphasized specialized architectures in which a principled mechanism is pre-wired into their connectivity. Here, we demonstrate that starting from random connectivity and modifying a small fraction of connections, a largely disordered recurrent network can produce sequences and implement working memory efficiently. We use this process, called Partial In-Network training (PINning), to model and match cellular-resolution imaging data from the posterior parietal cortex during a virtual memory-guided two-alternative forced-choice task [Harvey, Coen and Tank, 2012]. Analysis of the connectivity reveals that sequences propagate through the cooperation between recurrent synaptic interactions and external inputs, rather than through feedforward or asymmetric connections. Together, our results suggest that neural sequences may emerge through learning from largely unstructured network architectures. PMID:26971945

  2. Global Hopf bifurcation analysis on a BAM neural network with delays

    NASA Astrophysics Data System (ADS)

    Sun, Chengjun; Han, Maoan; Pang, Xiaoming

    2007-01-01

    A delayed differential equation that models a bidirectional associative memory (BAM) neural network with four neurons is considered. By using a global Hopf bifurcation theorem for FDEs and Bendixson's criterion for high-dimensional ODEs, a group of sufficient conditions for the system to have multiple periodic solutions is obtained when the sum of the delays is sufficiently large.

  3. Applying Neural Networks in Optical Communication Systems: Possible Pitfalls

    NASA Astrophysics Data System (ADS)

    Eriksson, Tobias A.; Bulow, Henning; Leven, Andreas

    2017-12-01

    We investigate the risk of overestimating the performance gain when applying neural network based receivers in systems with pseudo random bit sequences or with limited memory depths, resulting in repeated short patterns. We show that with such sequences, a large artificial gain can be obtained which comes from pattern prediction rather than predicting or compensating the studied channel/phenomena.
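
    The pitfall described above can be reproduced without any neural network at all. A toy lookup-table "predictor" (pattern length, context size, and the train/test split are illustrative assumptions) shows the same artificial gain on a repeated short sequence:

    ```python
    import random

    random.seed(2)

    # A short random pattern repeated many times stands in for a short-memory,
    # PRBS-driven test sequence.
    pattern = [random.randint(0, 1) for _ in range(31)]
    seq = pattern * 40

    k = 12                      # predictor context length
    train, test = seq[:800], seq[800:]

    # "Training" a lookup-table predictor: memorize the bit that follows each
    # k-bit context. A sufficiently flexible network can do the same implicitly.
    table = {}
    for t in range(k, len(train)):
        table[tuple(train[t - k:t])] = train[t]

    # On held-out data from the same repeated sequence, the memorized contexts
    # yield near-perfect "prediction" -- a gain from pattern prediction,
    # not from modeling any channel or phenomenon.
    hits = sum(table.get(tuple(test[t - k:t])) == test[t]
               for t in range(k, len(test)))
    accuracy = hits / (len(test) - k)
    ```

    The high accuracy here says nothing about the channel; it only reflects that the sequence repeats, which is exactly the overestimation risk the abstract warns about.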

  4. Hopfield neural network and optical fiber sensor as intelligent heart rate monitor

    NASA Astrophysics Data System (ADS)

    Mutter, Kussay Nugamesh

    2018-01-01

    This paper presents the design and fabrication of an intelligent fiber-optic sensor for examining and monitoring heart-rate activity. The use of fiber sensors for heart-rate sensing is widely studied in the literature; smart sensors based on Hopfield neural networks, however, have received little attention. In this work, the sensor consists of three fibers without cladding, about 1 cm long, fed by laser light at a wavelength of 1550 nm. The sensing portions are mounted with a micro-sensitive diaphragm to transfer the pulse pressure at the left radial wrist. The modulated light intensity is detected by three photodetectors whose outputs feed a Hopfield neural network, a single-layer auto-associative memory with identical input and output layers. Training weights for standard recorded normal heart-rate signals are stored in the network memory beforehand. The sensor heads work on a reflection-intensity basis. The novelty here is that the sensor combines pulse pressure and a Hopfield neural network in an integrated approach. The results showed reliable heart-rate measurement and counting with an acceptable error rate.

  5. F77NNS - A FORTRAN-77 NEURAL NETWORK SIMULATOR

    NASA Technical Reports Server (NTRS)

    Mitchell, P. H.

    1994-01-01

    F77NNS (A FORTRAN-77 Neural Network Simulator) simulates the popular back-error-propagation neural network. F77NNS is an ANSI-77 FORTRAN program designed to take advantage of vectorization when run on machines having this capability, but it will run on any computer with an ANSI-77 FORTRAN compiler. Artificial neural networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to biological nerve cells. Problems which involve pattern matching or system modeling readily fit the class of problems which F77NNS is designed to solve. The program trains a neural network using Rumelhart's back-propagation algorithm. Typically the nodes of a network are grouped together into clumps called layers. A network will generally have an input layer through which the various environmental stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to features of the problem being solved. Other layers, which form intermediate steps between the input and output layers, are called hidden layers. The back-propagation training algorithm can require massive computational resources to implement a large network, such as a network capable of learning text-to-phoneme pronunciation rules as in the famous Sejnowski experiment. The Sejnowski neural network learns to pronounce 1000 common English words. The standard input data defines the specific inputs that control the type of run to be made, and input files define the NN in terms of the layers and nodes, as well as the input/output (I/O) pairs. The program has a restart capability so that a neural network can be solved in stages suitable to the user's resources and desires. F77NNS allows the user to customize the patterns of connections between layers of a network. 
The size of the neural network to be solved is limited only by the amount of random access memory (RAM) available to the user. The program has a memory requirement of about 900K. The standard distribution medium for this package is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. F77NNS was developed in 1989.
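
    For a tiny network, Rumelhart's back-propagation rule mentioned above reduces to a few lines. This sketch (the 2-3-1 network size, learning rate, and XOR task are illustrative choices, unrelated to F77NNS's own input format) trains a sigmoid network with online per-pattern updates:

    ```python
    import math
    import random

    random.seed(3)

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    # 2 inputs -> 3 hidden sigmoid units -> 1 sigmoid output.
    # Each weight row carries a bias as its last entry.
    W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(3)]  # hidden layer
    W2 = [random.uniform(-1, 1) for _ in range(4)]                      # output layer
    XOR = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    LR = 0.5

    def forward(x):
        h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in W1]
        y = sigmoid(sum(W2[i] * h[i] for i in range(3)) + W2[3])
        return h, y

    def mse():
        return sum((forward(x)[1] - t) ** 2 for x, t in XOR) / len(XOR)

    error_before = mse()
    for _ in range(10000):                 # epochs of online updates
        for x, t in XOR:
            h, y = forward(x)
            dy = (y - t) * y * (1 - y)     # output delta (sigmoid derivative)
            for i in range(3):             # hidden deltas use the pre-update W2
                dh = dy * W2[i] * h[i] * (1 - h[i])
                W1[i][0] -= LR * dh * x[0]
                W1[i][1] -= LR * dh * x[1]
                W1[i][2] -= LR * dh       # bias
            for i in range(3):
                W2[i] -= LR * dy * h[i]
            W2[3] -= LR * dy               # bias
    error_after = mse()
    ```

    The hidden-layer deltas are computed from the output delta propagated back through the output weights, which is the defining step of the algorithm.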

  6. Orthogonal patterns in binary neural networks

    NASA Technical Reports Server (NTRS)

    Baram, Yoram

    1988-01-01

    A binary neural network that stores only mutually orthogonal patterns is shown to converge, when probed by any pattern, to a pattern in the memory space, i.e., the space spanned by the stored patterns. The latter are shown to be the only members of the memory space under a certain coding condition, which allows maximum storage of M = (2N)^0.5 patterns, where N is the number of neurons. The stored patterns are shown to have basins of attraction of radius N/(2M), within which errors are corrected with probability 1 in a single update cycle. When the probe falls outside these regions, the error correction probability can still be increased to 1 by repeatedly running the network with the same probe.
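
    The orthogonality argument can be checked directly: with mutually orthogonal ±1 patterns, the Hebbian outer-product matrix maps each stored pattern p to N·p, so every stored pattern is a fixed point of the update. A small sketch using a Sylvester-Hadamard construction (the sizes are illustrative; M is kept within the stated (2N)^0.5 bound):

    ```python
    def hadamard(n):
        """Sylvester construction: the rows of H are mutually orthogonal
        +/-1 vectors (n must be a power of two)."""
        H = [[1]]
        while len(H) < n:
            H = [r + r for r in H] + [r + [-v for v in r] for r in H]
        return H

    N, M = 16, 4                     # M = 4 <= (2N)**0.5 ~ 5.66
    patterns = hadamard(N)[1:M + 1]  # M mutually orthogonal stored patterns

    # Hebbian outer-product weight matrix
    W = [[sum(p[i] * p[j] for p in patterns) for j in range(N)] for i in range(N)]

    def update(state):
        """One synchronous update of the binary (+/-1) network."""
        return [1 if sum(W[i][j] * state[j] for j in range(N)) >= 0 else -1
                for i in range(N)]

    # For orthogonal patterns, W p = N p, so each stored pattern is a fixed point.
    fixed = all(update(p) == p for p in patterns)
    ```

    Cross-talk terms vanish exactly because distinct stored patterns have zero inner product, which is what gives the single-cycle error correction inside the basins.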

  7. How emotion leads to selective memory: neuroimaging evidence.

    PubMed

    Waring, Jill D; Kensinger, Elizabeth A

    2011-06-01

    Often memory for emotionally arousing items is enhanced relative to neutral items within complex visual scenes, but this enhancement can come at the expense of memory for peripheral background information. This 'trade-off' effect has been elicited by a range of stimulus valence and arousal levels, yet the magnitude of the effect has been shown to vary with these factors. Using fMRI, this study investigated the neural mechanisms underlying this selective memory for emotional scenes. Further, we examined how these processes are affected by the stimulus dimensions of arousal and valence. The trade-off effect in memory occurred for low- to high-arousal positive and negative scenes. There was a core emotional memory network associated with the trade-off among all the emotional scene types; however, there were additional regions that were uniquely associated with the trade-off for each individual scene type. These results suggest that there is a common network of regions associated with the emotional memory trade-off effect, but that valence and arousal also independently affect the neural activity underlying the effect. Copyright © 2011 Elsevier Ltd. All rights reserved.

  8. How emotion leads to selective memory: Neuroimaging evidence

    PubMed Central

    Waring, Jill D.; Kensinger, Elizabeth A.

    2011-01-01

    Often memory for emotionally arousing items is enhanced relative to neutral items within complex visual scenes, but this enhancement can come at the expense of memory for peripheral background information. This ‘trade-off’ effect has been elicited by a range of stimulus valence and arousal levels, yet the magnitude of the effect has been shown to vary with these factors. Using fMRI, this study investigated the neural mechanisms underlying this selective memory for emotional scenes. Further, we examined how these processes are affected by the stimulus dimensions of arousal and valence. The trade-off effect in memory occurred for low- to high-arousal positive and negative scenes. There was a core emotional memory network associated with the trade-off among all the emotional scene types; however, there were additional regions that were uniquely associated with the trade-off for each individual scene type. These results suggest that there is a common network of regions associated with the emotional memory trade-off effect, but that valence and arousal also independently affect the neural activity underlying the effect. PMID:21414333

  9. A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks

    PubMed Central

    Alemi, Alireza; Baldassi, Carlo; Brunel, Nicolas; Zecchina, Riccardo

    2015-01-01

    Understanding the theoretical foundations of how memories are encoded and retrieved in neural populations is a central challenge in neuroscience. A popular theoretical scenario for modeling memory function is the attractor neural network scenario, whose prototype is the Hopfield model. The model simplicity and the locality of the synaptic update rules come at the cost of a poor storage capacity, compared with the capacity achieved with perceptron learning algorithms. Here, by transforming the perceptron learning rule, we present an online learning rule for a recurrent neural network that achieves near-maximal storage capacity without an explicit supervisory error signal, relying only upon locally accessible information. The fully-connected network consists of excitatory binary neurons with plastic recurrent connections and non-plastic inhibitory feedback stabilizing the network dynamics; the memory patterns to be memorized are presented online as strong afferent currents, producing a bimodal distribution for the neuron synaptic inputs. Synapses corresponding to active inputs are modified as a function of the value of the local fields with respect to three thresholds. Above the highest threshold, and below the lowest threshold, no plasticity occurs. In between these two thresholds, potentiation/depression occurs when the local field is above/below an intermediate threshold. We simulated and analyzed a network of binary neurons implementing this rule and measured its storage capacity for different sizes of the basins of attraction. The storage capacity obtained through numerical simulations is shown to be close to the value predicted by analytical calculations. We also measured the dependence of capacity on the strength of external inputs. 
Finally, we quantified the statistics of the resulting synaptic connectivity matrix, and found that both the fraction of zero weight synapses and the degree of symmetry of the weight matrix increase with the number of stored patterns. PMID:26291608
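
    The three-threshold plasticity rule described above can be written down directly. In this schematic reading, a synapse with an active input is unchanged when its local field lies outside the band, and is potentiated or depressed depending on which side of the intermediate threshold the field falls; the threshold values, step size, and weight bounds below are illustrative assumptions:

    ```python
    def three_threshold_update(w, field, th_low, th_mid, th_high, dw, w_max=1.0):
        """One plasticity step for a synapse with an active presynaptic input.

        No plasticity when the local field is above th_high or below th_low;
        within the band, potentiation when the field exceeds th_mid,
        depression when it does not.
        """
        if field > th_high or field < th_low:
            return w                          # field far from threshold: no change
        if field >= th_mid:
            return min(w + dw, w_max)         # potentiation
        return max(w - dw, 0.0)               # depression (weights kept non-negative)

    examples = [
        three_threshold_update(0.5, 1.5, 0.0, 0.3, 1.0, 0.1),  # above band: unchanged
        three_threshold_update(0.5, 0.6, 0.0, 0.3, 1.0, 0.1),  # in band, above mid: up
        three_threshold_update(0.5, 0.2, 0.0, 0.3, 1.0, 0.1),  # in band, below mid: down
    ]
    ```

    The rule is local (it needs only the synapse's own weight and the postsynaptic local field), which is the property the abstract emphasizes over supervised perceptron learning.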

  10. A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks.

    PubMed

    Alemi, Alireza; Baldassi, Carlo; Brunel, Nicolas; Zecchina, Riccardo

    2015-08-01

    Understanding the theoretical foundations of how memories are encoded and retrieved in neural populations is a central challenge in neuroscience. A popular theoretical scenario for modeling memory function is the attractor neural network scenario, whose prototype is the Hopfield model. The model simplicity and the locality of the synaptic update rules come at the cost of a poor storage capacity, compared with the capacity achieved with perceptron learning algorithms. Here, by transforming the perceptron learning rule, we present an online learning rule for a recurrent neural network that achieves near-maximal storage capacity without an explicit supervisory error signal, relying only upon locally accessible information. The fully-connected network consists of excitatory binary neurons with plastic recurrent connections and non-plastic inhibitory feedback stabilizing the network dynamics; the memory patterns to be memorized are presented online as strong afferent currents, producing a bimodal distribution for the neuron synaptic inputs. Synapses corresponding to active inputs are modified as a function of the value of the local fields with respect to three thresholds. Above the highest threshold, and below the lowest threshold, no plasticity occurs. In between these two thresholds, potentiation/depression occurs when the local field is above/below an intermediate threshold. We simulated and analyzed a network of binary neurons implementing this rule and measured its storage capacity for different sizes of the basins of attraction. The storage capacity obtained through numerical simulations is shown to be close to the value predicted by analytical calculations. We also measured the dependence of capacity on the strength of external inputs. 
Finally, we quantified the statistics of the resulting synaptic connectivity matrix, and found that both the fraction of zero weight synapses and the degree of symmetry of the weight matrix increase with the number of stored patterns.

  11. Memory consolidation from seconds to weeks: a three-stage neural network model with autonomous reinstatement dynamics

    PubMed Central

    Fiebig, Florian; Lansner, Anders

    2014-01-01

    Declarative long-term memories are not created in an instant. Gradual stabilization and temporally shifting dependence of acquired declarative memories in different brain regions—called systems consolidation—can be tracked in time by lesion experiments. The observation of temporally graded retrograde amnesia (RA) following hippocampal lesions points to a gradual transfer of memory from hippocampus to neocortical long-term memory. Spontaneous reactivations of hippocampal memories, as observed in place cell reactivations during slow-wave-sleep, are supposed to drive neocortical reinstatements and facilitate this process. We propose a functional neural network implementation of these ideas and furthermore suggest an extended three-stage framework that includes the prefrontal cortex (PFC). It bridges the temporal chasm between working memory percepts on the scale of seconds and consolidated long-term memory on the scale of weeks or months. We show that our three-stage model can autonomously produce the necessary stochastic reactivation dynamics for successful episodic memory consolidation. The resulting learning system is shown to exhibit classical memory effects seen in experimental studies, such as retrograde and anterograde amnesia (AA) after simulated hippocampal lesioning; furthermore the model reproduces peculiar biological findings on memory modulation, such as retrograde facilitation of memory after suppressed acquisition of new long-term memories—similar to the effects of benzodiazepines on memory. PMID:25071536

  12. Data-driven forecasting of high-dimensional chaotic systems with long short-term memory networks.

    PubMed

    Vlachas, Pantelis R; Byeon, Wonmin; Wan, Zhong Y; Sapsis, Themistoklis P; Koumoutsakos, Petros

    2018-05-01

    We introduce a data-driven forecasting method for high-dimensional chaotic systems using long short-term memory (LSTM) recurrent neural networks. The proposed LSTM neural networks perform inference of high-dimensional dynamical systems in their reduced order space and are shown to be an effective set of nonlinear approximators of their attractor. We demonstrate the forecasting performance of the LSTM and compare it with Gaussian processes (GPs) in time series obtained from the Lorenz 96 system, the Kuramoto-Sivashinsky equation and a prototype climate model. The LSTM networks outperform the GPs in short-term forecasting accuracy in all applications considered. A hybrid architecture, extending the LSTM with a mean stochastic model (MSM-LSTM), is proposed to ensure convergence to the invariant measure. This novel hybrid method is fully data-driven and extends the forecasting capabilities of LSTM networks.
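The closed-loop forecasting scheme described here (inference in a reduced-order space, with each prediction fed back as the next input) can be sketched as follows. This is a hedged illustration using an untrained, randomly initialized LSTM cell; in the actual method the weights would be trained on data and the inputs would come from a reduced-order projection of the system.

```python
import numpy as np

rng = np.random.default_rng(1)

class LSTMCell:
    """Minimal NumPy LSTM cell; the weights are random placeholders, not trained."""
    def __init__(self, n_in, n_hid):
        s = 1.0 / np.sqrt(n_hid)
        self.Wx = rng.uniform(-s, s, (4 * n_hid, n_in))
        self.Wh = rng.uniform(-s, s, (4 * n_hid, n_hid))
        self.b = np.zeros(4 * n_hid)
        self.n_hid = n_hid

    def step(self, x, h, c):
        z = self.Wx @ x + self.Wh @ h + self.b
        i, f, o, g = np.split(z, 4)            # input, forget, output gates + candidate
        sig = lambda v: 1.0 / (1.0 + np.exp(-v))
        c_new = sig(f) * c + sig(i) * np.tanh(g)   # gated cell-state update
        h_new = sig(o) * np.tanh(c_new)
        return h_new, c_new

def forecast(cell, W_out, history, n_steps):
    """Warm up on the observed trajectory, then feed predictions back in."""
    h = np.zeros(cell.n_hid)
    c = np.zeros(cell.n_hid)
    for x in history:
        h, c = cell.step(x, h, c)
    preds = []
    for _ in range(n_steps):
        x = W_out @ h                 # read out the predicted next state
        preds.append(x)
        h, c = cell.step(x, h, c)     # closed loop: prediction becomes input
    return np.array(preds)

D, H = 8, 16                          # reduced-order dimension, hidden units
cell = LSTMCell(D, H)
W_out = rng.uniform(-0.5, 0.5, (D, H))
history = rng.standard_normal((20, D))   # stand-in for a reduced-order trajectory
preds = forecast(cell, W_out, history, n_steps=5)
```

The warm-up pass lets the recurrent state absorb the recent history before iterated prediction begins, which is the usual way such networks are deployed for multi-step forecasting.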

  13. Spike frequency adaptation is a possible mechanism for control of attractor preference in auto-associative neural networks

    NASA Astrophysics Data System (ADS)

    Roach, James; Sander, Leonard; Zochowski, Michal

    Auto-associative memory is the ability to retrieve a pattern from a small fraction of the pattern and is an important function of neural networks. Within this context, memories that are stored within the synaptic strengths of networks act as dynamical attractors for network firing patterns. In networks with many encoded memories, some attractors will be stronger than others. This presents the problem of how networks switch between attractors depending on the situation. We suggest that regulation of neuronal spike-frequency adaptation (SFA) provides a universal mechanism for network-wide attractor selectivity. Here we demonstrate in a Hopfield type attractor network that neurons with minimal SFA will reliably activate in the pattern corresponding to a local attractor and that a moderate increase in SFA leads the network to converge to the strongest attractor state. Furthermore, we show that on long time scales SFA allows for temporal sequences of activation to emerge. Finally, using a model of cholinergic modulation within the cortex we argue that dynamic regulation of attractor preference by SFA could be critical for the role of acetylcholine in attention or for arousal states in general. This work was supported by: NSF Graduate Research Fellowship Program under Grant No. DGE 1256260 (JPR), NSF CMMI 1029388 (MRZ) and NSF PoLS 1058034 (MRZ & LMS).
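A minimal sketch of the ingredients above, assuming a standard Hopfield-style network of sign units with memories stored at different strengths, plus an invented per-neuron adaptation variable that accumulates with sustained activity (all constants are illustrative, not the authors' model):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200

# Two Hebbian memories; the second is encoded more strongly (a "stronger" attractor).
xi = np.sign(rng.standard_normal((2, N)))
strengths = np.array([1.0, 1.6])
W = sum(a * np.outer(p, p) for a, p in zip(strengths, xi)) / N
np.fill_diagonal(W, 0.0)

def run(W, s0, g_sfa, steps=60, tau=10.0):
    """Sign-unit dynamics with a slow adaptation variable per neuron.

    g_sfa scales spike-frequency adaptation: sustained activity builds a
    hyperpolarizing current -g_sfa * a that can destabilize the occupied attractor.
    """
    s, a = s0.copy(), np.zeros(len(s0))
    for _ in range(steps):
        a += (0.5 * (s + 1.0) - a) / tau       # a tracks recent activity (s = +1)
        s = np.sign(W @ s - g_sfa * a + 1e-9)  # tie-break away from exact zero
    return s

overlap = lambda s, p: abs(s @ p) / len(s)

# Start in the basin of the weaker memory.
s0 = np.where(rng.random(N) < 0.9, xi[0], -xi[0])
s_no_sfa = run(W, s0, g_sfa=0.0)   # stays in the nearby (weaker) attractor
s_sfa = run(W, s0, g_sfa=1.5)      # adaptation pushes the state out of it
```

With `g_sfa = 0` the network simply completes the nearby pattern; with adaptation switched on, persistently active neurons accumulate an opposing current, so the occupied attractor is eventually destabilized, which is the qualitative effect the abstract attributes to SFA.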

  14. Neural networks engaged in short-term memory rehearsal are disrupted by irrelevant speech in human subjects.

    PubMed

    Kopp, Franziska; Schröger, Erich; Lipka, Sigrid

    2004-01-02

    Rehearsal mechanisms in human short-term memory are increasingly understood in the light of both behavioural and neuroanatomical findings. However, little is known about the cooperation of participating brain structures and how such cooperations are affected when memory performance is disrupted. In this paper we use EEG coherence as a measure of synchronization to investigate rehearsal processes and their disruption by irrelevant speech in a delayed serial recall paradigm. Fronto-central and fronto-parietal theta (4-7.5 Hz), beta (13-20 Hz), and gamma (35-47 Hz) synchronizations are shown to be involved in our short-term memory task. Moreover, the impairment in serial recall due to irrelevant speech was preceded by a reduction of gamma band coherence. Results suggest that the irrelevant speech effect has its neural basis in the disruption of left-lateralized fronto-central networks. This stresses the importance of gamma band activity for short-term memory operations.

  15. Synaptic plasticity and memory functions achieved in a WO3-x-based nanoionics device by using the principle of atomic switch operation

    NASA Astrophysics Data System (ADS)

    Yang, Rui; Terabe, Kazuya; Yao, Yiping; Tsuruoka, Tohru; Hasegawa, Tsuyoshi; Gimzewski, James K.; Aono, Masakazu

    2013-09-01

    A compact neuromorphic nanodevice with inherent learning and memory properties emulating those of biological synapses is the key to developing artificial neural networks rivaling their biological counterparts. Experimental results showed that memorization with a wide time scale from volatile to permanent can be achieved in a WO3-x-based nanoionics device and can be precisely and cumulatively controlled by adjusting the device’s resistance state and input pulse parameters such as the amplitude, interval, and number. This control is analogous to biological synaptic plasticity including short-term plasticity, long-term potentiation, transition from short-term memory to long-term memory, forgetting processes for short- and long-term memory, learning speed, and learning history. A compact WO3-x-based nanoionics device with a simple stacked layer structure should thus be a promising candidate for use as an inorganic synapse in artificial neural networks due to its striking resemblance to the biological synapse.

  16. A reservoir of time constants for memory traces in cortical neurons

    PubMed Central

    Bernacchia, Alberto; Seo, Hyojung; Lee, Daeyeol; Wang, Xiao-Jing

    2011-01-01

    According to reinforcement learning theory of decision making, reward expectation is computed by integrating past rewards with a fixed timescale. By contrast, we found that a wide range of time constants is available across cortical neurons recorded from monkeys performing a competitive game task. By recognizing that reward modulates neural activity multiplicatively, we found that one or two time constants of reward memory can be extracted for each neuron in prefrontal, cingulate, and parietal cortex. These timescales ranged from hundreds of milliseconds to tens of seconds, according to a power-law distribution, which is consistent across areas and reproduced by a “reservoir” neural network model. These neuronal memory timescales were weakly but significantly correlated with those of the monkeys' decisions. Our findings suggest a flexible memory system, where neural subpopulations with distinct sets of long or short memory timescales may be selectively deployed according to the task demands. PMID:21317906

  17. The role of the episodic buffer in working memory for language processing.

    PubMed

    Rudner, Mary; Rönnberg, Jerker

    2008-03-01

    A body of work has accumulated to show that the cognitive process of binding information from different mnemonic and sensory sources as well as in different linguistic modalities can be fractionated from general executive functions in working memory both functionally and neurally. This process has been defined in terms of the episodic buffer (Baddeley in Trends Cogn Sci 4(11):417-423, 2000). This paper considers behavioural, neuropsychological and neuroimaging data that elucidate the role of the episodic buffer in language processing. We argue that the episodic buffer seems to be truly multimodal in function and that while formation of unitary multidimensional representations in the episodic buffer seems to engage posterior neural networks, maintenance of such representations is supported by frontal networks. Although the episodic buffer is not necessarily supported by executive processes and seems to be supported by different neural networks, it may operate in tandem with the central executive during effortful language processing. There is also evidence to suggest engagement of the phonological loop during buffer processing. The hippocampus seems to play a role in formation but not maintenance of representations in the episodic buffer of working memory.

  18. Stuck in default mode: inefficient cross-frequency synchronization may lead to age-related short-term memory decline.

    PubMed

    Pinal, Diego; Zurrón, Montserrat; Díaz, Fernando; Sauseng, Paul

    2015-04-01

    Aging-related decline in short-term memory capacity seems to be caused by deficient balancing of task-related and resting state brain network activity; however, the exact neural mechanism underlying this deficit remains elusive. Here, we studied brain oscillatory activity in healthy young and old adults during visual information maintenance in a delayed match-to-sample task. Particular emphasis was on long range phase:amplitude coupling of frontal alpha (8-12 Hz) and posterior fast oscillatory activity (>30 Hz). It is argued that through posterior fast oscillatory activity nesting into the excitatory or the inhibitory phase of frontal alpha wave, long-range networks can be efficiently coupled or decoupled, respectively. On the basis of this mechanism, we show that healthy elderly participants exhibit a lack of synchronization in task-relevant networks while maintaining synchronized regions of the resting state network. This failure to disconnect the resting state network is predictive of aging-related short-term memory decline. These results support the idea of inefficient orchestration of competing brain networks in the aging human brain and identify the neural mechanism responsible for this control breakdown. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Robust image retrieval from noisy inputs using lattice associative memories

    NASA Astrophysics Data System (ADS)

    Urcid, Gonzalo; Nieves-V., José Angel; García-A., Anmi; Valdiviezo-N., Juan Carlos

    2009-02-01

    Lattice associative memories, also known as morphological associative memories, are fully connected feedforward neural networks with no hidden layers, whose computation at each node is carried out with lattice algebra operations. These networks are a relatively recent development in the field of associative memories that has proven to be an alternative way to work with sets of pattern pairs for which the storage and retrieval stages use minimax algebra. Different associative memory models have been proposed to cope with the problem of pattern recall under input degradations, such as occlusions or random noise, where input patterns can be composed of binary or real valued entries. In comparison to these and other artificial neural network memories, lattice algebra based memories display better performance for storage and recall capability; however, the computational techniques devised to achieve that purpose require additional processing or provide partial success when inputs are presented with undetermined noise levels. Robust retrieval capability of an associative memory model is usually expressed by a high percentage of perfect recalls from non-perfect input. The procedure described here uses noise masking defined by simple lattice operations together with appropriate metrics, such as the normalized mean squared error or signal to noise ratio, to boost the recall performance of either the min or max lattice auto-associative memories. Using a single lattice associative memory, illustrative examples are given that demonstrate the enhanced retrieval of correct gray-scale image associations from inputs corrupted with random noise.
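The minimax storage and recall stages mentioned above can be written down directly. Below is a small sketch of the canonical min/max lattice auto-associative memories with invented integer-valued patterns; the paper's noise-masking procedure is not reproduced. Note that robust recall from noisy input holds only when, for each output component, some uncorrupted entry attains the defining extremum.

```python
import numpy as np

# Three illustrative gray-value patterns stored as the columns of X.
X = np.array([[3., 1., 4.],
              [2., 5., 1.],
              [6., 2., 2.]])

# Canonical lattice auto-associative memories:
# W[i, j] = min_k (x_i^k - x_j^k),  M[i, j] = max_k (x_i^k - x_j^k).
D = X[:, None, :] - X[None, :, :]
W, M = D.min(axis=2), D.max(axis=2)

def maxplus(A, x):
    """Max-plus product: (A [+] x)_i = max_j (A_ij + x_j); recall with W."""
    return (A + x[None, :]).max(axis=1)

def minplus(A, x):
    """Min-plus product, the dual recall used with M."""
    return (A + x[None, :]).min(axis=1)

# Both memories recall every stored pattern perfectly from a clean input.
clean_ok = all(
    np.array_equal(maxplus(W, X[:, k]), X[:, k]) and
    np.array_equal(minplus(M, X[:, k]), X[:, k])
    for k in range(X.shape[1])
)

# W tolerates erosive (downward) noise: erode one entry and recall.
noisy = X[:, 0].copy()
noisy[0] -= 2.0                 # (3, 2, 6) -> (1, 2, 6)
recalled = maxplus(W, noisy)    # recovers (3, 2, 6)
```

Perfect recall of all stored patterns from clean input holds with unlimited storage for these memories, which is part of what makes the lattice framework attractive; the hard problem, addressed by the paper's noise-masking step, is recall from inputs with mixed (erosive and dilative) noise.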

  20. Temporal entrainment of cognitive functions: musical mnemonics induce brain plasticity and oscillatory synchrony in neural networks underlying memory.

    PubMed

    Thaut, Michael H; Peterson, David A; McIntosh, Gerald C

    2005-12-01

    In a series of experiments, we have begun to investigate the effect of music as a mnemonic device on learning and memory and the underlying plasticity of oscillatory neural networks. We used verbal learning and memory tests (standardized word lists, AVLT) in conjunction with electroencephalographic analysis to determine differences between verbal learning in either a spoken or musical (verbal materials as song lyrics) modality. In healthy adults, learning in both the spoken and music condition was associated with significant increases in oscillatory synchrony across all frequency bands. A significant difference between the spoken and music condition emerged in the cortical topography of the learning-related synchronization. When using EEG measures as predictors during learning for subsequent successful memory recall, significantly increased coherence (phase-locked synchronization) within and between oscillatory brain networks emerged for music in alpha and gamma bands. In a similar study with multiple sclerosis patients, superior learning and memory were shown in the music condition when word order recall was controlled for and subjects were instructed to sing back the word lists. Also, the music condition was associated with a significant power increase in the low-alpha band in bilateral frontal networks, indicating increased neuronal synchronization. Musical learning may access compensatory pathways for memory functions during compromised PFC functions associated with learning and recall. Music learning may also confer a neurophysiological advantage through the stronger synchronization of the neuronal cell assemblies underlying verbal learning and memory. Collectively our data provide evidence that melodic-rhythmic templates as temporal structures in music may drive internal rhythm formation in recurrent cortical networks involved in learning and memory.

  1. Identification of a Functional Connectome for Long-Term Fear Memory in Mice

    PubMed Central

    Wheeler, Anne L.; Teixeira, Cátia M.; Wang, Afra H.; Xiong, Xuejian; Kovacevic, Natasa; Lerch, Jason P.; McIntosh, Anthony R.; Parkinson, John; Frankland, Paul W.

    2013-01-01

    Long-term memories are thought to depend upon the coordinated activation of a broad network of cortical and subcortical brain regions. However, the distributed nature of this representation has made it challenging to define the neural elements of the memory trace, and lesion and electrophysiological approaches provide only a narrow window into what is appreciated to be a much more global network. Here we used a global mapping approach to identify networks of brain regions activated following recall of long-term fear memories in mice. Analysis of Fos expression across 84 brain regions allowed us to identify regions that were co-active following memory recall. These analyses revealed that the functional organization of long-term fear memories depends on memory age and is altered in mutant mice that exhibit premature forgetting. Most importantly, these analyses indicate that long-term memory recall engages a network that has a distinct thalamic-hippocampal-cortical signature. This network is concurrently integrated and segregated and therefore has small-world properties, and contains hub-like regions in the prefrontal cortex and thalamus that may play privileged roles in memory expression. PMID:23300432

  2. Auditory short-term memory in the primate auditory cortex

    PubMed Central

    Scott, Brian H.; Mishkin, Mortimer

    2015-01-01

    Sounds are fleeting, and assembling the sequence of inputs at the ear into a coherent percept requires auditory memory across various time scales. Auditory short-term memory comprises at least two components: an active ‘working memory’ bolstered by rehearsal, and a sensory trace that may be passively retained. Working memory relies on representations recalled from long-term memory, and their rehearsal may require phonological mechanisms unique to humans. The sensory component, passive short-term memory (pSTM), is tractable to study in nonhuman primates, whose brain architecture and behavioral repertoire are comparable to our own. This review discusses recent advances in the behavioral and neurophysiological study of auditory memory with a focus on single-unit recordings from macaque monkeys performing delayed-match-to-sample (DMS) tasks. Monkeys appear to employ pSTM to solve these tasks, as evidenced by the impact of interfering stimuli on memory performance. In several regards, pSTM in monkeys resembles pitch memory in humans, and may engage similar neural mechanisms. Neural correlates of DMS performance have been observed throughout the auditory and prefrontal cortex, defining a network of areas supporting auditory STM with parallels to that supporting visual STM. These correlates include persistent neural firing, or a suppression of firing, during the delay period of the memory task, as well as suppression or (less commonly) enhancement of sensory responses when a sound is repeated as a ‘match’ stimulus. Auditory STM is supported by a distributed temporo-frontal network in which sensitivity to stimulus history is an intrinsic feature of auditory processing. PMID:26541581

  3. Wavelet-based higher-order neural networks for mine detection in thermal IR imagery

    NASA Astrophysics Data System (ADS)

    Baertlein, Brian A.; Liao, Wen-Jiao

    2000-08-01

    An image processing technique is described for the detection of mines in IR imagery. The proposed technique is based on a third-order neural network, which processes the output of a wavelet packet transform. The technique is inherently invariant to changes in signature position, rotation and scaling. The well-known memory limitations that arise with higher-order neural networks are addressed by (1) the data compression capabilities of wavelet packets, (2) projections of the image data into a space of similar triangles, and (3) quantization of that 'triangle space'. Using these techniques, image chips of size 28 by 28, which would require O(10^9) neural net weights, are processed by a network having O(10^2) weights. ROC curves are presented for mine detection in real and simulated imagery.

  4. Nonvolatile Array Of Synapses For Neural Network

    NASA Technical Reports Server (NTRS)

    Tawel, Raoul

    1993-01-01

    Elements of array programmed with help of ultraviolet light. A 32 x 32 very-large-scale integrated-circuit array of electronic synapses serves as building-block chip for analog neural-network computer. Synaptic weights stored in nonvolatile manner. Makes information content of array invulnerable to loss of power, and, by eliminating need for circuitry to refresh volatile synaptic memory, makes architecture simpler and more compact.

  5. Signature neural networks: definition and application to multidimensional sorting problems.

    PubMed

    Latorre, Roberto; de Borja Rodriguez, Francisco; Varona, Pablo

    2011-01-01

    In this paper we present a self-organizing neural network paradigm that is able to discriminate information locally using a strategy for information coding and processing inspired by recent findings in living neural systems. The proposed neural network uses: 1) neural signatures to identify each unit in the network; 2) local discrimination of input information during the processing; and 3) a multicoding mechanism for information propagation regarding the who and the what of the information. The local discrimination implies distinct processing as a function of the neural signature recognition and a local transient memory. In the context of artificial neural networks none of these mechanisms has been analyzed in detail, and our goal is to demonstrate that they can be used to efficiently solve some specific problems. To illustrate the proposed paradigm, we apply it to the problem of multidimensional sorting, which can take advantage of the local information discrimination. In particular, we compare the results of this new approach with traditional methods to solve jigsaw puzzles and we analyze the situations where the new paradigm improves the performance.

  6. Memory functions reveal structural properties of gene regulatory networks

    PubMed Central

    Perez-Carrasco, Ruben

    2018-01-01

    Gene regulatory networks (GRNs) control cellular function and decision making during tissue development and homeostasis. Mathematical tools based on dynamical systems theory are often used to model these networks, but the size and complexity of these models mean that their behaviour is not always intuitive and the underlying mechanisms can be difficult to decipher. For this reason, methods that simplify and aid exploration of complex networks are necessary. To this end we develop a broadly applicable form of the Zwanzig-Mori projection. By first converting a thermodynamic state ensemble model of gene regulation into mass action reactions we derive a general method that produces a set of time evolution equations for a subset of components of a network. The influence of the rest of the network, the bulk, is captured by memory functions that describe how the subnetwork reacts to its own past state via components in the bulk. These memory functions provide probes of near-steady state dynamics, revealing information not easily accessible otherwise. We illustrate the method on a simple cross-repressive transcriptional motif to show that memory functions not only simplify the analysis of the subnetwork but also have a natural interpretation. We then apply the approach to a GRN from the vertebrate neural tube, a well characterised developmental transcriptional network composed of four interacting transcription factors. The memory functions reveal the function of specific links within the neural tube network and identify features of the regulatory structure that specifically increase the robustness of the network to initial conditions. Taken together, the study provides evidence that Zwanzig-Mori projections offer powerful and effective tools for simplifying and exploring the behaviour of GRNs. PMID:29470492

  7. Optical computing and neural networks; Proceedings of the Meeting, National Chiao Tung Univ., Hsinchu, Taiwan, Dec. 16, 17, 1992

    NASA Technical Reports Server (NTRS)

    Hsu, Ken-Yuh (Editor); Liu, Hua-Kuang (Editor)

    1992-01-01

    The present conference discusses optical neural networks, photorefractive nonlinear optics, optical pattern recognition, digital and analog processors, and holography and its applications. Attention is given to bifurcating optical information processing, neural structures in digital halftoning, an exemplar-based optical neural net classifier for color pattern recognition, volume storage in photorefractive disks, and microlaser-based compact optical neuroprocessors. Also treated are a feature-enhanced optical interpattern-associative neural network model and its optical implementation, an optical pattern binary dual-rail logic gate module, a theoretical analysis for holographic associative memories, joint transform correlators, image addition and subtraction via the Talbot effect, and optical wavelet-matched filters. (No individual items are abstracted in this volume)

  8. Optical computing and neural networks; Proceedings of the Meeting, National Chiao Tung Univ., Hsinchu, Taiwan, Dec. 16, 17, 1992

    NASA Astrophysics Data System (ADS)

    Hsu, Ken-Yuh; Liu, Hua-Kuang

    The present conference discusses optical neural networks, photorefractive nonlinear optics, optical pattern recognition, digital and analog processors, and holography and its applications. Attention is given to bifurcating optical information processing, neural structures in digital halftoning, an exemplar-based optical neural net classifier for color pattern recognition, volume storage in photorefractive disks, and microlaser-based compact optical neuroprocessors. Also treated are a feature-enhanced optical interpattern-associative neural network model and its optical implementation, an optical pattern binary dual-rail logic gate module, a theoretical analysis for holographic associative memories, joint transform correlators, image addition and subtraction via the Talbot effect, and optical wavelet-matched filters. (No individual items are abstracted in this volume)

  9. Distinct neural substrates for visual short-term memory of actions.

    PubMed

    Cai, Ying; Urgolites, Zhisen; Wood, Justin; Chen, Chuansheng; Li, Siyao; Chen, Antao; Xue, Gui

    2018-06-26

    Fundamental theories of human cognition have long posited that the short-term maintenance of actions is supported by one of the "core knowledge" systems of human visual cognition, yet its neural substrates are still not well understood. In particular, it is unclear whether the visual short-term memory (VSTM) of actions has distinct neural substrates or, as proposed by the spatio-object architecture of VSTM, shares them with VSTM of objects and spatial locations. In two experiments, we tested these two competing hypotheses by directly contrasting the neural substrates for VSTM of actions with those for objects and locations. Our results showed that the bilateral middle temporal cortex (MT) was specifically involved in VSTM of actions because its activation and its functional connectivity with the frontoparietal network (FPN) were only modulated by the memory load of actions, but not by that of objects/agents or locations. Moreover, the brain regions involved in the maintenance of spatial location information (i.e., superior parietal lobule, SPL) were also recruited during the maintenance of actions, consistent with the temporal-spatial nature of actions. Meanwhile, the FPN was commonly involved in all types of VSTM and showed flexible functional connectivity with the domain-specific regions, depending on the current working memory tasks. Together, our results provide clear evidence for a distinct neural system for maintaining actions in VSTM, which supports the core knowledge system theory and the domain-specific and domain-general architectures of VSTM. © 2018 Wiley Periodicals, Inc.

  10. Auditory short-term memory in the primate auditory cortex.

    PubMed

    Scott, Brian H; Mishkin, Mortimer

    2016-06-01

    Sounds are fleeting, and assembling the sequence of inputs at the ear into a coherent percept requires auditory memory across various time scales. Auditory short-term memory comprises at least two components: an active 'working memory' bolstered by rehearsal, and a sensory trace that may be passively retained. Working memory relies on representations recalled from long-term memory, and their rehearsal may require phonological mechanisms unique to humans. The sensory component, passive short-term memory (pSTM), is tractable to study in nonhuman primates, whose brain architecture and behavioral repertoire are comparable to our own. This review discusses recent advances in the behavioral and neurophysiological study of auditory memory with a focus on single-unit recordings from macaque monkeys performing delayed-match-to-sample (DMS) tasks. Monkeys appear to employ pSTM to solve these tasks, as evidenced by the impact of interfering stimuli on memory performance. In several regards, pSTM in monkeys resembles pitch memory in humans, and may engage similar neural mechanisms. Neural correlates of DMS performance have been observed throughout the auditory and prefrontal cortex, defining a network of areas supporting auditory STM with parallels to that supporting visual STM. These correlates include persistent neural firing, or a suppression of firing, during the delay period of the memory task, as well as suppression or (less commonly) enhancement of sensory responses when a sound is repeated as a 'match' stimulus. Auditory STM is supported by a distributed temporo-frontal network in which sensitivity to stimulus history is an intrinsic feature of auditory processing. This article is part of a Special Issue entitled SI: Auditory working memory. Published by Elsevier B.V.

  11. Spatiotemporal discrimination in neural networks with short-term synaptic plasticity

    NASA Astrophysics Data System (ADS)

    Shlaer, Benjamin; Miller, Paul

    2015-03-01

    Cells in recurrently connected neural networks exhibit bistability, which allows for stimulus information to persist in a circuit even after stimulus offset, i.e. short-term memory. However, such a system does not have enough hysteresis to encode temporal information about the stimuli. The biophysically described phenomenon of synaptic depression decreases synaptic transmission strengths due to increased presynaptic activity. This short-term reduction in synaptic strengths can destabilize attractor states in excitatory recurrent neural networks, causing the network to move along stimulus dependent dynamical trajectories. Such a network can successfully separate amplitudes and durations of stimuli from the number of successive stimuli (Stimulus number, duration and intensity encoding in randomly connected attractor networks with synaptic depression, Front. Comput. Neurosci. 7:59), and so provides a strong candidate network for the encoding of spatiotemporal information. Here we explicitly demonstrate the capability of a recurrent neural network with short-term synaptic depression to discriminate between the temporal sequences in which spatial stimuli are presented.
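The destabilizing effect of synaptic depression on a persistent-activity attractor can be illustrated with a single-population rate model. This is a sketch with Tsodyks-Markram-style resource dynamics and invented constants, not the network studied in the abstract: without depression, a brief stimulus switches the population into a self-sustaining high-activity state; with depression, the same attractor collapses after stimulus offset.

```python
import numpy as np

def simulate(depression, T=3000, dt=1.0):
    """One excitatory population with recurrent feedback and, optionally, a
    depleting synaptic resource x. All constants are illustrative assumptions."""
    tau, tau_rec, U, w = 20.0, 500.0, 0.5, 1.2
    f = lambda h: 1.0 / (1.0 + np.exp(-8.0 * (h - 0.5)))   # rate nonlinearity
    r, x = 0.0, 1.0
    for t in range(T):
        I = 1.0 if t < 200 else 0.0              # brief stimulus, then offset
        r += dt * (-r + f(w * x * r + I)) / tau  # recurrent drive scaled by x
        if depression:
            # Resource recovers with tau_rec and is consumed by activity.
            x += dt * ((1.0 - x) / tau_rec - U * x * r / 25.0)
    return r

r_bistable = simulate(depression=False)   # activity persists: short-term memory
r_depressed = simulate(depression=True)   # attractor destabilized: activity decays
```

In the full recurrent network of the abstract the same mechanism, acting on many competing attractors, turns static fixed points into stimulus-dependent trajectories, which is what allows temporal sequences to be discriminated.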

  12. An intelligent control system for failure detection and controller reconfiguration

    NASA Technical Reports Server (NTRS)

    Biswas, Saroj K.

    1994-01-01

    We present an architecture of an intelligent restructurable control system to automatically detect failure of system components, assess its impact on system performance and safety, and reconfigure the controller for performance recovery. Fault detection is based on neural network associative memories and pattern classifiers, and is implemented using a multilayer feedforward network. Details of the fault detection network along with simulation results on health monitoring of a dc motor have been presented. Conceptual developments for fault assessment using an expert system and controller reconfiguration using a neural network are outlined.

  13. Neural organization of linguistic short-term memory is sensory modality-dependent: evidence from signed and spoken language.

    PubMed

    Pa, Judy; Wilson, Stephen M; Pickell, Herbert; Bellugi, Ursula; Hickok, Gregory

    2008-12-01

    Despite decades of research, there is still disagreement regarding the nature of the information that is maintained in linguistic short-term memory (STM). Some authors argue for abstract phonological codes, whereas others argue for more general sensory traces. We assess these possibilities by investigating linguistic STM in two distinct sensory-motor modalities, spoken and signed language. Hearing bilingual participants (native in English and American Sign Language) performed equivalent STM tasks in both languages during functional magnetic resonance imaging. Distinct, sensory-specific activations were seen during the maintenance phase of the task for spoken versus signed language. These regions have been previously shown to respond to nonlinguistic sensory stimulation, suggesting that linguistic STM tasks recruit sensory-specific networks. However, maintenance-phase activations common to the two languages were also observed, implying some form of common process. We conclude that linguistic STM involves sensory-dependent neural networks, but suggest that sensory-independent neural networks may also exist.

  14. Design and implementation of a random neural network routing engine.

    PubMed

    Kocak, T; Seeber, J; Terzioglu, H

    2003-01-01

    Random neural network (RNN) is an analytically tractable spiked neural network model that has been implemented in software for a wide range of applications for over a decade. This paper presents the hardware implementation of the RNN model. Recently, the cognitive packet network (CPN) has been proposed as an alternative packet network architecture where there is no routing table; instead, RNN based reinforcement learning is used to route packets. Particularly, we describe implementation details for the RNN based routing engine of a CPN network processor chip: the smart packet processor (SPP). The SPP is a dual port device that stores, modifies, and interprets the defining characteristics of multiple RNN models. In addition to hardware design improvements over the software implementation such as the dual access memory, output calculation step, and reduced output calculation module, this paper introduces a major modification to the reinforcement learning algorithm used in the original CPN specification such that the number of weight terms is reduced from 2n² to 2n. This not only yields significant memory savings, but it also simplifies the calculations for the steady state probabilities (neuron outputs in RNN). Simulations have been conducted to confirm the proper functionality for the isolated SPP design as well as for multiple SPPs in a networked environment.

  15. Plastic modulation of episodic memory networks in the aging brain with cognitive decline.

    PubMed

    Bai, Feng; Yuan, Yonggui; Yu, Hui; Zhang, Zhijun

    2016-07-15

    Social-cognitive processing has been posited to underlie general functions such as episodic memory. Episodic memory impairment is a recognized hallmark of amnestic mild cognitive impairment (aMCI), a condition that carries a high risk of dementia. Three canonical networks, supporting self-referential processing, executive control processing, and salience processing, have distinct roles in episodic memory retrieval. It remains unclear whether and how these sub-networks of the episodic memory retrieval system are affected in aMCI. This task-state fMRI study constructed systems-level episodic memory retrieval sub-networks in 28 aMCI patients and 23 controls using two computational approaches: a multiple region-of-interest based approach and a voxel-level functional connectivity-based approach. The two approaches produced remarkably similar findings: the self-referential processing network made critical contributions to episodic memory retrieval in aMCI, and the most conspicuous alterations within the episodic memory retrieval network were identified in its self-referential component. To complete a given episodic memory retrieval task, aMCI patients mobilized increased cooperation between the self-referential processing network and the other sub-networks. Self-referential processing thus appears to mediate cooperation among the episodic memory retrieval sub-networks; this may support neural plasticity and could contribute to the prevention and treatment of dementia. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Criterion for correct recalls in associative-memory neural networks

    NASA Astrophysics Data System (ADS)

    Ji, Han-Bing

    1992-12-01

    A novel weighted outer-product learning (WOPL) scheme for associative memory neural networks (AMNNs) is presented. In the scheme, each fundamental memory is allocated a learning weight to direct its correct recall. Both the Hopfield and multiple-training models are instances of the WOPL model with particular sets of learning weights. A necessary condition on the learning weights for the convergence of the WOPL model is obtained from the neural dynamics, and a criterion for choosing learning weights that yield correct associative recall of the fundamental memories is proposed. An important parameter called the signal-to-noise ratio gain (SNRG) is devised, and it is found empirically that each SNRG has a threshold value: a fundamental memory is correctly recalled whenever its corresponding SNRG is greater than or equal to its threshold. Furthermore, a theorem is given, and theoretical conditions on the SNRGs and learning weights for good associative recall performance of the WOPL model are obtained. In principle, when all SNRGs or learning weights satisfy the theoretically obtained conditions, the asymptotic storage capacity of the WOPL model grows at the greatest known rate (in a well-defined stochastic sense) for AMNNs, and the WOPL model achieves correct recall of all fundamental memories. Representative computer simulations confirm the criterion and the theoretical analysis.
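    A minimal numerical sketch of weighted outer-product learning, assuming bipolar patterns and synchronous sign updates; with equal learning weights it reduces to the standard Hopfield rule. The function names are illustrative, not from the paper.

    ```python
    import numpy as np

    def wopl_weights(patterns, learn_w):
        """Weighted outer-product weight matrix: W = sum_k c_k * x_k x_k^T,
        with the self-connections (diagonal) zeroed, as in a Hopfield network."""
        W = sum(c * np.outer(p, p) for c, p in zip(learn_w, patterns))
        W = W.astype(float)
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, x, steps=10):
        """Synchronous sign-update recall from a probe state."""
        x = np.asarray(x, dtype=float)
        for _ in range(steps):
            x = np.sign(W @ x)
            x[x == 0] = 1.0   # break ties toward +1
        return x
    ```

    The idea behind the weighting is that allocating a larger learning weight to a hard-to-recall memory raises its effective signal-to-noise ratio, at the cost of the others.
    
    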

  17. 3-DIMENSIONAL Optoelectronic

    NASA Astrophysics Data System (ADS)

    Krishnamoorthy, Ashok Venketaraman

    This thesis covers the design, analysis, optimization, and implementation of optoelectronic (N,M,F) networks. (N,M,F) networks are generic space-division networks that are well suited to implementation using optoelectronic integrated circuits and free-space optical interconnects. An (N,M,F) network consists of N input channels, each having a fanout F_o; M output channels, each having a fanin F_i; and log_K(N/F) stages of K x K switches. The functionality of the fanout, switching, and fanin stages depends on the specific application. Three applications of optoelectronic (N,M,F) networks are considered. The first is an optoelectronic (N,1,1) content-addressable memory system that achieves associative recall on two-dimensional images retrieved from a parallel-access optical memory. The design and simulation of the associative memory are discussed, and an experimental emulation of a prototype system using images from a parallel-readout optical disk is presented. The system design provides superior performance to existing electronic content-addressable memory chips in terms of capacity and search rate, and uses readily available optical disk and VLSI technologies. Next, a scalable optoelectronic (N,M,F) neural network that uses free-space holographic optical interconnects is presented. The neural architecture minimizes the number of optical transmitters needed, provides accurate electronic fanin with low signal skew, and offers dendritic-type fan-in processing capability in a compact layout. Optimal data-encoding methods and circuit techniques are discussed. The implementation of a prototype optoelectronic neural system and its application to a simple recognition task are demonstrated. Finally, the design, analysis, and optimization of an (N,N,F) self-routing, packet-switched multistage interconnection network is described. The network is suitable for parallel computing and broadband switching applications. The tradeoff between optical and electronic interconnects is examined quantitatively by varying the electronic switch size K. The performance of the (N,N,F) network as a function of the fanning parameter F is also analyzed. It is shown that optoelectronic (N,N,F) networks provide a range of performance-cost alternatives, and offer superior performance per cost relative to fully electronic switching networks and previous network designs.

  18. Passivity analysis for uncertain BAM neural networks with time delays and reaction-diffusions

    NASA Astrophysics Data System (ADS)

    Zhou, Jianping; Xu, Shengyuan; Shen, Hao; Zhang, Baoyong

    2013-08-01

    This article deals with the problem of passivity analysis for delayed reaction-diffusion bidirectional associative memory (BAM) neural networks with weight uncertainties. By using a new integral inequality, we first present a passivity condition for the nominal networks, and then extend the result to the case with linear fractional weight uncertainties. The proposed conditions are expressed in terms of linear matrix inequalities, and thus can be checked easily. Examples are provided to demonstrate the effectiveness of the proposed results.

  19. Differential Neural Activity during Search of Specific and General Autobiographical Memories elicited by Musical Cues

    PubMed Central

    Ford, Jaclyn Hennessey; Addis, Donna Rose; Giovanello, Kelly S.

    2011-01-01

    Previous neuroimaging studies that have examined autobiographical memory specificity have utilized retrieval cues associated with prior searches of the event, potentially changing the retrieval processes being investigated. In the current study, musical cues were used to naturally elicit memories from multiple levels of specificity (i.e., lifetime period, general event, and event-specific). Sixteen young adults participated in a neuroimaging study in which they retrieved autobiographical memories associated with musical cues. These musical cues led to the retrieval of highly emotional memories that had low levels of prior retrieval. Retrieval of all autobiographical memory levels was associated with activity in regions in the autobiographical memory network, specifically the ventromedial prefrontal cortex, posterior cingulate, and right medial temporal lobe. Owing to the use of music, memories from varying levels of specificity were retrieved, allowing for comparison of event memory and abstract personal knowledge, as well as comparison of specific and general event memory. Dorsolateral and dorsomedial prefrontal regions were engaged during event retrieval relative to personal knowledge retrieval, and retrieval of specific event memories was associated with increased activity in the bilateral medial temporal lobe and dorsomedial prefrontal cortex relative to retrieval of general event memories. These results suggest that the initial search processes for memories of different specificity levels preferentially engage different components of the autobiographical memory network. The potential underlying causes of these neural differences are discussed. PMID:21600227

  20. How to Compress Sequential Memory Patterns into Periodic Oscillations: General Reduction Rules

    PubMed Central

    Zhang, Kechen

    2017-01-01

    A neural network with symmetric reciprocal connections always admits a Lyapunov function, whose minima correspond to the memory states stored in the network. Networks with suitable asymmetric connections can store and retrieve a sequence of memory patterns, but their dynamics cannot be characterized as readily as those of symmetric networks, owing to the lack of established general methods. Here, a reduction method is developed for a class of asymmetric attractor networks that store sequences of activity patterns as associative memories, as in a Hopfield network. The method projects the original activity pattern of the network onto a low-dimensional space such that sequential memory retrievals in the original network correspond to periodic oscillations in the reduced system. The reduced system is self-contained and provides quantitative information about the stability and speed of sequential memory retrievals in the original network. The time evolution of the overlaps between the network state and the stored memory patterns can also be determined from extended reduced systems. The reduction procedure can be summarized by a few reduction rules, which are applied to several network models, including coupled networks and networks with time-delayed connections; the analytical solutions of the reduced systems are confirmed by numerical simulations of the original networks. Finally, a local learning rule that provides an approximation to the connection weights involving the pseudoinverse is also presented. PMID:24877729
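    The kind of asymmetric sequence-storing network analyzed here can be illustrated with a minimal discrete-time toy: a cross-correlation Hebbian store that wires each pattern to its successor, standing in for the class of models treated in the paper. The sizes and random seed below are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 200, 4                         # neurons, sequence length
    X = rng.choice([-1, 1], size=(m, n))  # stored sequence of bipolar patterns

    # Asymmetric Hebbian weights: pattern k is wired to recall pattern k+1
    # (cyclically), so the network state traverses the sequence as a
    # periodic orbit rather than settling into a fixed point.
    W = sum(np.outer(X[(k + 1) % m], X[k]) for k in range(m)) / n

    x = X[0].astype(float)
    states = []
    for _ in range(m):                    # m synchronous updates = one full cycle
        x = np.sign(W @ x)
        states.append(x)
    ```

    The reduced systems described in the abstract track exactly the overlaps X @ x / n of such a trajectory with the stored patterns, which here rotate periodically through the sequence.
    
    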

  1. Functional expansion representations of artificial neural networks

    NASA Technical Reports Server (NTRS)

    Gray, W. Steven

    1992-01-01

    In the past few years, significant interest has developed in using artificial neural networks to model and control nonlinear dynamical systems. While there exist many proposed schemes for accomplishing this, and a wealth of supporting empirical results, most approaches to date tend to be ad hoc in nature and rely mainly on heuristic justifications. The purpose of this project was to further develop analytical tools for representing nonlinear discrete-time input-output systems which, when applied to neural networks, would give insight into architecture selection, pruning strategies, and learning algorithms. A long-term goal is to determine in what sense, if any, a neural network can be used as a universal approximator for nonlinear input-output maps with memory (i.e., those realized by a dynamical system). This property is well known for the case of static, or memoryless, input-output maps. The general architecture under consideration in this project was a single-input, single-output recurrent feedforward network.

  2. Are There Multiple Kinds of Episodic Memory? An fMRI Investigation Comparing Autobiographical and Recognition Memory Tasks.

    PubMed

    Chen, Hung-Yu; Gilmore, Adrian W; Nelson, Steven M; McDermott, Kathleen B

    2017-03-08

    What brain regions underlie retrieval from episodic memory? The bulk of research addressing this question with fMRI has relied upon recognition memory for materials encoded within the laboratory. Another, less dominant tradition has used autobiographical methods, whereby people recall events from their lifetime, often after being cued with words or pictures. The current study addresses how the neural substrates of successful memory retrieval differed as a function of the targeted memory when the experimental parameters were held constant in the two conditions (except for instructions). Human participants studied a set of scenes and then took two types of memory test while undergoing fMRI scanning. In one condition (the picture memory test), participants reported for each scene (32 studied, 64 nonstudied) whether it was recollected from the prior study episode. In a second condition (the life memory test), participants reported for each scene (32 studied, 64 nonstudied) whether it reminded them of a specific event from their preexperimental lifetime. An examination of successful retrieval (yes responses) for recently studied scenes for the two test types revealed pronounced differences; that is, autobiographical retrieval instantiated with the life memory test preferentially activated the default mode network, whereas hits in the picture memory test preferentially engaged the parietal memory network as well as portions of the frontoparietal control network. When experimental cueing parameters are held constant, the neural underpinnings of successful memory retrieval differ when remembering life events and recently learned events. SIGNIFICANCE STATEMENT Episodic memory is often discussed as a solitary construct. However, experimental traditions examining episodic memory use very different approaches, and these are rarely compared to one another. 
When the neural correlates associated with each approach have been directly contrasted, results have varied considerably and at times contradicted each other. The present experiment was designed to match the two primary approaches to studying episodic memory in an unparalleled manner. Results suggest a clear separation of systems supporting memory as it is typically tested in the laboratory and memory as assessed under autobiographical retrieval conditions. These data provide neurobiological evidence that episodic memory is not a single construct, challenging the degree to which different experimental traditions are studying the same construct. Copyright © 2017 the authors 0270-6474/17/372764-12$15.00/0.

  3. Black Holes as Brains: Neural Networks with Area Law Entropy

    NASA Astrophysics Data System (ADS)

    Dvali, Gia

    2018-04-01

    Motivated by the potential similarities between the underlying mechanisms of the enhanced memory storage capacity in black holes and in brain networks, we construct an artificial quantum neural network based on gravity-like synaptic connections and a symmetry structure that allows the network to be described in terms of the geometry of a d-dimensional space. We show that the network possesses a critical state in which gapless neurons emerge that appear to inhabit a (d-1)-dimensional surface, with their number given by the surface area. In the excitations of these neurons, the network can store and retrieve an exponentially large number of patterns within an arbitrarily narrow energy gap. The corresponding micro-state entropy of the brain network exhibits an area law. The neural network can be described in terms of a quantum field by identifying the different neurons with the different momentum modes of the field, and the synaptic connections among the neurons with the interactions among the corresponding momentum modes. Such a mapping attributes a well-defined sense of geometry to an intrinsically non-local system such as the neural network and, vice versa, allows the quantum field model to be represented as a neural network.

  4. Memory and learning behaviors mimicked in nanogranular SiO2-based proton conductor gated oxide-based synaptic transistors

    NASA Astrophysics Data System (ADS)

    Wan, Chang Jin; Zhu, Li Qiang; Zhou, Ju Mei; Shi, Yi; Wan, Qing

    2013-10-01

    In neuroscience, signal processing, memory and learning function are established in the brain by modifying ionic fluxes in neurons and synapses. Emulation of memory and learning behaviors of biological systems by nanoscale ionic/electronic devices is highly desirable for building neuromorphic systems or even artificial neural networks. Here, novel artificial synapses based on junctionless oxide-based protonic/electronic hybrid transistors gated by nanogranular phosphorus-doped SiO2-based proton-conducting films are fabricated on glass substrates by a room-temperature process. Short-term memory (STM) and long-term memory (LTM) are mimicked by tuning the pulse gate voltage amplitude. The LTM process in such an artificial synapse is due to the proton-related interfacial electrochemical reaction. Our results are highly desirable for building future neuromorphic systems or even artificial networks via electronic elements. Electronic supplementary information (ESI) available. See DOI: 10.1039/c3nr02987e

  5. Effective visual working memory capacity: an emergent effect from the neural dynamics in an attractor network.

    PubMed

    Dempere-Marco, Laura; Melcher, David P; Deco, Gustavo

    2012-01-01

    The study of working memory capacity is of utmost importance in cognitive psychology, as working memory is at the basis of general cognitive function. Although the working memory capacity limit has been thoroughly studied, its origin still remains a matter of strong debate. Only recently has the role of visual saliency in modulating working memory storage capacity been assessed experimentally and shown to provide valuable insights into working memory function. In the computational arena, attractor networks have successfully accounted for psychophysical and neurophysiological data in numerous working memory tasks, given their ability to produce a sustained elevated firing rate during a delay period. Here we investigate the mechanisms underlying working memory capacity by means of a biophysically realistic attractor network with spiking neurons, while accounting for two recent experimental observations: 1) the presence of a visually salient item reduces the number of items that can be held in working memory, and 2) visually salient items are commonly kept in memory at the cost of not keeping as many non-salient items. Our model suggests that working memory capacity is determined by two fundamental processes: encoding of visual items into working memory and maintenance of the encoded items upon their removal from the visual display. While maintenance critically depends on the constraints that lateral inhibition imposes on the mnemonic activity, encoding is limited by the ability of the stimulated neural assemblies to reach a sufficiently high level of excitation, a process governed by the dynamics of competition and cooperation among neuronal pools. Encoding is therefore contingent upon the visual working memory task, and has led us to introduce the concept of effective working memory capacity (eWMC), in contrast to the maximal upper capacity limit reached only under ideal conditions.

  6. Effective Visual Working Memory Capacity: An Emergent Effect from the Neural Dynamics in an Attractor Network

    PubMed Central

    Dempere-Marco, Laura; Melcher, David P.; Deco, Gustavo

    2012-01-01

    The study of working memory capacity is of utmost importance in cognitive psychology, as working memory is at the basis of general cognitive function. Although the working memory capacity limit has been thoroughly studied, its origin still remains a matter of strong debate. Only recently has the role of visual saliency in modulating working memory storage capacity been assessed experimentally and shown to provide valuable insights into working memory function. In the computational arena, attractor networks have successfully accounted for psychophysical and neurophysiological data in numerous working memory tasks, given their ability to produce a sustained elevated firing rate during a delay period. Here we investigate the mechanisms underlying working memory capacity by means of a biophysically realistic attractor network with spiking neurons, while accounting for two recent experimental observations: 1) the presence of a visually salient item reduces the number of items that can be held in working memory, and 2) visually salient items are commonly kept in memory at the cost of not keeping as many non-salient items. Our model suggests that working memory capacity is determined by two fundamental processes: encoding of visual items into working memory and maintenance of the encoded items upon their removal from the visual display. While maintenance critically depends on the constraints that lateral inhibition imposes on the mnemonic activity, encoding is limited by the ability of the stimulated neural assemblies to reach a sufficiently high level of excitation, a process governed by the dynamics of competition and cooperation among neuronal pools. Encoding is therefore contingent upon the visual working memory task, and has led us to introduce the concept of effective working memory capacity (eWMC), in contrast to the maximal upper capacity limit reached only under ideal conditions. PMID:22952608

  7. A system of IAC neural networks as the basis for self-organization in a sociological dynamical system simulation.

    PubMed

    Duong, D V; Reilly, K D

    1995-10-01

    This sociological simulation uses the ideas of semiotics and symbolic interactionism to demonstrate how an appropriately developed associative memory in the minds of individuals on the microlevel can self-organize into macrolevel dissipative structures of societies such as racial cultural/economic classes, status symbols and fads. The associative memory used is based on an extension of the IAC neural network (the Interactive Activation and Competition network). Several IAC networks act together to form a society by virtue of their human-like properties of intuition and creativity. These properties give them the ability to create and understand signs, which lead to the macrolevel structures of society. This system is implemented in hierarchical object oriented container classes which facilitate change in deep structure. Graphs of general trends and an historical account of a simulation run of this dynamical system are presented.

  8. Learning Traffic as Images: A Deep Convolutional Neural Network for Large-Scale Transportation Network Speed Prediction.

    PubMed

    Ma, Xiaolei; Dai, Zhuang; He, Zhengbing; Ma, Jihui; Wang, Yong; Wang, Yunpeng

    2017-04-10

    This paper proposes a convolutional neural network (CNN)-based method that learns traffic as images and predicts large-scale, network-wide traffic speed with high accuracy. Spatiotemporal traffic dynamics are converted to images describing the time and space relations of traffic flow via a two-dimensional time-space matrix. A CNN is applied to the image in two consecutive steps: abstract traffic feature extraction and network-wide traffic speed prediction. The effectiveness of the proposed method is evaluated on two real-world transportation networks, the second ring road and the north-east transportation network in Beijing, and compared with four prevailing algorithms, namely, ordinary least squares, k-nearest neighbors, artificial neural network, and random forest, and three deep learning architectures, namely, stacked autoencoder, recurrent neural network, and long short-term memory network. The results show that the proposed method outperforms the other algorithms by an average accuracy improvement of 42.91% within an acceptable execution time. The CNN can train the model in a reasonable time and, thus, is suitable for large-scale transportation networks.
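    The "traffic as images" conversion can be sketched as follows: stack per-segment speed time series into a two-dimensional time-space matrix and normalize it like pixel intensities. This is a hypothetical illustration of the general idea; the normalization ceiling and function name are assumptions, not values from the paper.

    ```python
    import numpy as np

    def to_traffic_image(speeds, v_max=120.0):
        """Map a time-space speed matrix to normalized 'pixel' intensities.

        speeds: array-like of shape (T, S) -- T time steps by S road segments,
        entries in km/h. Returns values in [0, 1], usable as one input channel
        of a 2-D CNN. v_max is an assumed free-flow ceiling.
        """
        M = np.asarray(speeds, dtype=float)
        return np.clip(M / v_max, 0.0, 1.0)
    ```

    A CNN then convolves over this (T, S) "image", so each kernel spans neighboring time steps and road segments at once, which is what lets it extract spatiotemporal traffic features jointly.
    
    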

  9. Learning Traffic as Images: A Deep Convolutional Neural Network for Large-Scale Transportation Network Speed Prediction

    PubMed Central

    Ma, Xiaolei; Dai, Zhuang; He, Zhengbing; Ma, Jihui; Wang, Yong; Wang, Yunpeng

    2017-01-01

    This paper proposes a convolutional neural network (CNN)-based method that learns traffic as images and predicts large-scale, network-wide traffic speed with high accuracy. Spatiotemporal traffic dynamics are converted to images describing the time and space relations of traffic flow via a two-dimensional time-space matrix. A CNN is applied to the image in two consecutive steps: abstract traffic feature extraction and network-wide traffic speed prediction. The effectiveness of the proposed method is evaluated on two real-world transportation networks, the second ring road and the north-east transportation network in Beijing, and compared with four prevailing algorithms, namely, ordinary least squares, k-nearest neighbors, artificial neural network, and random forest, and three deep learning architectures, namely, stacked autoencoder, recurrent neural network, and long short-term memory network. The results show that the proposed method outperforms the other algorithms by an average accuracy improvement of 42.91% within an acceptable execution time. The CNN can train the model in a reasonable time and, thus, is suitable for large-scale transportation networks. PMID:28394270

  10. `Unlearning' has a stabilizing effect in collective memories

    NASA Astrophysics Data System (ADS)

    Hopfield, J. J.; Feinstein, D. I.; Palmer, R. G.

    1983-07-01

    Crick and Mitchison [1] have presented a hypothesis for the functional role of dream sleep involving an `unlearning' process. We have independently carried out mathematical and computer modelling of learning and `unlearning' in a collective neural network of 30-1,000 neurones. The model network has a content-addressable memory or `associative memory' which allows it to learn and store many memories. A particular memory can be evoked in its entirety when the network is stimulated by any adequate-sized subpart of the information of that memory [2]. But different memories of the same size are not equally easy to recall. Also, when memories are learned, spurious memories are also created and can also be evoked. Applying an `unlearning' process, similar to the learning processes but with a reversed sign and starting from a noise input, enhances the performance of the network in accessing real memories and in minimizing spurious ones. Although our model was not motivated by higher nervous function, our system displays behaviours which are strikingly parallel to those needed for the hypothesized role of `unlearning' in rapid eye movement (REM) sleep.
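    The learn/`unlearn' loop described here can be sketched in a minimal Hopfield-style toy: store memories with the Hebbian outer-product rule, then repeatedly settle from noise and apply the same rule with reversed sign to whatever attractor is reached. The network size, learning scale, and number of unlearning passes below are arbitrary choices, not the authors' exact procedure.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, m = 100, 5
    X = rng.choice([-1, 1], size=(m, n))   # fundamental memories

    W = (X.T @ X) / n                      # Hebbian (outer-product) learning
    np.fill_diagonal(W, 0.0)

    def settle(W, x, steps=20):
        """Iterate synchronous sign updates toward an attractor."""
        x = np.asarray(x, dtype=float)
        for _ in range(steps):
            x = np.sign(W @ x)
            x[x == 0] = 1.0
        return x

    # 'Unlearning': start from noise, settle into whatever (possibly spurious)
    # attractor is reached, then apply the Hebbian rule with reversed sign.
    eps = 0.01
    for _ in range(50):
        a = settle(W, rng.choice([-1, 1], size=n))
        W -= eps * np.outer(a, a) / n
        np.fill_diagonal(W, 0.0)
    ```

    Because spurious attractors tend to have large basins reachable from noise, this reversed-sign update preferentially weakens them, which is the stabilizing effect the abstract reports.
    
    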

  11. Improved Autoassociative Neural Networks

    NASA Technical Reports Server (NTRS)

    Hand, Charles

    2003-01-01

    Improved autoassociative neural networks, denoted nexi, have been proposed for use in controlling autonomous robots, including mobile exploratory robots of the biomorphic type. In comparison with conventional autoassociative neural networks, nexi would be more complex but more capable in that they could be trained to do more complex tasks. A nexus would use bit weights and simple arithmetic in a manner that would enable training and operation without a central processing unit, programs, weight registers, or large amounts of memory. Only a relatively small amount of memory (to hold the bit weights) and a simple logic application-specific integrated circuit would be needed. A description of autoassociative neural networks is prerequisite to a meaningful description of a nexus. An autoassociative network is a set of neurons that are completely connected in the sense that each neuron receives input from, and sends output to, all the other neurons. (In some instantiations, a neuron could also send output back to its own input terminal.) The state of a neuron is completely determined by the inner product of its inputs with weights associated with its input channel. Setting the weights sets the behavior of the network. The neurons of an autoassociative network are usually regarded as comprising a row or vector. Time is a quantized phenomenon for most autoassociative networks in the sense that time proceeds in discrete steps. At each time step, the row of neurons forms a pattern: some neurons are firing, some are not. Hence, the current state of an autoassociative network can be described with a single binary vector. As time goes by, the network changes the vector. Autoassociative networks move vectors over hyperspace landscapes of possibilities.
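    The state rule described above, each neuron's next state determined by the inner product of the current state vector with that neuron's weight row, can be written compactly. This is a generic illustration of one synchronous update step with small integer ("bit") weights, not the nexus circuit itself; the function name and threshold are hypothetical.

    ```python
    def autoassoc_step(weights, state, theta=0):
        """One synchronous time step of a binary autoassociative network.

        weights: n x n list of rows of small integer ('bit') weights;
        state: list of 0/1 firing values. Each neuron fires on the next step
        iff the inner product of the current state with its weight row
        reaches the threshold theta.
        """
        return [1 if sum(w * s for w, s in zip(row, state)) >= theta else 0
                for row in weights]
    ```

    Iterating this step is what "moves vectors over hyperspace landscapes": the binary state vector hops between corners of the n-dimensional hypercube until it settles into (or cycles around) an attractor.
    
    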

  12. Biologically plausible learning in recurrent neural networks reproduces neural dynamics observed during cognitive tasks

    PubMed Central

    Miconi, Thomas

    2017-01-01

    Neural activity during cognitive tasks exhibits complex dynamics that flexibly encode task-relevant variables. Chaotic recurrent networks, which spontaneously generate rich dynamics, have been proposed as a model of cortical computation during cognitive tasks. However, existing methods for training these networks are either biologically implausible, and/or require a continuous, real-time error signal to guide learning. Here we show that a biologically plausible learning rule can train such recurrent networks, guided solely by delayed, phasic rewards at the end of each trial. Networks endowed with this learning rule can successfully learn nontrivial tasks requiring flexible (context-dependent) associations, memory maintenance, nonlinear mixed selectivities, and coordination among multiple outputs. The resulting networks replicate complex dynamics previously observed in animal cortex, such as dynamic encoding of task features and selective integration of sensory inputs. We conclude that recurrent neural networks offer a plausible model of cortical dynamics during both learning and performance of flexible behavior. DOI: http://dx.doi.org/10.7554/eLife.20899.001 PMID:28230528

  13. Biologically plausible learning in recurrent neural networks reproduces neural dynamics observed during cognitive tasks.

    PubMed

    Miconi, Thomas

    2017-02-23

    Neural activity during cognitive tasks exhibits complex dynamics that flexibly encode task-relevant variables. Chaotic recurrent networks, which spontaneously generate rich dynamics, have been proposed as a model of cortical computation during cognitive tasks. However, existing methods for training these networks are either biologically implausible, and/or require a continuous, real-time error signal to guide learning. Here we show that a biologically plausible learning rule can train such recurrent networks, guided solely by delayed, phasic rewards at the end of each trial. Networks endowed with this learning rule can successfully learn nontrivial tasks requiring flexible (context-dependent) associations, memory maintenance, nonlinear mixed selectivities, and coordination among multiple outputs. The resulting networks replicate complex dynamics previously observed in animal cortex, such as dynamic encoding of task features and selective integration of sensory inputs. We conclude that recurrent neural networks offer a plausible model of cortical dynamics during both learning and performance of flexible behavior.

  14. A balanced memory network.

    PubMed

    Roudi, Yasser; Latham, Peter E

    2007-09-01

    A fundamental problem in neuroscience is understanding how working memory--the ability to store information at intermediate timescales, like tens of seconds--is implemented in realistic neuronal networks. The most likely candidate mechanism is the attractor network, and a great deal of effort has gone toward investigating it theoretically. Yet, despite almost a quarter century of intense work, attractor networks are not fully understood. In particular, there are still two unanswered questions. First, how is it that attractor networks exhibit irregular firing, as is observed experimentally during working memory tasks? And second, how many memories can be stored under biologically realistic conditions? Here we answer both questions by studying an attractor neural network in which inhibition and excitation balance each other. Using mean-field analysis, we derive a three-variable description of attractor networks. From this description it follows that irregular firing can exist only if the number of neurons involved in a memory is large. The same mean-field analysis also shows that the number of memories that can be stored in a network scales with the number of excitatory connections, a result that has been suggested for simple models but never shown for realistic ones. Both of these predictions are verified using simulations with large networks of spiking neurons.
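
    The attractor mechanism discussed above can be caricatured in a single-population rate model: recurrent excitation against a fixed inhibitory bias creates two stable states, so a transient stimulus leaves a persistent trace. This is a deliberately minimal sketch with illustrative parameters; the paper analyses large spiking networks in which excitation and inhibition balance dynamically.

```python
import numpy as np

def f(x):
    # rectified saturating gain function (illustrative choice)
    return np.tanh(np.maximum(x, 0.0))

def simulate(stim_on, w_exc=2.0, inh=0.5, steps=300, dt=0.1):
    """Euler-integrate a one-population rate model with self-excitation."""
    r = 0.0
    for t in range(steps):
        inp = 0.8 if (stim_on and t < 50) else 0.0   # brief stimulus
        r += dt * (-r + f(w_exc * r - inh + inp))
    return r

persistent = simulate(stim_on=True)    # transient stimulus -> high state
baseline = simulate(stim_on=False)     # no stimulus -> stays at rest
```

    The stimulus pushes the rate past the unstable fixed point, after which recurrent excitation sustains activity on its own: this persistence is the working-memory property the abstract's mean-field analysis characterises in far richer spiking networks.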

  15. Hippocampal and posterior parietal contributions to developmental increases in visual short-term memory capacity.

    PubMed

    von Allmen, David Yoh; Wurmitzer, Karoline; Klaver, Peter

    2014-10-01

    Developmental increases in visual short-term memory (VSTM) capacity have been associated with changes in attention processing limitations and changes in neural activity within neural networks including the posterior parietal cortex (PPC). A growing body of evidence suggests that the hippocampus plays a role in VSTM, but it is unknown whether the hippocampus contributes to the capacity increase across development. We investigated the functional development of the hippocampus and PPC in 57 children, adolescents and adults (age 8-27 years) who performed a visuo-spatial change detection task. A negative relationship between age and VSTM related activity was found in the right posterior hippocampus that was paralleled by a positive age-activity relationship in the right PPC. In the posterior hippocampus, VSTM related activity predicted individual capacity in children, whereas neural activity in the right anterior hippocampus predicted individual capacity in adults. The findings provide the first evidence that VSTM development is supported by an integrated neural network that involves hippocampal and posterior parietal regions.

  16. Excitation-neurogenesis coupling in adult neural stem/progenitor cells.

    PubMed

    Deisseroth, Karl; Singla, Sheela; Toda, Hiroki; Monje, Michelle; Palmer, Theo D; Malenka, Robert C

    2004-05-27

    A wide variety of in vivo manipulations influence neurogenesis in the adult hippocampus. It is not known, however, if adult neural stem/progenitor cells (NPCs) can intrinsically sense excitatory neural activity and thereby implement a direct coupling between excitation and neurogenesis. Moreover, the theoretical significance of activity-dependent neurogenesis in hippocampal-type memory processing networks has not been explored. Here we demonstrate that excitatory stimuli act directly on adult hippocampal NPCs to favor neuron production. The excitation is sensed via Ca(v)1.2/1.3 (L-type) Ca(2+) channels and NMDA receptors on the proliferating precursors. Excitation through this pathway acts to inhibit expression of the glial fate genes Hes1 and Id2 and increase expression of NeuroD, a positive regulator of neuronal differentiation. These activity-sensing properties of the adult NPCs, when applied as an "excitation-neurogenesis coupling rule" within a Hebbian neural network, predict significant advantages for both the temporary storage and the clearance of memories.

  17. Neural Correlates of Confidence during Item Recognition and Source Memory Retrieval: Evidence for Both Dual-Process and Strength Memory Theories

    ERIC Educational Resources Information Center

    Hayes, Scott M.; Buchler, Norbou; Stokes, Jared; Kragel, James; Cabeza, Roberto

    2011-01-01

    Although the medial-temporal lobes (MTL), PFC, and parietal cortex are considered primary nodes in the episodic memory network, there is much debate regarding the contributions of MTL, PFC, and parietal subregions to recollection versus familiarity (dual-process theory) and the feasibility of accounts on the basis of a single memory strength…

  18. Stability in Cohen Grossberg-type bidirectional associative memory neural networks with time-varying delays

    NASA Astrophysics Data System (ADS)

    Cao, Jinde; Song, Qiankun

    2006-07-01

    In this paper, the exponential stability problem is investigated for a class of Cohen-Grossberg-type bidirectional associative memory neural networks with time-varying delays. By using analysis methods, inequality techniques, and the properties of an M-matrix, several novel sufficient conditions ensuring the existence, uniqueness and global exponential stability of the equilibrium point are derived. Moreover, the exponential convergence rate is estimated. The obtained results are less restrictive than those given in the earlier literature: the requirements of boundedness and differentiability of the activation functions, and of differentiability of the time-varying delays, are removed. Two examples with their simulations are given to show the effectiveness of the obtained results.
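
    The bidirectional associative memory architecture these stability results concern can be sketched in its simplest discrete (Kosko-style) form: pattern pairs are stored in an outer-product weight matrix and recalled by iterating between the two layers. The patterns below are illustrative; the paper itself treats the continuous Cohen-Grossberg variant with time-varying delays.

```python
import numpy as np

# Two bipolar pattern pairs (x_i, y_i) stored in one weight matrix.
X = np.array([[1, -1, 1, -1, 1, -1],
              [1, 1, -1, -1, 1, 1]])
Y = np.array([[1, 1, -1],
              [-1, 1, 1]])
W = X.T @ Y                      # outer-product (correlation) weights, 6 x 3

def recall(x, steps=5):
    """Bounce activity between the two layers until the pair stabilises."""
    for _ in range(steps):
        y = np.sign(x @ W)       # forward pass: x-layer -> y-layer
        x = np.sign(y @ W.T)     # backward pass: y-layer -> x-layer
    return x, y

x0 = X[0].copy()
x0[0] = -x0[0]                   # corrupt one bit of the first stored pattern
xr, yr = recall(x0)              # bidirectional iteration cleans it up
```

    Starting from the corrupted probe, the network settles onto the stored pair, which is exactly the equilibrium behaviour whose existence and global exponential stability the paper's conditions guarantee for the continuous model.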

  19. Hardware implementation of CMAC neural network with reduced storage requirement.

    PubMed

    Ker, J S; Kuo, Y H; Wen, R C; Liu, B D

    1997-01-01

    The cerebellar model articulation controller (CMAC) neural network has the advantages of fast convergence speed and low computation complexity. However, it suffers from a low storage space utilization rate on weight memory. In this paper, we propose a direct weight address mapping approach, which can reduce the required weight memory size with a utilization rate near 100%. Based on such an address mapping approach, we developed a pipeline architecture to efficiently perform the addressing operations. The proposed direct weight address mapping approach also speeds up the computation for the generation of weight addresses. In addition, a CMAC hardware prototype for color calibration has been implemented to confirm the proposed approach and architecture.
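
    The addressing idea can be sketched in software: each of several overlapping tilings quantises the input, and the (layer, cell) pair maps directly to a weight address, so the table holds only cells that can actually be activated. This is a simplified stand-in for the paper's direct weight address mapping, with illustrative layer counts and table size:

```python
import numpy as np

N_LAYERS = 8                    # overlapping tilings; controls generalisation
CELLS = 2                       # quantisation cells per layer for x in [0, 1)
w = np.zeros(N_LAYERS * CELLS)  # the weight memory
lr = 0.2

def active_cells(x):
    # layer l quantises x on a grid shifted by l/N_LAYERS; the layer and
    # cell index together map directly to a unique weight address
    return [l * CELLS + int(x * N_LAYERS + l) // N_LAYERS
            for l in range(N_LAYERS)]

def predict(x):
    return w[active_cells(x)].sum()     # sum of the active weights

def train(x, target):
    idx = active_cells(x)
    err = target - w[idx].sum()
    w[idx] += lr * err / N_LAYERS       # spread correction over active cells

for _ in range(200):                    # cyclic LMS over three samples
    for x, t in [(0.1, 1.0), (0.5, -1.0), (0.9, 0.5)]:
        train(x, t)
```

    Nearby inputs share active cells, which gives the CMAC its local generalisation, while the direct address computation avoids storing cells that no input ever touches.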

  20. Modeling of cortical signals using echo state networks

    NASA Astrophysics Data System (ADS)

    Zhou, Hanying; Wang, Yongji; Huang, Jiangshuai

    2009-10-01

    Diverse modeling frameworks have been utilized with the ultimate goal of translating brain cortical signals into predictions of visible behavior. The inputs to these models are usually multidimensional neural recordings collected from relevant regions of a monkey's brain, while the outputs are the associated behavior, typically the 2-D or 3-D hand position of a primate. Our task here is to build a model that reconstructs movement trajectories from the neural signals collected simultaneously during the experiment. In this paper, we propose to use Echo State Networks (ESNs) to map neural firing activities onto hand positions. The ESN is a recently developed recurrent neural network (RNN) model. Besides the dynamic properties and short-term memory common to recurrent neural networks, it has a special echo state property that gives it the power to model nonlinear dynamic systems. What distinguishes it most from traditional recurrent neural networks is its learning method. In this paper we train the network with a refined version of its typical training method and obtain a better model.
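
    A minimal ESN can be sketched in a few lines: a fixed random reservoir scaled to satisfy the echo state property, with only a linear readout trained (here by ridge regression, the standard recipe; the paper's refined training method is not reproduced). The one-step sine prediction task below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

N = 100
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1: echo state condition
W_in = rng.normal(size=N)                        # fixed random input weights

def run(u):
    """Drive the reservoir with input sequence u; collect the states."""
    x = np.zeros(N)
    states = []
    for ut in u:
        x = np.tanh(W @ x + W_in * ut)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict a sine wave one step ahead from its past.
t = np.linspace(0, 8 * np.pi, 400)
u = np.sin(t)
S = run(u[:-1])
washout = 50                                     # discard initial transient
A, y = S[washout:], u[1 + washout:]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ y)  # ridge readout
pred = A @ W_out
mse = np.mean((pred - y) ** 2)
```

    Only `W_out` is learned; the recurrent weights stay fixed, which is the distinctive training method the abstract refers to.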

  1. Long-Term Memory Shapes the Primary Olfactory Center of an Insect Brain

    ERIC Educational Resources Information Center

    Hourcade, Benoit; Perisse, Emmanuel; Devaud, Jean-Marc; Sandoz, Jean-Christophe

    2009-01-01

    The storage of stable memories is generally considered to rely on changes in the functional properties and/or the synaptic connectivity of neural networks. However, these changes are not easily tractable given the complexity of the learning procedures and brain circuits studied. Such a search can be narrowed down by studying memories of specific…

  2. Cognitive Control Network Contributions to Memory-Guided Visual Attention

    PubMed Central

    Rosen, Maya L.; Stern, Chantal E.; Michalka, Samantha W.; Devaney, Kathryn J.; Somers, David C.

    2016-01-01

    Visual attentional capacity is severely limited, but humans excel in familiar visual contexts, in part because long-term memories guide efficient deployment of attention. To investigate the neural substrates that support memory-guided visual attention, we performed a set of functional MRI experiments that contrast long-term, memory-guided visuospatial attention with stimulus-guided visuospatial attention in a change detection task. Whereas the dorsal attention network was activated for both forms of attention, the cognitive control network (CCN) was preferentially activated during memory-guided attention. Three posterior nodes in the CCN, posterior precuneus, posterior callosal sulcus/mid-cingulate, and lateral intraparietal sulcus exhibited the greatest specificity for memory-guided attention. These 3 regions exhibit functional connectivity at rest, and we propose that they form a subnetwork within the broader CCN. Based on the task activation patterns, we conclude that the nodes of this subnetwork are preferentially recruited for long-term memory guidance of visuospatial attention. PMID:25750253

  3. Heuristic pattern correction scheme using adaptively trained generalized regression neural networks.

    PubMed

    Hoya, T; Chambers, J A

    2001-01-01

    In many pattern classification problems, an intelligent neural system is required which can learn newly encountered but misclassified patterns incrementally, while keeping a good classification performance over the past patterns stored in the network. In this paper, a heuristic pattern correction scheme is proposed using adaptively trained generalized regression neural networks (GRNNs). The scheme is based upon both network growing and dual-stage shrinking mechanisms. In the network growing phase, a subset of the misclassified patterns in each incoming data set is iteratively added into the network until all the patterns in the incoming data set are classified correctly. Then, the redundancy in the growing phase is removed in the dual-stage network shrinking. Both long- and short-term memory models are considered in the network shrinking, motivated by biological studies of the brain. The learning capability of the proposed scheme is investigated through extensive simulation studies.
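
    At prediction time a GRNN reduces to Nadaraya-Watson kernel regression over its stored patterns, which is what makes pattern-by-pattern growing and shrinking natural. A minimal sketch of the prediction step (the growing and dual-stage shrinking heuristics of the paper are omitted; the data and bandwidth are illustrative):

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.5):
    """GRNN output: kernel-weighted average of the stored targets."""
    d2 = np.sum((X_train - x) ** 2, axis=1)   # squared distance to each pattern
    k = np.exp(-d2 / (2 * sigma ** 2))        # RBF activation of each pattern unit
    return (k @ y_train) / k.sum()            # normalised weighted average

# Stored patterns: samples of y = x^2 (illustrative training set).
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 1.0, 4.0, 9.0])

est = grnn_predict(X, y, np.array([2.0]), sigma=0.2)
```

    Adding a misclassified pattern is just appending a row to `X` and `y`, and shrinking is deleting rows, which is why the growing/shrinking scheme fits this architecture so directly.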

  4. Alterations in memory networks in mild cognitive impairment and Alzheimer's disease: an independent component analysis.

    PubMed

    Celone, Kim A; Calhoun, Vince D; Dickerson, Bradford C; Atri, Alireza; Chua, Elizabeth F; Miller, Saul L; DePeau, Kristina; Rentz, Doreen M; Selkoe, Dennis J; Blacker, Deborah; Albert, Marilyn S; Sperling, Reisa A

    2006-10-04

    Memory function is likely subserved by multiple distributed neural networks, which are disrupted by the pathophysiological process of Alzheimer's disease (AD). In this study, we used multivariate analytic techniques to investigate memory-related functional magnetic resonance imaging (fMRI) activity in 52 individuals across the continuum of normal aging, mild cognitive impairment (MCI), and mild AD. Independent component analyses revealed specific memory-related networks that activated or deactivated during an associative memory paradigm. Across all subjects, hippocampal activation and parietal deactivation demonstrated a strong reciprocal relationship. Furthermore, we found evidence of a nonlinear trajectory of fMRI activation across the continuum of impairment. Less impaired MCI subjects showed paradoxical hyperactivation in the hippocampus compared with controls, whereas more impaired MCI subjects demonstrated significant hypoactivation, similar to the levels observed in the mild AD subjects. We found a remarkably parallel curve in the pattern of memory-related deactivation in medial and lateral parietal regions with greater deactivation in less-impaired MCI and loss of deactivation in more impaired MCI and mild AD subjects. Interestingly, the failure of deactivation in these regions was also associated with increased positive activity in a neocortical attentional network in MCI and AD. Our findings suggest that loss of functional integrity of the hippocampal-based memory systems is directly related to alterations of neural activity in parietal regions seen over the course of MCI and AD. These data may also provide functional evidence of the interaction between neocortical and medial temporal lobe pathology in early AD.

  5. Neural assembly computing.

    PubMed

    Ranhel, João

    2012-06-01

    Spiking neurons can realize several computational operations when firing cooperatively. This is a prevalent notion, although the mechanisms are not yet understood. A way by which neural assemblies compute is proposed in this paper. It is shown how neural coalitions represent things (and world states), memorize them, and control their hierarchical relations in order to perform algorithms. It is described how neural groups perform statistic logic functions as they form assemblies. Neural coalitions can reverberate, becoming bistable loops. Such bistable neural assemblies become short- or long-term memories that represent the event that triggers them. In addition, assemblies can branch and dismantle other neural groups generating new events that trigger other coalitions. Hence, such capabilities and the interaction among assemblies allow neural networks to create and control hierarchical cascades of causal activities, giving rise to parallel algorithms. Computing and algorithms are used here as in a nonstandard computation approach. In this sense, neural assembly computing (NAC) can be seen as a new class of spiking neural network machines. NAC can explain the following points: 1) how neuron groups represent things and states; 2) how they retain binary states in memories that do not require any plasticity mechanism; and 3) how branching, disbanding, and interaction among assemblies may result in algorithms and behavioral responses. Simulations were carried out and the results are in agreement with the hypothesis presented. A MATLAB code is available as a supplementary material.

  6. Anterograde episodic memory in Korsakoff syndrome.

    PubMed

    Fama, Rosemary; Pitel, Anne-Lise; Sullivan, Edith V

    2012-06-01

    A profound anterograde memory deficit for information, regardless of the nature of the material, is the hallmark of Korsakoff syndrome, an amnesic condition resulting from severe thiamine (vitamin B1) deficiency. Since the late nineteenth century when the Russian physician, S. S. Korsakoff, initially described this syndrome associated with "polyneuropathy," the observed global amnesia has been a primary focus of neuroscience and neuropsychology. In this review we highlight the historical studies that examined anterograde episodic memory processes in KS, present a timeline and evidence supporting the myriad theories proffered to account for this memory dysfunction, and summarize what is known about the neuroanatomical correlates and neural systems presumed affected in KS. Rigorous study of KS amnesia and associated memory disorders of other etiologies provide evidence for distinct mnemonic component processes and neural networks imperative for normal declarative and nondeclarative memory abilities and for mnemonic processes spared in KS, from whence emerged the appreciation that memory is not a unitary function. Debate continues regarding the qualitative and quantitative differences between KS and other amnesias and what brain regions and neural pathways are necessary and sufficient to produce KS amnesia.

  7. Anterograde Episodic Memory in Korsakoff Syndrome

    PubMed Central

    Fama, Rosemary; Pitel, Anne-Lise; Sullivan, Edith V.

    2016-01-01

    A profound anterograde memory deficit for information, regardless of the nature of the material, is the hallmark of Korsakoff syndrome, an amnesic condition resulting from severe thiamine (vitamin B1) deficiency. Since the late nineteenth century when the Russian physician, S. S. Korsakoff, initially described this syndrome associated with “polyneuropathy,” the observed global amnesia has been a primary focus of neuroscience and neuropsychology. In this review we highlight the historical studies that examined anterograde episodic memory processes in KS, present a timeline and evidence supporting the myriad theories proffered to account for this memory dysfunction, and summarize what is known about the neuroanatomical correlates and neural systems presumed affected in KS. Rigorous study of KS amnesia and associated memory disorders of other etiologies provide evidence for distinct mnemonic component processes and neural networks imperative for normal declarative and nondeclarative memory abilities and for mnemonic processes spared in KS, from whence emerged the appreciation that memory is not a unitary function. Debate continues regarding the qualitative and quantitative differences between KS and other amnesias and what brain regions and neural pathways are necessary and sufficient to produce KS amnesia. PMID:22644546

  8. Theta synchronization networks emerge during human object-place memory encoding.

    PubMed

    Sato, Naoyuki; Yamaguchi, Yoko

    2007-03-26

    Recent rodent hippocampus studies have suggested that theta rhythm-dependent neural dynamics ('theta phase precession') are essential for on-line memory formation. A computational study indicated that phase precession could enable human object-place association memory with voluntary eye movements, although it remains an open question whether the human brain uses these dynamics. Here we elucidated memory-correlated activities in human scalp electroencephalography during an object-place association task designed according to the former computational study. Our results demonstrate that subsequent memory recall is characterized by an increase in theta power and coherence and, further, that multiple theta synchronization networks emerge. These findings suggest that humans share theta dynamics with rodents in episodic memory formation.

  9. The Reference Ability Neural Network Study: Life-time stability of reference-ability neural networks derived from task maps of young adults.

    PubMed

    Habeck, C; Gazes, Y; Razlighi, Q; Steffener, J; Brickman, A; Barulli, D; Salthouse, T; Stern, Y

    2016-01-15

    Analyses of large test batteries administered to individuals ranging from young to old have consistently yielded a set of latent variables representing reference abilities (RAs) that capture the majority of the variance in age-related cognitive change: Episodic Memory, Fluid Reasoning, Perceptual Processing Speed, and Vocabulary. In a previous paper (Stern et al., 2014), we introduced the Reference Ability Neural Network Study, which administers 12 cognitive neuroimaging tasks (3 for each RA) to healthy adults age 20-80 in order to derive unique neural networks underlying these 4 RAs and investigate how these networks may be affected by aging. We used a multivariate approach, linear indicator regression, to derive a unique covariance pattern or Reference Ability Neural Network (RANN) for each of the 4 RAs. The RANNs were derived from the neural task data of 64 younger adults of age 30 and below. We then prospectively applied the RANNs to fMRI data from the remaining sample of 227 adults of age 31 and above in order to classify each subject-task map into one of the 4 possible reference domains. Overall classification accuracy across subjects in the sample age 31 and above was 0.80±0.18. Classification accuracy by RA domain was also good, but variable; memory: 0.72±0.32; reasoning: 0.75±0.35; speed: 0.79±0.31; vocabulary: 0.94±0.16. Classification accuracy was not associated with cross-sectional age, suggesting that these networks, and their specificity to the respective reference domain, might remain intact throughout the age range. Higher mean brain volume was correlated with increased overall classification accuracy; better overall performance on the tasks in the scanner was also associated with classification accuracy. For the RANN network scores, we observed for each RANN that a higher score was associated with a higher corresponding classification accuracy for that reference ability. 
Despite the absence of behavioral performance information in the derivation of these networks, we also observed some brain-behavioral correlations, notably for the fluid-reasoning network whose network score correlated with performance on the memory and fluid-reasoning tasks. While age did not influence the expression of this RANN, the slope of the association between network score and fluid-reasoning performance was negatively associated with higher ages. These results provide support for the hypothesis that a set of specific, age-invariant neural networks underlies these four RAs, and that these networks maintain their cognitive specificity and level of intensity across age. Activation common to all 12 tasks was identified as another activation pattern resulting from a mean-contrast Partial-Least-Squares technique. This common pattern did show associations with age and some subject demographics for some of the reference domains, lending support to the overall conclusion that aspects of neural processing that are specific to any cognitive reference ability stay constant across age, while aspects that are common to all reference abilities differ across age.

  10. Neural-like growing networks

    NASA Astrophysics Data System (ADS)

    Yashchenko, Vitaliy A.

    2000-03-01

    Based on an analysis of scientific ideas reflecting regularities in the structure and functioning of the biological structures of the brain, together with the analysis and synthesis of knowledge developed across several directions in computer science, we developed the foundations of the theory of a new class of neural-like growing networks, which has no analogue in world practice. At the base of neural-like growing networks lies a synthesis of the knowledge developed by two classical theories: semantic networks and neural networks. The first makes it possible to represent meaning, as objects and the connections between them, in the construction of the network; each meaning is assigned a separate component of the network as a node connected to other nodes. This corresponds closely to the structure found in the brain, where each explicit concept is represented by a particular structure and has a designating symbol. Second, the network gains increased semantic clarity through the formation not only of connections between neural elements, but of the elements themselves; that is, the network is not simply constructed by placing semantic structures in an environment of neural elements, but by creating that very environment as an equivalent of a memory environment. Neural-like growing networks thus provide a convenient apparatus for modeling mechanisms of teleological thinking, as the fulfillment of certain psychophysiological functions.

  11. Integrated Circuit For Simulation Of Neural Network

    NASA Technical Reports Server (NTRS)

    Thakoor, Anilkumar P.; Moopenn, Alexander W.; Khanna, Satish K.

    1988-01-01

    Ballast resistors deposited on top of circuit structure. Cascadable, programmable binary connection matrix fabricated in VLSI form as basic building block for assembly of like units into content-addressable electronic memory matrices operating somewhat like networks of neurons. Connections formed during storage of data, and data recalled from memory by prompting matrix with approximate or partly erroneous signals. Redundancy in pattern of connections causes matrix to respond with correct stored data.
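
    The recall behaviour described above, prompting an outer-product connection matrix with a partly erroneous pattern, can be sketched in software as a small Hopfield-style network (the article implements this in VLSI hardware; the patterns here are illustrative):

```python
import numpy as np

# Two stored bipolar patterns; connections formed by outer products,
# mirroring the content-addressable matrix described in the article.
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1, -1, -1]])
W = patterns.T @ patterns
np.fill_diagonal(W, 0)                  # no self-connections

def recall(probe, steps=5):
    """Prompt the matrix with an approximate pattern; threshold repeatedly."""
    s = probe.copy()
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1) # synchronous threshold update
    return s

probe = patterns[0].copy()
probe[:2] = -probe[:2]                  # partly erroneous prompt: 2 bits flipped
out = recall(probe)                     # redundancy in W restores the pattern
```

    The redundancy of the outer-product connections is what lets the matrix respond with the correct stored data despite the corrupted prompt, just as the article describes for the hardware version.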

  12. Blanket Gate Would Address Blocks Of Memory

    NASA Technical Reports Server (NTRS)

    Lambe, John; Moopenn, Alexander; Thakoor, Anilkumar P.

    1988-01-01

    Circuit-chip area used more efficiently. Proposed gate structure selectively allows and restricts access to blocks of memory in electronic neural-type network. By breaking memory into independent blocks, gate greatly simplifies problem of reading from and writing to memory. Since blocks not used simultaneously, share operational amplifiers that prompt and read information stored in memory cells. Fewer operational amplifiers needed, and chip area occupied reduced correspondingly. Cost per bit drops as result.

  13. Distorted Character Recognition Via An Associative Neural Network

    NASA Astrophysics Data System (ADS)

    Messner, Richard A.; Szu, Harold H.

    1987-03-01

    The purpose of this paper is two-fold. First, it is intended to provide some preliminary results of a character recognition scheme which has foundations in ongoing neural network architecture modeling, and second, to apply some of the neural network results in a real application area where thirty years of effort has had little effect on providing the machine an ability to recognize distorted objects within the same object class. It is the author's belief that the time is ripe to start applying in earnest the results of over twenty years of effort in neural modeling to some of the more difficult problems which seem so hard to solve by conventional means. The proposed character recognition scheme utilizes a preprocessing stage which performs a 2-dimensional Walsh transform of an input cartesian image field, then sequency-filters this spectrum into three feature bands. Various features are then extracted and organized into three sets of feature vectors. These vector patterns are then stored and recalled associatively. Two possible associative neural memory models are proposed for further investigation. The first is an outer-product linear matrix associative memory with a threshold function controlling the strength of the output pattern (similar to Kohonen's crosscorrelation approach [1]). The second is based upon a modified version of Grossberg's neural architecture [2], which provides better self-organizing properties due to its adaptive nature. Preliminary results of the sequency filtering and feature extraction preprocessing stage, together with discussion of the proposed neural architectures, are included.

  14. Continuous Timescale Long-Short Term Memory Neural Network for Human Intent Understanding

    PubMed Central

    Yu, Zhibin; Moirangthem, Dennis S.; Lee, Minho

    2017-01-01

    Understanding of human intention by observing a series of human actions has been a challenging task. In order to do so, we need to analyze longer sequences of human actions related with intentions and extract the context from the dynamic features. The multiple timescales recurrent neural network (MTRNN) model, which is believed to be a kind of solution, is a useful tool for recording and regenerating a continuous signal for dynamic tasks. However, the conventional MTRNN suffers from the vanishing gradient problem which renders it impossible to be used for longer sequence understanding. To address this problem, we propose a new model named Continuous Timescale Long-Short Term Memory (CTLSTM) in which we incorporate the multiple timescales concept into the Long-Short Term Memory (LSTM) recurrent neural network (RNN), which addresses the vanishing gradient problem. We design an additional recurrent connection in the LSTM cell outputs to produce a time-delay in order to capture the slow context. Our experiments show that the proposed model exhibits better context modeling ability and captures the dynamic features on multiple large dataset classification tasks. The results illustrate that the multiple timescales concept enhances the ability of our model to handle longer sequences related to human intentions, making it more suitable for complex tasks such as intention recognition. PMID:28878646
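
    The multiple-timescales idea underlying the CTLSTM can be illustrated without the LSTM machinery: a leaky (continuous-time) unit with time constant tau integrates its input, so slow units retain context long after fast units have forgotten it. A minimal sketch with illustrative time constants (not the paper's cell):

```python
# A leaky unit: h moves toward its input u at a rate set by tau.
def leaky_trace(inputs, tau):
    h, trace = 0.0, []
    for u in inputs:
        h += (u - h) / tau      # discretised continuous-time update
        trace.append(h)
    return trace

pulse = [1.0] * 5 + [0.0] * 20          # brief input, then silence
fast = leaky_trace(pulse, tau=2.0)      # fast unit: responds and forgets
slow = leaky_trace(pulse, tau=15.0)     # slow unit: weak response, long memory
```

    By the end of the silent period the fast unit has decayed to essentially zero while the slow unit still carries a trace of the pulse; stacking both timescales is what lets such models keep slow context alongside fast dynamics.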

  15. Unique Applications for Artificial Neural Networks. Phase 1

    DTIC Science & Technology

    1991-08-08

    significance. For the VRP, a problem that has received considerable attention in the literature, the new NGO-VRP methodology generates better solutions...represent the stop assignments of each route. The effect of the genetic recombinations is to make simple local exchanges to the relative positions of the...technique for representing a computer-based associative memory [Arbib, 1987]. In our routing system, the basic job of the neural network system is to accept

  16. Passivity of memristive BAM neural networks with leakage and additive time-varying delays

    NASA Astrophysics Data System (ADS)

    Wang, Weiping; Wang, Meiqi; Luo, Xiong; Li, Lixiang; Zhao, Wenbing; Liu, Linlin; Ping, Yuan

    2018-02-01

    This paper investigates the passivity of memristive bidirectional associative memory neural networks (MBAMNNs) with leakage and additive time-varying delays. Based on some useful inequalities and appropriate Lyapunov-Krasovskii functionals (LKFs), several delay-dependent conditions for passivity performance are obtained in linear matrix inequalities (LMIs). Moreover, the leakage delays as well as additive delays are considered separately. Finally, numerical simulations are provided to demonstrate the feasibility of the theoretical results.

  17. Analog Delta-Back-Propagation Neural-Network Circuitry

    NASA Technical Reports Server (NTRS)

    Eberhart, Silvio

    1990-01-01

    Changes in synapse weights due to circuit drifts suppressed. Proposed fully parallel analog version of electronic neural-network processor based on delta-back-propagation algorithm. Processor able to "learn" when provided with suitable combinations of inputs and enforced outputs. Includes programmable resistive memory elements (corresponding to synapses), conductances (synapse weights) adjusted during learning. Buffer amplifiers, summing circuits, and sample-and-hold circuits arranged in layers of electronic neurons in accordance with delta-back-propagation algorithm.

  18. Frequency-specific network connectivity increases underlie accurate spatiotemporal memory retrieval

    PubMed Central

    Watrous, Andrew J.; Tandon, Nitin; Connor, Chris; Pieters, Thomas; Ekstrom, Arne D.

    2013-01-01

    The medial temporal lobes, prefrontal cortex, and parts of parietal cortex form the neural underpinnings of episodic memory, which includes remembering both where and when an event occurred. Yet how these three key regions interact during retrieval of spatial and temporal context remains largely untested. Here, we employed simultaneous electrocorticographical recordings across multiple lobular regions, employing phase synchronization as a measure of network functional connectivity, while patients retrieved spatial and temporal context associated with an episode. Successful memory retrieval was characterized by greater global connectivity compared to incorrect retrieval, with the MTL acting as a convergence hub for these interactions. Spatial vs. temporal context retrieval resulted in prominent differences in both the spectral and temporal patterns of network interactions. These results emphasize dynamic network interactions as central to episodic memory retrieval, providing novel insight into how multiple contexts underlying a single event can be recreated within the same network. PMID:23354333
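
    Phase synchronization of the kind measured in this study is commonly quantified by the phase-locking value (PLV): instantaneous phases are extracted via the analytic signal, and the PLV is the magnitude of the mean phase-difference vector. A minimal sketch with synthetic signals (the study's electrocorticography pipeline is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1024
t = np.arange(n) / n
a = np.sin(2 * np.pi * 8 * t)            # 8 Hz "theta"-band signal
b = np.sin(2 * np.pi * 8 * t + 0.7)      # same rhythm, fixed phase lag
c = rng.normal(size=n)                   # unrelated noise

def analytic(x):
    """Analytic signal via the frequency-domain Hilbert construction."""
    X = np.fft.fft(x)
    h = np.zeros(len(x))
    h[0] = h[len(x) // 2] = 1.0          # keep DC and Nyquist
    h[1:len(x) // 2] = 2.0               # double the positive frequencies
    return np.fft.ifft(X * h)

def plv(x, y):
    # magnitude of the mean unit vector of the phase difference
    dphi = np.angle(analytic(x)) - np.angle(analytic(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

locked = plv(a, b)        # near 1: phases keep a fixed relation
unrelated = plv(a, c)     # much lower: phases wander independently
```

    A PLV near 1 indicates a stable phase relation between two recording sites, which is the sense in which "greater global connectivity" is assessed in the abstract.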

  19. A model for integrating elementary neural functions into delayed-response behavior.

    PubMed

    Gisiger, Thomas; Kerszberg, Michel

    2006-04-01

    It is well established that various cortical regions can implement a wide array of neural processes, yet the mechanisms which integrate these processes into behavior-producing, brain-scale activity remain elusive. We propose that an important role in this respect might be played by executive structures controlling the traffic of information between the cortical regions involved. To illustrate this hypothesis, we present a neural network model comprising a set of interconnected structures harboring stimulus-related activity (visual representation, working memory, and planning), and a group of executive units with task-related activity patterns that manage the information flowing between them. The resulting dynamics allows the network to perform the dual task of either retaining an image during a delay (delayed-matching to sample task), or recalling from this image another one that has been associated with it during training (delayed-pair association task). The model reproduces behavioral and electrophysiological data gathered on the inferior temporal and prefrontal cortices of primates performing these same tasks. It also makes predictions on how neural activity coding for the recall of the image associated with the sample emerges and becomes prospective during the training phase. The network dynamics proves to be very stable against perturbations, and it exhibits signs of scale-invariant organization and cooperativity. The present network represents a possible neural implementation for active, top-down, prospective memory retrieval in primates. The model suggests that brain activity leading to performance of cognitive tasks might be organized in modular fashion, simple neural functions becoming integrated into more complex behavior by executive structures harbored in prefrontal cortex and/or basal ganglia.

  20. Neural Mechanisms of Episodic Retrieval Support Divergent Creative Thinking.

    PubMed

    Madore, Kevin P; Thakral, Preston P; Beaty, Roger E; Addis, Donna Rose; Schacter, Daniel L

    2017-11-17

    Prior research has indicated that brain regions and networks that support semantic memory, top-down and bottom-up attention, and cognitive control are all involved in divergent creative thinking. Kernels of evidence suggest that neural processes supporting episodic memory-the retrieval of particular elements of prior experiences-may also be involved in divergent thinking, but such processes have typically been characterized as not very relevant for, or even a hindrance to, creative output. In the present study, we combine functional magnetic resonance imaging with an experimental manipulation to test formally, for the first time, episodic memory's involvement in divergent thinking. Following a manipulation that facilitates detailed episodic retrieval, we observed greater neural activity in the hippocampus and stronger connectivity between a core brain network linked to episodic processing and a frontoparietal brain network linked to cognitive control during divergent thinking relative to an object association control task that requires little divergent thinking. Stronger coupling following the retrieval manipulation extended to a subsequent resting-state scan. Neural effects of the episodic manipulation were consistent with behavioral effects of enhanced idea production on divergent thinking but not object association. The results indicate that conceptual frameworks should accommodate the idea that episodic retrieval can function as a component process of creative idea generation, and highlight how the brain flexibly utilizes the retrieval of episodic details for tasks beyond simple remembering. © The Author 2017. Published by Oxford University Press. All rights reserved.

  1. An event map of memory space in the hippocampus

    PubMed Central

    Deuker, Lorena; Bellmund, Jacob LS; Navarro Schröder, Tobias; Doeller, Christian F

    2016-01-01

    The hippocampus has long been implicated in both episodic and spatial memory; however, these mnemonic functions have traditionally been investigated in separate research strands. Theoretical accounts and rodent data suggest a common mechanism for spatial and episodic memory in the hippocampus by providing an abstract and flexible representation of the external world. Here, we monitor the de novo formation of such a representation of space and time in humans using fMRI. After learning spatio-temporal trajectories in a large-scale virtual city, subject-specific neural similarity in the hippocampus scaled with the remembered proximity of events in space and time. Crucially, the structure of the entire spatio-temporal network was reflected in neural patterns. Our results provide evidence for a common coding mechanism underlying spatial and temporal aspects of episodic memory in the hippocampus and shed new light on its role in interleaving multiple episodes in a neural event map of memory space. DOI: http://dx.doi.org/10.7554/eLife.16534.001 PMID:27710766

  2. Shared memories reveal shared structure in neural activity across individuals

    PubMed Central

    Chen, J.; Leong, Y.C.; Honey, C.J.; Yong, C.H.; Norman, K.A.; Hasson, U.

    2016-01-01

    Our lives revolve around sharing experiences and memories with others. When different people recount the same events, how similar are their underlying neural representations? Participants viewed a fifty-minute movie, then verbally described the events during functional MRI, producing unguided detailed descriptions lasting up to forty minutes. As each person spoke, event-specific spatial patterns were reinstated in default-network, medial-temporal, and high-level visual areas. Individual event patterns were both highly discriminable from one another and similar between people, suggesting consistent spatial organization. In many high-order areas, patterns were more similar between people recalling the same event than between recall and perception, indicating systematic reshaping of percept into memory. These results reveal the existence of a common spatial organization for memories in high-level cortical areas, where encoded information is largely abstracted beyond sensory constraints; and that neural patterns during perception are altered systematically across people into shared memory representations for real-life events. PMID:27918531

  3. A Scalable Multicore Architecture With Heterogeneous Memory Structures for Dynamic Neuromorphic Asynchronous Processors (DYNAPs).

    PubMed

    Moradi, Saber; Qiao, Ning; Stefanini, Fabio; Indiveri, Giacomo

    2018-02-01

    Neuromorphic computing systems comprise networks of neurons that use asynchronous events for both computation and communication. This type of representation offers several advantages in terms of bandwidth and power consumption in neuromorphic electronic systems. However, managing the traffic of asynchronous events in large scale systems is a daunting task, both in terms of circuit complexity and memory requirements. Here, we present a novel routing methodology that employs both hierarchical and mesh routing strategies and combines heterogeneous memory structures for minimizing both memory requirements and latency, while maximizing programming flexibility to support a wide range of event-based neural network architectures, through parameter configuration. We validated the proposed scheme in a prototype multicore neuromorphic processor chip that employs hybrid analog/digital circuits for emulating synapse and neuron dynamics together with asynchronous digital circuits for managing the address-event traffic. We present a theoretical analysis of the proposed connectivity scheme, describe the methods and circuits used to implement such scheme, and characterize the prototype chip. Finally, we demonstrate the use of the neuromorphic processor with a convolutional neural network for the real-time classification of visual symbols being flashed to a dynamic vision sensor (DVS) at high speed.

  4. Modulation of steady state functional connectivity in the default mode and working memory networks by cognitive load.

    PubMed

    Newton, Allen T; Morgan, Victoria L; Rogers, Baxter P; Gore, John C

    2011-10-01

    Interregional correlations between blood oxygen level dependent (BOLD) magnetic resonance imaging (fMRI) signals in the resting state have been interpreted as measures of connectivity across the brain. Here we investigate whether such connectivity in the working memory and default mode networks is modulated by changes in cognitive load. Functional connectivity was measured in a steady-state verbal identity N-back task for three different conditions (N = 1, 2, and 3) as well as in the resting state. We found that as cognitive load increases, the functional connectivity within both the working memory and the default mode networks increases. To test whether functional connectivity between the working memory and the default mode networks changed, we constructed maps of functional connectivity to the working memory network as a whole and found that increasingly negative correlations emerged in a dorsal region of the posterior cingulate cortex. These results provide further evidence that low frequency fluctuations in BOLD signals reflect variations in neural activity and suggest interaction between the default mode network and other cognitive networks. Copyright © 2010 Wiley-Liss, Inc.
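
    Functional connectivity between two regions in analyses like this one is typically the Pearson correlation of their BOLD time series; a toy sketch, with synthetic sinusoids standing in for real data:

```python
import math

# Pearson correlation between two time series: the standard pairwise
# functional-connectivity estimate. Signals below are synthetic.
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

t = [i * 0.1 for i in range(200)]
roi_a = [math.sin(v) for v in t]                          # "region A" signal
roi_b = [math.sin(v) + 0.1 * math.cos(3 * v) for v in t]  # correlated with A
roi_c = [math.cos(v) for v in t]                          # roughly uncorrelated
```

    Load-dependent connectivity changes of the kind reported here amount to computing such correlations per condition and comparing them across N-back levels.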

  5. Neural Differentiation of Incorrectly Predicted Memories.

    PubMed

    Kim, Ghootae; Norman, Kenneth A; Turk-Browne, Nicholas B

    2017-02-22

    When an item is predicted in a particular context but the prediction is violated, memory for that item is weakened (Kim et al., 2014). Here, we explore what happens when such previously mispredicted items are later reencountered. According to prior neural network simulations, this sequence of events-misprediction and subsequent restudy-should lead to differentiation of the item's neural representation from the previous context (on which the misprediction was based). Specifically, misprediction weakens connections in the representation to features shared with the previous context and restudy allows new features to be incorporated into the representation that are not shared with the previous context. This cycle of misprediction and restudy should have the net effect of moving the item's neural representation away from the neural representation of the previous context. We tested this hypothesis using human fMRI by tracking changes in item-specific BOLD activity patterns in the hippocampus, a key structure for representing memories and generating predictions. In left CA2/3/DG, we found greater neural differentiation for items that were repeatedly mispredicted and restudied compared with items from a control condition that was identical except without misprediction. We also measured prediction strength in a trial-by-trial fashion and found that greater misprediction for an item led to more differentiation, further supporting our hypothesis. Therefore, the consequences of prediction error go beyond memory weakening. If the mispredicted item is restudied, the brain adaptively differentiates its memory representation to improve the accuracy of subsequent predictions and to shield it from further weakening. SIGNIFICANCE STATEMENT Competition between overlapping memories leads to weakening of nontarget memories over time, making it easier to access target memories. However, a nontarget memory in one context might become a target memory in another context. 
How do such memories get restrengthened without increasing competition again? Computational models suggest that the brain handles this by reducing neural connections to the previous context and adding connections to new features that were not part of the previous context. The result is neural differentiation away from the previous context. Here, we provide support for this theory, using fMRI to track neural representations of individual memories in the hippocampus and how they change based on learning. Copyright © 2017 the authors.

  6. Performance analysis and comparison of a minimum interconnections direct storage model with traditional neural bidirectional memories.

    PubMed

    Bhatti, A Aziz

    2009-12-01

    This study proposes an efficient and improved model of a direct-storage bidirectional memory, the improved bidirectional associative memory (IBAM), and emphasises the use of nanotechnology for implementing such large-scale neural network structures at considerably lower cost, reduced complexity, and smaller implementation area. This memory model directly stores the X and Y associated sets of M bipolar binary vectors in the form of (M×N_x) and (M×N_y) memory matrices, requires O(N) or about 30% of interconnections with weight strengths ranging between ±1, and is computationally very efficient compared to sequential, intraconnected, and other bidirectional associative memory (BAM) models of outer-product type, which require O(N^2) complex interconnections with weight strengths ranging between ±M. It is shown that the model is functionally equivalent to and possesses all attributes of a BAM of outer-product type, yet it is a simple and robust structure: a very large scale integration (VLSI), optical, and nanotechnology realisable, modular and expandable neural network bidirectional associative memory model in which the addition or deletion of a pair of vectors does not require changes in the strength of interconnections of the entire memory matrix. The retrieval process, signal-to-noise ratio, storage capacity, and stability of the proposed model, as well as of the traditional BAM, are analysed. Constraints on and characteristics of unipolar and bipolar binaries for improved storage and retrieval are discussed. The simulation results show that the proposed model has log_e(N) times higher storage capacity, superior performance, and faster convergence and retrieval when compared to traditional sequential and intraconnected bidirectional memories.
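
    The traditional outer-product BAM that the IBAM is compared against stores each pair by accumulating the outer product of its bipolar vectors and recalls by thresholding; a toy sketch (illustrative patterns, not the paper's IBAM direct-storage scheme):

```python
# Outer-product bidirectional associative memory (BAM), toy sizes.
# W[i][j] accumulates x[i]*y[j] over all stored pairs, so weights
# range over +/-M for M pairs, as the record notes.
def sign(v):
    return 1 if v >= 0 else -1

def bam_train(pairs):
    nx, ny = len(pairs[0][0]), len(pairs[0][1])
    W = [[0] * ny for _ in range(nx)]
    for x, y in pairs:
        for i in range(nx):
            for j in range(ny):
                W[i][j] += x[i] * y[j]      # outer-product storage
    return W

def bam_recall(W, x):
    # forward pass X -> Y: threshold the weighted sums
    return [sign(sum(W[i][j] * x[i] for i in range(len(x))))
            for j in range(len(W[0]))]

pairs = [([1, -1, 1, -1], [1, 1, -1]),
         ([-1, -1, 1, 1], [-1, 1, 1])]
W = bam_train(pairs)
```

    With near-orthogonal stored patterns, presenting either X vector recalls its associated Y exactly; the IBAM's claimed advantage is achieving equivalent behavior with O(N) rather than O(N^2) interconnections.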

  7. Optoelectronic Terminal-Attractor-Based Associative Memory

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang; Barhen, Jacob; Farhat, Nabil H.

    1994-01-01

    Report presents theoretical and experimental study of optically and electronically addressable optical implementation of artificial neural network that performs associative recall. Shows by computer simulation that terminal-attractor-based associative memory can have perfect convergence in associative retrieval and increased storage capacity. Spurious states reduced by exploiting terminal attractors.

  8. Entorhinal-Hippocampal Neuronal Circuits Bridge Temporally Discontiguous Events

    ERIC Educational Resources Information Center

    Kitamura, Takashi; Macdonald, Christopher J.; Tonegawa, Susumu

    2015-01-01

    The entorhinal cortex (EC)-hippocampal (HPC) network plays an essential role for episodic memory, which preserves spatial and temporal information about the occurrence of past events. Although there has been significant progress toward understanding the neural circuits underlying the spatial dimension of episodic memory, the relevant circuits…

  9. Optically simulating a quantum associative memory

    NASA Astrophysics Data System (ADS)

    Howell, John C.; Yeazell, John A.; Ventura, Dan

    2000-10-01

    This paper discusses the realization of a quantum associative memory using linear integrated optics. An associative memory produces a full pattern of bits when presented with only a partial pattern. Quantum computers have the potential to store large numbers of patterns and hence to far surpass any classical neural-network realization of an associative memory. In this work, two three-qubit associative memories realized with linear integrated optics are discussed. In addition, corrupted, invented, and degenerate memories are discussed.

  10. Routes to the past: neural substrates of direct and generative autobiographical memory retrieval.

    PubMed

    Addis, Donna Rose; Knapp, Katie; Roberts, Reece P; Schacter, Daniel L

    2012-02-01

    Models of autobiographical memory propose two routes to retrieval depending on cue specificity. When available cues are specific and personally-relevant, a memory can be directly accessed. However, when available cues are generic, one must engage a generative retrieval process to produce more specific cues to successfully access a relevant memory. The current study sought to characterize the neural bases of these retrieval processes. During functional magnetic resonance imaging (fMRI), participants were shown personally-relevant cues to elicit direct retrieval, or generic cues (nouns) to elicit generative retrieval. We used spatiotemporal partial least squares to characterize the spatial and temporal characteristics of the networks associated with direct and generative retrieval. Both retrieval tasks engaged regions comprising the autobiographical retrieval network, including hippocampus, and medial prefrontal and parietal cortices. However, some key neural differences emerged. Generative retrieval differentially recruited lateral prefrontal and temporal regions early on during the retrieval process, likely supporting the strategic search operations and initial recovery of generic autobiographical information. However, many regions were activated more strongly during direct versus generative retrieval, even when we time-locked the analysis to the successful recovery of events in both conditions. This result suggests that there may be fundamental differences between memories that are accessed directly and those that are recovered via the iterative search and retrieval process that characterizes generative retrieval. Copyright © 2011 Elsevier Inc. All rights reserved.

  11. Routes to the past: Neural substrates of direct and generative autobiographical memory retrieval

    PubMed Central

    Addis, Donna Rose; Knapp, Katie; Roberts, Reece P.; Schacter, Daniel L.

    2011-01-01

    Models of autobiographical memory propose two routes to retrieval depending on cue specificity. When available cues are specific and personally-relevant, a memory can be directly accessed. However, when available cues are generic, one must engage a generative retrieval process to produce more specific cues to successfully access a relevant memory. The current study sought to characterize the neural bases of these retrieval processes. During functional magnetic resonance imaging (fMRI), participants were shown personally-relevant cues to elicit direct retrieval, or generic cues (nouns) to elicit generative retrieval. We used spatiotemporal partial least squares to characterize the spatial and temporal characteristics of the networks associated with direct and generative retrieval. Both retrieval tasks engaged regions comprising the autobiographical retrieval network, including hippocampus, and medial prefrontal and parietal cortices. However, some key neural differences emerged. Generative retrieval differentially recruited lateral prefrontal and temporal regions early on during the retrieval process, likely supporting the strategic search operations and initial recovery of generic autobiographical information. However, many regions were activated more strongly during direct versus generative retrieval, even when we time-locked the analysis to the successful recovery of events in both conditions. This result suggests that there may be fundamental differences between memories that are accessed directly and those that are recovered via the iterative search and retrieval process that characterizes generative retrieval. PMID:22001264

  12. Computational modeling of spiking neural network with learning rules from STDP and intrinsic plasticity

    NASA Astrophysics Data System (ADS)

    Li, Xiumin; Wang, Wei; Xue, Fangzheng; Song, Yongduan

    2018-02-01

    Recently, there has been continuously increasing interest in building computational models of spiking neural networks (SNN), such as the Liquid State Machine (LSM). Biologically inspired self-organized neural networks with neural plasticity can enhance computational performance, with the characteristic features of dynamical memory and recurrent connection cycles that distinguish them from the more widely used feedforward neural networks. Although a variety of computational models for brain-like learning and information processing have been proposed, modeling self-organized neural networks with multiple forms of neural plasticity remains an important open challenge. The main difficulties lie in the interplay among different neural plasticity rules and in understanding how the structure and dynamics of neural networks shape computational performance. In this paper, we propose a novel approach to developing LSM models with a biologically inspired self-organizing network based on two neural plasticity learning rules. The connectivity among excitatory neurons is adapted by spike-timing-dependent plasticity (STDP) learning; meanwhile, the degree of neuronal excitability is regulated to maintain a moderate average activity level by another learning rule: intrinsic plasticity (IP). Our study shows that LSM with STDP+IP performs better than LSM with a random SNN or an SNN obtained by STDP alone. The noticeable improvement with the proposed method is due to better-reflected competition among neurons in the developed SNN model, as well as more effectively encoded and processed dynamic information through its learning and self-organizing mechanism. This result gives insight into the optimization of computational models of spiking neural networks with neural plasticity.
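
    The two plasticity rules combined in this model can be sketched with standard textbook forms: an exponential pair-based STDP window, and an intrinsic-plasticity step that nudges a neuron's threshold toward a target firing rate. All constants here are illustrative, not those of the paper.

```python
import math

# Pair-based STDP window: potentiate when the presynaptic spike
# precedes the postsynaptic one (dt = t_post - t_pre > 0), depress
# otherwise. a_plus, a_minus, tau are illustrative constants.
def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    if dt > 0:
        return a_plus * math.exp(-dt / tau)    # pre before post: LTP
    return -a_minus * math.exp(dt / tau)       # post before pre: LTD

# Intrinsic plasticity: raise the firing threshold when the observed
# rate exceeds the target, lower it when below, keeping activity moderate.
def ip_update(threshold, rate, target=0.1, eta=0.001):
    return threshold + eta * (rate - target)

# accumulate the net weight change over all pre/post spike pairings
pre, post = [10.0, 50.0], [15.0, 45.0]
dw = sum(stdp_dw(t_post - t_pre) for t_pre in pre for t_post in post)
```

    In an LSM built this way, STDP shapes the recurrent excitatory connectivity while IP keeps each neuron's excitability in a useful operating range, which is the interaction the record credits for the performance gain.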

  13. Statistical Mechanics Model of the Speed - Accuracy Tradeoff in Spatial and Lexical Memory

    NASA Astrophysics Data System (ADS)

    Kaufman, Miron; Allen, Philip

    2000-03-01

    The molar neural network model of P. Allen, M. Kaufman, A. F. Smith, R. E. Popper, Psychology and Aging 13, 501 (1998) and Experimental Aging Research, 24, 307 (1998) is extended to incorporate reaction times. In our model, the entropy associated with a particular task determines the reaction time. We use this molar neural model to directly analyze experimental data on episodic (spatial) memory and semantic (lexical) memory tasks. In particular, we are interested in the effect of aging on the two types of memory. We find that there is no difference in performance levels for lexical memory tasks between younger and older adults. In the case of spatial memory tasks, we find that aging has a detrimental effect on the performance level. This work is supported by NIH/NIA grant AG09282-06.

  14. Semantic and episodic memory of music are subserved by distinct neural networks.

    PubMed

    Platel, Hervé; Baron, Jean-Claude; Desgranges, Béatrice; Bernard, Frédéric; Eustache, Francis

    2003-09-01

    Numerous functional imaging studies have shown that retrieval from semantic and episodic memory is subserved by distinct neural networks. However, these results were essentially obtained with verbal and visuospatial material. The aim of this work was to determine the neural substrates underlying the semantic and episodic components of music using familiar and nonfamiliar melodic tunes. To study musical semantic memory, we designed a task in which the instruction was to judge whether or not the musical extract was felt as "familiar." To study musical episodic memory, we constructed two delayed recognition tasks, one containing only familiar and the other only nonfamiliar items. For each recognition task, half of the extracts (targets) were presented in the prior semantic task. The episodic and semantic tasks were to be contrasted by a comparison to two perceptive control tasks and to one another. Cerebral blood flow was assessed by means of the oxygen-15-labeled water injection method, using high-resolution PET. Distinct patterns of activations were found. First, regarding the episodic memory condition, bilateral activations of the middle and superior frontal gyri and precuneus (more prominent on the right side) were observed. Second, the semantic memory condition disclosed extensive activations in the medial and orbital frontal cortex bilaterally, the left angular gyrus, and predominantly the left anterior part of the middle temporal gyri. The findings from this study are discussed in light of the available neuropsychological data obtained in brain-damaged subjects and functional neuroimaging studies.

  15. Online particle detection with Neural Networks based on topological calorimetry information

    NASA Astrophysics Data System (ADS)

    Ciodaro, T.; Deva, D.; de Seixas, J. M.; Damazio, D.

    2012-06-01

    This paper presents the latest results from the Ringer algorithm, which is based on artificial neural networks for electron identification in the online filtering system of the ATLAS particle detector, in the context of the LHC experiment at CERN. The algorithm performs topological feature extraction using the ATLAS calorimetry information (energy measurements). The extracted information is presented to a neural network classifier. Studies showed that the Ringer algorithm achieves high detection efficiency while keeping the false alarm rate low. Optimizations, guided by detailed analysis, reduced the algorithm execution time by 59%. Also, the total memory necessary to store the Ringer algorithm information represents less than 6.2% of the memory available to the filtering system.

  16. Robust stability for uncertain stochastic fuzzy BAM neural networks with time-varying delays

    NASA Astrophysics Data System (ADS)

    Syed Ali, M.; Balasubramaniam, P.

    2008-07-01

    In this Letter, by utilizing the Lyapunov functional combined with the linear matrix inequality (LMI) approach, we analyze the global asymptotic stability of uncertain stochastic fuzzy Bidirectional Associative Memory (BAM) neural networks with time-varying delays, represented by Takagi-Sugeno (TS) fuzzy models. A new class of uncertain stochastic fuzzy BAM neural networks with time-varying delays is studied, and sufficient conditions are derived that yield a less conservative result in stochastic settings. The developed results are more general than those reported in the earlier literature. In addition, numerical examples are provided to illustrate the applicability of the result using the LMI toolbox in MATLAB.

  17. RM-SORN: a reward-modulated self-organizing recurrent neural network.

    PubMed

    Aswolinskiy, Witali; Pipa, Gordon

    2015-01-01

    Neural plasticity plays an important role in learning and memory. Reward-modulation of plasticity offers an explanation for the ability of the brain to adapt its neural activity to achieve a rewarded goal. Here, we define a neural network model that learns through the interaction of Intrinsic Plasticity (IP) and reward-modulated Spike-Timing-Dependent Plasticity (STDP). IP enables the network to explore possible output sequences, and STDP, modulated by reward, reinforces the creation of the rewarded output sequences. The model is tested on tasks for prediction, recall, non-linear computation, pattern recognition, and sequence generation. It achieves performance comparable to networks trained with supervised learning, while using simple, biologically motivated plasticity rules and rewarding strategies. The results confirm the importance of investigating the interaction of several plasticity rules in the context of reward-modulated learning, and of asking whether reward-modulated self-organization can explain the remarkable capabilities of the brain.

  18. Perceptual Filtering in L2 Lexical Memory: A Neural Network Approach to Second Language Acquisition

    ERIC Educational Resources Information Center

    Nelson, Robert

    2012-01-01

    A number of asymmetries in lexical memory emerge when monolinguals and early bilinguals are compared to (relatively) late second language (L2) learners. Their study promises to provide insight into the internal processes that both support and ultimately limit L2 learner achievement. Generally, theory building in L2 and bilingual lexical memory has…

  19. Neural mechanisms of interference control in working memory capacity.

    PubMed

    Bomyea, Jessica; Taylor, Charles T; Spadoni, Andrea D; Simmons, Alan N

    2018-02-01

    The extent to which one can use cognitive resources to keep information in working memory is known to rely on (1) active maintenance of target representations and (2) downregulation of interference from irrelevant representations. Neurobiologically, the global capacity of working memory is thought to depend on the prefrontal and parietal cortices; however, the neural mechanisms involved in controlling interference specifically in working memory capacity tasks remain understudied. In this study, 22 healthy participants completed a modified complex working memory capacity task (Reading Span) with trials of varying levels of interference control demands while undergoing functional MRI. Neural activity associated with interference control demands was examined separately during encoding and recall phases of the task. Results suggested a widespread network of regions in the prefrontal, parietal, and occipital cortices, and the cingulate and cerebellum associated with encoding, and parietal and occipital regions associated with recall. Results align with prior findings emphasizing the importance of frontoparietal circuits for working memory performance, including the role of the inferior frontal gyrus, cingulate, occipital cortex, and cerebellum in regulation of interference demands. © 2017 Wiley Periodicals, Inc.

  20. Content-based retrieval using MPEG-7 visual descriptor and hippocampal neural network

    NASA Astrophysics Data System (ADS)

    Kim, Young Ho; Joung, Lyang-Jae; Kang, Dae-Seong

    2005-12-01

    With the development of digital technology, multimedia data of many kinds are in wide and varied use, and user demand for effective access to them is increasing. Transferring quickly and precisely the information a user wants requires an effective retrieval method. The MPEG-1, MPEG-2, and MPEG-4 technologies, which are aimed at compression, storage, and transmission, cannot be applied to this problem for existing multimedia data, so MPEG-7 was introduced as a new technology for the effective management and retrieval of multimedia data. In this paper, we extract content-based features using a color descriptor from among the MPEG-7 standard visual descriptors and reduce the feature data by applying the PCA (Principal Components Analysis) technique. We model the cerebral cortex and hippocampal neural networks on principles of the human brain: the network labels the features of the input image data into reaction patterns, following the organization of hippocampal neurons, through response tuning in the dentate gyrus region, and removes noise through an auto-associative memory step in the CA3 region. The CA1 region, receiving the information from CA3, forms long-term or short-term memories learned by its neurons. The hippocampal neural network separates and combines neurons dynamically, expands a neuron by attaching additional information through synapses, and adds new features according to the situation, on the user's demand. At query time, feature values stored in long-term memory are compared first, and the network learns feature vectors quickly and constructs optimized features, so indexing and retrieval are fast. Moreover, because MPEG-7 standard visual descriptors are used as the content-based feature values, retrieval efficiency is improved.
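
    The PCA step used here to reduce the MPEG-7 color-descriptor features can be sketched via power iteration on the covariance matrix; this is a pure-Python toy on made-up 2-D points, not the paper's implementation.

```python
# Find the first principal component of a small data set by power
# iteration on its covariance matrix. Data and sizes are illustrative.
def first_principal_component(data, iters=200):
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in data]
    # covariance matrix C = X^T X / n
    C = [[sum(X[i][a] * X[i][b] for i in range(n)) / n
          for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):                 # power iteration toward the
        w = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]          # top eigenvector of C
    return v

# points spread mostly along the first axis, so PC1 should align with it
pts = [[4.0, 0.5], [-4.0, -0.4], [2.0, 0.1], [-2.0, -0.2], [0.0, 0.0]]
pc1 = first_principal_component(pts)
```

    Projecting each descriptor vector onto the leading components found this way yields the reduced feature data that the hippocampal network then indexes.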

  1. Behavior control in the sensorimotor loop with short-term synaptic dynamics induced by self-regulating neurons.

    PubMed

    Toutounji, Hazem; Pasemann, Frank

    2014-01-01

    The behavior and skills of living systems depend on the distributed control provided by specialized and highly recurrent neural networks. Learning and memory in these systems is mediated by a set of adaptation mechanisms, known collectively as neuronal plasticity. Translating principles of recurrent neural control and plasticity to artificial agents has seen major strides, but is usually hampered by the complex interactions between the agent's body and its environment. One of the important standing issues is for the agent to support multiple stable states of behavior, so that its behavioral repertoire matches the requirements imposed by these interactions. The agent also must have the capacity to switch between these states in time scales that are comparable to those by which sensory stimulation varies. Achieving this requires a mechanism of short-term memory that allows the neurocontroller to keep track of the recent history of its input, which finds its biological counterpart in short-term synaptic plasticity. This issue is approached here by deriving synaptic dynamics in recurrent neural networks. Neurons are introduced as self-regulating units with a rich repertoire of dynamics. They exhibit homeostatic properties for certain parameter domains, which result in a set of stable states and the required short-term memory. They can also operate as oscillators, which allow them to surpass the level of activity imposed by their homeostatic operation conditions. Neural systems endowed with the derived synaptic dynamics can be utilized for the neural behavior control of autonomous mobile agents. The resulting behavior depends also on the underlying network structure, which is either engineered or developed by evolutionary techniques. The effectiveness of these self-regulating units is demonstrated by controlling locomotion of a hexapod with 18 degrees of freedom, and obstacle-avoidance of a wheel-driven robot.

  2. Behavior control in the sensorimotor loop with short-term synaptic dynamics induced by self-regulating neurons

    PubMed Central

    Toutounji, Hazem; Pasemann, Frank

    2014-01-01

    The behavior and skills of living systems depend on the distributed control provided by specialized and highly recurrent neural networks. Learning and memory in these systems is mediated by a set of adaptation mechanisms, known collectively as neuronal plasticity. Translating principles of recurrent neural control and plasticity to artificial agents has seen major strides, but is usually hampered by the complex interactions between the agent's body and its environment. One of the important standing issues is for the agent to support multiple stable states of behavior, so that its behavioral repertoire matches the requirements imposed by these interactions. The agent also must have the capacity to switch between these states in time scales that are comparable to those by which sensory stimulation varies. Achieving this requires a mechanism of short-term memory that allows the neurocontroller to keep track of the recent history of its input, which finds its biological counterpart in short-term synaptic plasticity. This issue is approached here by deriving synaptic dynamics in recurrent neural networks. Neurons are introduced as self-regulating units with a rich repertoire of dynamics. They exhibit homeostatic properties for certain parameter domains, which result in a set of stable states and the required short-term memory. They can also operate as oscillators, which allow them to surpass the level of activity imposed by their homeostatic operation conditions. Neural systems endowed with the derived synaptic dynamics can be utilized for the neural behavior control of autonomous mobile agents. The resulting behavior depends also on the underlying network structure, which is either engineered or developed by evolutionary techniques. The effectiveness of these self-regulating units is demonstrated by controlling locomotion of a hexapod with 18 degrees of freedom, and obstacle-avoidance of a wheel-driven robot. PMID:24904403

  3. Neural correlates of recognition memory of social information in people with schizophrenia

    PubMed Central

    Harvey, Philippe-Olivier; Lepage, Martin

    2014-01-01

    Background Social dysfunction is a hallmark characteristic of schizophrenia. Part of it may stem from an inability to efficiently encode social information into memory and retrieve it later. This study focused on whether patients with schizophrenia show a memory boost for socially relevant information and engage the same neural network as controls when processing social stimuli that were previously encoded into memory. Methods Patients with schizophrenia and healthy controls performed a social and nonsocial picture recognition memory task while being scanned. We calculated memory performance using d′. Our main analysis focused on brain activity associated with recognition memory of social and nonsocial pictures. Results Our study included 28 patients with schizophrenia and 26 controls. Healthy controls demonstrated a memory boost for socially relevant information. In contrast, patients with schizophrenia failed to show enhanced recognition sensitivity for social pictures. At the neural level, patients did not engage the dorsomedial prefrontal cortex (DMPFC) as much as controls while recognizing social pictures. Limitations Our study did not include direct measures of self-referential processing. All but 3 patients were taking antipsychotic medications, which may have altered both the behavioural performance during the picture recognition memory task and brain activity. Conclusion Impaired social memory in patients with schizophrenia may be associated with altered DMPFC activity. A reduction of DMPFC activity may reflect less involvement of self-referential processes during memory retrieval. Our functional MRI results contribute to a better mapping of the neural disturbances associated with social memory impairment in patients with schizophrenia and may facilitate the development of innovative treatments, such as transcranial magnetic stimulation. PMID:24119792

  4. Neural correlates of recognition memory of social information in people with schizophrenia.

    PubMed

    Harvey, Philippe-Olivier; Lepage, Martin

    2014-03-01

    Social dysfunction is a hallmark characteristic of schizophrenia. Part of it may stem from an inability to efficiently encode social information into memory and retrieve it later. This study focused on whether patients with schizophrenia show a memory boost for socially relevant information and engage the same neural network as controls when processing social stimuli that were previously encoded into memory. Patients with schizophrenia and healthy controls performed a social and nonsocial picture recognition memory task while being scanned. We calculated memory performance using d'. Our main analysis focused on brain activity associated with recognition memory of social and nonsocial pictures. Our study included 28 patients with schizophrenia and 26 controls. Healthy controls demonstrated a memory boost for socially relevant information. In contrast, patients with schizophrenia failed to show enhanced recognition sensitivity for social pictures. At the neural level, patients did not engage the dorsomedial prefrontal cortex (DMPFC) as much as controls while recognizing social pictures. Our study did not include direct measures of self-referential processing. All but 3 patients were taking antipsychotic medications, which may have altered both the behavioural performance during the picture recognition memory task and brain activity. Impaired social memory in patients with schizophrenia may be associated with altered DMPFC activity. A reduction of DMPFC activity may reflect less involvement of self-referential processes during memory retrieval. Our functional MRI results contribute to a better mapping of the neural disturbances associated with social memory impairment in patients with schizophrenia and may facilitate the development of innovative treatments, such as transcranial magnetic stimulation.

  5. Does money matter in inflation forecasting?

    NASA Astrophysics Data System (ADS)

    Binner, J. M.; Tino, P.; Tepper, J.; Anderson, R.; Jones, B.; Kendall, G.

    2010-11-01

    This paper provides the most comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite-memory predictor. The two methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naïve random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists' long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies.
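    The finite-memory kernel predictor and the random-walk baseline can be sketched as follows. This is a toy illustration, not the paper's method: the "inflation" series is a synthetic noisy nonlinear map, and the predictor is plain kernel ridge regression rather than the recursive variant:

```python
import numpy as np

# Sketch of a finite-memory kernel one-step-ahead forecaster, compared
# against the naive random-walk baseline the abstract mentions.  The
# series below is synthetic (a noisy sine map), not real US inflation.
rng = np.random.default_rng(1)
n = 300
y = np.zeros(n)
y[0] = 0.1
for t in range(1, n):
    y[t] = np.sin(2.0 * y[t - 1]) + 0.05 * rng.standard_normal()

def rbf(a, b, gamma=5.0):
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

# Fit kernel ridge regression y[t] ~ f(y[t-1]) on the first 200 points.
Xtr, ytr = y[:199], y[1:200]
alpha = np.linalg.solve(rbf(Xtr, Xtr) + 1e-3 * np.eye(199), ytr)

Xte, yte = y[200:-1], y[201:]
pred_kernel = rbf(Xte, Xtr) @ alpha
pred_rw = Xte                       # random walk: tomorrow = today

rmse = lambda p: np.sqrt(np.mean((p - yte) ** 2))
print(rmse(pred_kernel), rmse(pred_rw))
```

    On this nonlinear series the kernel predictor beats the random walk, which mirrors the paper's finding that the best models were nonlinear autoregressions based on kernel methods.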

  6. Global exponential stability of bidirectional associative memory neural networks with distributed delays

    NASA Astrophysics Data System (ADS)

    Song, Qiankun; Cao, Jinde

    2007-05-01

    A bidirectional associative memory neural network model with distributed delays is considered. By constructing a new Lyapunov functional and employing homeomorphism theory, M-matrix theory and an elementary inequality (with a ≥ 0, b_k ≥ 0, q_k > 0 and r > 1), a sufficient condition is obtained to ensure the existence, uniqueness and global exponential stability of the equilibrium point for the model. Moreover, the exponential convergence rate is estimated, which depends on the delay kernel functions and the system parameters. The results generalize and improve upon earlier publications, and remove the usual assumption that the activation functions are bounded. Two numerical examples are given to show the effectiveness of the obtained results.
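    The memory structure being stabilized is the classic bidirectional associative memory. A minimal Kosko-style discrete recall loop, shown below, illustrates the two-layer "ping-pong" dynamics; the paper's continuous-time model with distributed delays is far richer, and the orthogonal patterns here are chosen so recall is exact:

```python
import numpy as np

# Minimal Kosko-style bidirectional associative memory (BAM) recall.
# Patterns are bipolar and chosen mutually orthogonal within each layer.
x1 = np.array([1] * 8 + [-1] * 8)        # layer-X patterns
x2 = np.array([1, -1] * 8)
y1 = np.array([1] * 4 + [-1] * 4)        # layer-Y patterns
y2 = np.array([1, -1] * 4)

# Hebbian storage: sum of outer products of the associated pairs.
W = np.outer(x1, y1) + np.outer(x2, y2)

def recall(x, steps=5):
    for _ in range(steps):               # ping-pong between the layers
        y = np.sign(x @ W)
        x = np.sign(W @ y)
    return x, y

xr, yr = recall(x1)
print(np.array_equal(yr, y1))            # True: pair (x1, y1) is a fixed point
```

    The equilibrium points whose existence and exponential stability the paper establishes play the role of these stored fixed points in the delayed continuous-time setting.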

  7. Automatic temporal segment detection via bilateral long short-term memory recurrent neural networks

    NASA Astrophysics Data System (ADS)

    Sun, Bo; Cao, Siming; He, Jun; Yu, Lejun; Li, Liandong

    2017-03-01

    Constrained by the physiology, the temporal factors associated with human behavior, irrespective of facial movement or body gesture, are described by four phases: neutral, onset, apex, and offset. Although they may benefit related recognition tasks, it is not easy to accurately detect such temporal segments. An automatic temporal segment detection framework using bilateral long short-term memory recurrent neural networks (BLSTM-RNN) to learn high-level temporal-spatial features, which synthesizes the local and global temporal-spatial information more efficiently, is presented. The framework is evaluated in detail over the face and body database (FABO). The comparison shows that the proposed framework outperforms state-of-the-art methods for solving the problem of temporal segment detection.
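    The recurrent building block of such a network is the standard LSTM cell. A single forward step is sketched below with random placeholder weights and illustrative dimensions (this is the generic cell, not the paper's trained BLSTM-RNN):

```python
import numpy as np

# One step of a standard LSTM cell, the building block stacked (in both
# temporal directions) by BLSTM-RNNs.  Weights are random placeholders.
rng = np.random.default_rng(0)
nx, nh = 4, 3                                    # input / hidden sizes
Wx = rng.standard_normal((4 * nh, nx)) * 0.1     # gates stacked: i, f, o, g
Wh = rng.standard_normal((4 * nh, nh)) * 0.1
b = np.zeros(4 * nh)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c):
    z = Wx @ x + Wh @ h + b
    i, f, o = (sigmoid(z[k * nh:(k + 1) * nh]) for k in range(3))
    g = np.tanh(z[3 * nh:])
    c = f * c + i * g                            # memory cell update
    h = o * np.tanh(c)                           # new hidden state
    return h, c

h, c = np.zeros(nh), np.zeros(nh)
for x in rng.standard_normal((5, nx)):           # a 5-frame toy sequence
    h, c = lstm_step(x, h, c)
print(h.shape)                                   # (3,)
```

    A bidirectional layer runs one such cell forward and another backward over the frame sequence and concatenates their hidden states, which is what lets the detector use both local and global temporal context.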

  8. Exponential lag function projective synchronization of memristor-based multidirectional associative memory neural networks via hybrid control

    NASA Astrophysics Data System (ADS)

    Yuan, Manman; Wang, Weiping; Luo, Xiong; Li, Lixiang; Kurths, Jürgen; Wang, Xiao

    2018-03-01

    This paper is concerned with the exponential lag function projective synchronization of memristive multidirectional associative memory neural networks (MMAMNNs). First, we propose a new model of MMAMNNs with mixed time-varying delays. In the proposed approach, the mixed delays include time-varying discrete delays and distributed time delays. Second, we design two kinds of hybrid controllers. Traditional control methods lack the capability of reflecting variable synaptic weights. In this paper, the controllers are carefully designed to confirm the process of different types of synchronization in the MMAMNNs. Third, sufficient criteria guaranteeing the synchronization of the system are derived based on the drive-response concept. Finally, the effectiveness of the proposed mechanism is validated with numerical experiments.

  9. Attention supports verbal short-term memory via competition between dorsal and ventral attention networks.

    PubMed

    Majerus, Steve; Attout, Lucie; D'Argembeau, Arnaud; Degueldre, Christian; Fias, Wim; Maquet, Pierre; Martinez Perez, Trecy; Stawarczyk, David; Salmon, Eric; Van der Linden, Martial; Phillips, Christophe; Balteau, Evelyne

    2012-05-01

    Interactions between the neural correlates of short-term memory (STM) and attention have been actively studied in the visual STM domain but much less in the verbal STM domain. Here we show that the same attention mechanisms that have been shown to shape the neural networks of visual STM also shape those of verbal STM. Based on previous research in visual STM, we contrasted the involvement of a dorsal attention network centered on the intraparietal sulcus supporting task-related attention and a ventral attention network centered on the temporoparietal junction supporting stimulus-related attention. We observed that, with increasing STM load, the dorsal attention network was activated while the ventral attention network was deactivated, especially during early maintenance. Importantly, activation in the ventral attention network increased in response to task-irrelevant stimuli briefly presented during the maintenance phase of the STM trials but only during low-load STM conditions, which were associated with the lowest levels of activity in the dorsal attention network during encoding and early maintenance. By demonstrating a trade-off between task-related and stimulus-related attention networks during verbal STM, this study highlights the dynamics of attentional processes involved in verbal STM.

  10. Multisensory integration processing during olfactory-visual stimulation-An fMRI graph theoretical network analysis.

    PubMed

    Ripp, Isabelle; Zur Nieden, Anna-Nora; Blankenagel, Sonja; Franzmeier, Nicolai; Lundström, Johan N; Freiherr, Jessica

    2018-05-07

    In this study, we aimed to understand how whole-brain neural networks compute sensory information integration based on the olfactory and visual system. Task-related functional magnetic resonance imaging (fMRI) data was obtained during unimodal and bimodal sensory stimulation. Based on the identification of multisensory integration processing (MIP) specific hub-like network nodes analyzed with network-based statistics using region-of-interest based connectivity matrices, we conclude the following brain areas to be important for processing the presented bimodal sensory information: right precuneus connected contralaterally to the supramarginal gyrus for memory-related imagery and phonology retrieval, and the left middle occipital gyrus connected ipsilaterally to the inferior frontal gyrus via the inferior fronto-occipital fasciculus including functional aspects of working memory. Applied graph theory for quantification of the resulting complex network topologies indicates a significantly increased global efficiency and clustering coefficient in networks including aspects of MIP, reflecting simultaneously better integration and segregation. Graph theoretical analysis of positive and negative network correlations, allowing for inferences about excitatory and inhibitory network architectures, revealed (not significantly, but very consistently) that MIP-specific neural networks are dominated by inhibitory relationships between brain regions involved in stimulus processing.

  11. Opposite Effects of Working Memory on Subjective Visibility and Priming

    ERIC Educational Resources Information Center

    De Loof, Esther; Verguts, Tom; Fias, Wim; Van Opstal, Filip

    2013-01-01

    Cognitive theories on consciousness propose a strong link between consciousness and working memory (WM). This link is also present at the neural level: Both consciousness and WM have been implicated in a prefrontal parietal network. However, the link remains empirically unexplored. The present study investigates the relation between consciousness…

  12. State-Space Analysis of Working Memory in Schizophrenia: An FBIRN Study

    ERIC Educational Resources Information Center

    Janoos, Firdaus; Brown, Gregory; Morocz, Istvan A.; Wells, William M., III

    2013-01-01

    The neural correlates of "working memory" (WM) in schizophrenia (SZ) have been extensively studied using the multisite fMRI data acquired by the Functional Biomedical Informatics Research Network (fBIRN) consortium. Although univariate and multivariate analysis methods have been variously employed to localize brain responses under differing task…

  13. Learning representations for the early detection of sepsis with deep neural networks.

    PubMed

    Kam, Hye Jin; Kim, Ha Young

    2017-10-01

    Sepsis is one of the leading causes of death in intensive care unit patients. Early detection of sepsis is vital because mortality increases as the sepsis stage worsens. This study aimed to develop detection models for the early stage of sepsis using deep learning methodologies, and to compare the feasibility and performance of the new deep learning methodology with those of the regression method with conventional temporal feature extraction. Study group selection adhered to the InSight model. The results of the deep learning-based models and the InSight model were compared. With deep feedforward networks, the area under the ROC curve (AUC) of the models were 0.887 and 0.915 for the InSight and the new feature sets, respectively. For the model with the combined feature set, the AUC was the same as that of the basic feature set (0.915). For the long short-term memory model, only the basic feature set was applied and the AUC improved to 0.929 compared with the existing 0.887 of the InSight model. The contributions of this paper can be summarized in three ways: (i) improved performance without feature extraction using domain knowledge, (ii) verification of the feature extraction capability of deep neural networks through comparison with reference features, and (iii) improved performance over feedforward neural networks by using long short-term memory, a neural network architecture that can learn sequential patterns.
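    The AUC values compared above are rank statistics; a minimal rank-based AUC on toy scores (not the sepsis cohort) can be computed as:

```python
import numpy as np

# Minimal AUC from labels and scores: the probability that a randomly
# chosen positive outscores a randomly chosen negative (ties count half).
def auc(labels, scores):
    labels, scores = np.asarray(labels), np.asarray(scores)
    pos, neg = scores[labels == 1], scores[labels == 0]
    diff = pos[:, None] - neg[None, :]
    return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / (len(pos) * len(neg))

print(auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))   # 0.75
```

    An AUC of 0.929 versus 0.887 therefore means the LSTM model ranks a random septic case above a random non-septic case about 4 percentage points more often than the InSight baseline.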

  14. Improving protein disorder prediction by deep bidirectional long short-term memory recurrent neural networks.

    PubMed

    Hanson, Jack; Yang, Yuedong; Paliwal, Kuldip; Zhou, Yaoqi

    2017-03-01

    Capturing long-range interactions between structural but not sequence neighbors of proteins is a long-standing challenging problem in bioinformatics. Recently, long short-term memory (LSTM) networks have significantly improved the accuracy of speech and image classification problems by remembering useful past information in long sequential events. Here, we have implemented deep bidirectional LSTM recurrent neural networks in the problem of protein intrinsic disorder prediction. The new method, named SPOT-Disorder, has steadily improved over a similar method using a traditional, window-based neural network (SPINE-D) in all datasets tested without separate training on short and long disordered regions. Independent tests on four other datasets, including the datasets from critical assessment of structure prediction (CASP) techniques and >10 000 annotated proteins from MobiDB, confirmed SPOT-Disorder as one of the best methods in disorder prediction. Moreover, initial studies indicate that the method is more accurate in predicting functional sites in disordered regions. These results highlight the usefulness of combining LSTM with deep bidirectional recurrent neural networks in capturing non-local, long-range interactions for bioinformatics applications. SPOT-Disorder is available as a web server and as a standalone program at: http://sparks-lab.org/server/SPOT-disorder/index.php .

  15. Addressing the Movement of a Freescale Robotic Car Using Neural Network

    NASA Astrophysics Data System (ADS)

    Horváth, Dušan; Cuninka, Peter

    2016-12-01

    This article deals with guiding a small Freescale robotic car along a predefined guide line. The direction of movement of the robot is controlled by neural networks, whose weights (memory) are calculated by Hebbian learning from truth tables, i.e., supervised learning with a teacher. Reflexive infrared sensors serve as inputs. The results are experiments comparing two methods of mobile robot control for line tracking.
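    The one-shot Hebbian rule from a truth table is simple enough to sketch. The two-sensor mapping below is a guessed illustration, not the authors' actual table:

```python
import numpy as np

# Sketch of Hebbian "learning with a teacher" from a truth table.
# Inputs: (left IR, right IR), +1 = line detected, -1 = not detected.
# Target steering: -1 = turn left, +1 = turn right, 0 = straight.
X = np.array([[ 1,  1],   # line under both sensors -> straight
              [ 1, -1],   # line under left only    -> steer left
              [-1,  1],   # line under right only   -> steer right
              [-1, -1]])  # line lost               -> straight (here)
t = np.array([0, -1, 1, 0])

w = X.T @ t               # one-shot Hebbian rule: w = sum_i t_i * x_i

steer = lambda left, right: np.sign(np.array([left, right]) @ w)
print(steer(1, -1))       # -1.0: steer left
```

    Because the weights come directly from the table, no iterative training is needed, which suits the microcontroller setting described.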

  16. Deconstructing Memory in Drosophila

    PubMed Central

    Margulies, Carla; Tully, Tim; Dubnau, Josh

    2011-01-01

    Unlike most organ systems, which have evolved to maintain homeostasis, the brain has been selected to sense and adapt to environmental stimuli by constantly altering interactions in a gene network that functions within a larger neural network. This unique feature of the central nervous system provides a remarkable plasticity of behavior, but also makes experimental investigations challenging. Each experimental intervention ramifies through both gene and neural networks, resulting in unpredicted and sometimes confusing phenotypic adaptations. Experimental dissection of mechanisms underlying behavioral plasticity ultimately must accomplish an integration across many levels of biological organization, including genetic pathways acting within individual neurons, neural network interactions which feed back to gene function, and phenotypic observations at the behavioral level. This dissection will be more easily accomplished for model systems such as Drosophila, which, compared with mammals, have relatively simple and manipulable nervous systems and genomes. The evolutionary conservation of behavioral phenotype and the underlying gene function ensures that much of what we learn in such model systems will be relevant to human cognition. In this essay, we have not attempted to review the entire Drosophila memory field. Instead, we have tried to discuss particular findings that provide some level of intellectual synthesis across three levels of biological organization: behavior, neural circuitry and biochemical pathways. We have attempted to use this integrative approach to evaluate distinct mechanistic hypotheses, and to propose critical experiments that will advance this field. PMID:16139203

  17. Using Neural Networks to Improve the Performance of Radiative Transfer Modeling Used for Geometry Dependent LER Calculations

    NASA Astrophysics Data System (ADS)

    Fasnacht, Z.; Qin, W.; Haffner, D. P.; Loyola, D. G.; Joiner, J.; Krotkov, N. A.; Vasilkov, A. P.; Spurr, R. J. D.

    2017-12-01

    In order to estimate surface reflectance used in trace gas retrieval algorithms, radiative transfer models (RTM) such as the Vector Linearized Discrete Ordinate Radiative Transfer Model (VLIDORT) can be used to simulate the top of the atmosphere (TOA) radiances with advanced models of surface properties. With large volumes of satellite data, these model simulations can become computationally expensive. Look up table interpolation can improve the computational cost of the calculations, but the non-linear nature of the radiances requires a dense node structure if interpolation errors are to be minimized. In order to reduce our computational effort and improve the performance of look-up tables, neural networks can be trained to predict these radiances. We investigate the impact of using look-up table interpolation versus a neural network trained using the smart sampling technique, and show that neural networks can speed up calculations and reduce errors while using significantly less memory and RTM calls. In future work we will implement a neural network in operational processing to meet growing demands for reflectance modeling in support of high spatial resolution satellite missions.
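    The look-up-table problem the abstract describes can be seen in one dimension: for a nonlinear curve, linear interpolation error shrinks only as the node grid gets dense, which is what drives the memory cost. The "radiance" function below is a toy stand-in, not VLIDORT output:

```python
import numpy as np

# Toy demonstration: interpolation error of a coarse vs dense look-up
# table for a nonlinear function (a stand-in for simulated radiances).
f = lambda x: np.exp(-3 * x) * np.cos(6 * x)       # toy nonlinear curve
xq = np.linspace(0, 1, 1001)                       # query points

def max_interp_err(n_nodes):
    nodes = np.linspace(0, 1, n_nodes)
    return np.max(np.abs(np.interp(xq, nodes, f(nodes)) - f(xq)))

print(max_interp_err(11), max_interp_err(101))     # coarse vs dense table
```

    A trained neural network replaces the dense node grid with a compact set of weights, which is why it can cut both memory and the number of RTM calls needed to build the table.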

  18. An FPGA-Based Massively Parallel Neuromorphic Cortex Simulator

    PubMed Central

    Wang, Runchun M.; Thakur, Chetan S.; van Schaik, André

    2018-01-01

    This paper presents a massively parallel and scalable neuromorphic cortex simulator designed for simulating large and structurally connected spiking neural networks, such as complex models of various areas of the cortex. The main novelty of this work is the abstraction of a neuromorphic architecture into clusters represented by minicolumns and hypercolumns, analogously to the fundamental structural units observed in neurobiology. Without this approach, simulating large-scale fully connected networks needs prohibitively large memory to store look-up tables for point-to-point connections. Instead, we use a novel architecture, based on the structural connectivity in the neocortex, such that all the required parameters and connections can be stored in on-chip memory. The cortex simulator can be easily reconfigured for simulating different neural networks without any change in hardware structure by programming the memory. A hierarchical communication scheme allows one neuron to have a fan-out of up to 200 k neurons. As a proof-of-concept, an implementation on one Altera Stratix V FPGA was able to simulate 20 million to 2.6 billion leaky-integrate-and-fire (LIF) neurons in real time. We verified the system by emulating a simplified auditory cortex (with 100 million neurons). This cortex simulator achieved a low power dissipation of 1.62 μW per neuron. With the advent of commercially available FPGA boards, our system offers an accessible and scalable tool for the design, real-time simulation, and analysis of large-scale spiking neural networks. PMID:29692702

  19. An FPGA-Based Massively Parallel Neuromorphic Cortex Simulator.

    PubMed

    Wang, Runchun M; Thakur, Chetan S; van Schaik, André

    2018-01-01

    This paper presents a massively parallel and scalable neuromorphic cortex simulator designed for simulating large and structurally connected spiking neural networks, such as complex models of various areas of the cortex. The main novelty of this work is the abstraction of a neuromorphic architecture into clusters represented by minicolumns and hypercolumns, analogously to the fundamental structural units observed in neurobiology. Without this approach, simulating large-scale fully connected networks needs prohibitively large memory to store look-up tables for point-to-point connections. Instead, we use a novel architecture, based on the structural connectivity in the neocortex, such that all the required parameters and connections can be stored in on-chip memory. The cortex simulator can be easily reconfigured for simulating different neural networks without any change in hardware structure by programming the memory. A hierarchical communication scheme allows one neuron to have a fan-out of up to 200 k neurons. As a proof-of-concept, an implementation on one Altera Stratix V FPGA was able to simulate 20 million to 2.6 billion leaky-integrate-and-fire (LIF) neurons in real time. We verified the system by emulating a simplified auditory cortex (with 100 million neurons). This cortex simulator achieved a low power dissipation of 1.62 μW per neuron. With the advent of commercially available FPGA boards, our system offers an accessible and scalable tool for the design, real-time simulation, and analysis of large-scale spiking neural networks.
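    The neuron model the simulator runs at scale is the leaky integrate-and-fire unit. A minimal software version with illustrative parameters (not the FPGA's fixed-point implementation):

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron: leaky integration of
# input current, spike on threshold crossing, then reset.
def simulate_lif(I, dt=1e-3, tau=0.02, v_th=1.0, v_reset=0.0):
    v, spikes = 0.0, []
    for t, i_t in enumerate(I):
        v += dt / tau * (-v + i_t)      # leaky integration
        if v >= v_th:                   # threshold crossing -> spike
            spikes.append(t)
            v = v_reset                 # reset after firing
    return spikes

spikes = simulate_lif(np.full(1000, 1.5))  # 1 s of constant suprathreshold input
print(len(spikes))
```

    Since the update is a single multiply-accumulate plus a compare, it maps naturally onto the massively parallel hardware pipeline described, with the connectivity (rather than the neuron itself) dominating memory.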

  20. Fractional Hopfield Neural Networks: Fractional Dynamic Associative Recurrent Neural Networks.

    PubMed

    Pu, Yi-Fei; Yi, Zhang; Zhou, Ji-Liu

    2017-10-01

    This paper mainly discusses a novel conceptual framework: fractional Hopfield neural networks (FHNN). As is commonly known, fractional calculus has been incorporated into artificial neural networks, mainly because of its long-term memory and nonlocality. Some researchers have made interesting attempts at fractional neural networks and gained competitive advantages over integer-order neural networks. Therefore, it naturally makes one ponder how to generalize first-order Hopfield neural networks to fractional-order ones, and how to implement FHNN by means of fractional calculus. We propose to implement FHNN using a novel mathematical method: fractional calculus. First, we implement the fractor in the form of an analog circuit. Second, we implement FHNN by utilizing the fractor and the fractional steepest descent approach, construct its Lyapunov function, and further analyze its attractors. Third, we perform experiments to analyze the stability and convergence of FHNN, and further discuss its applications to the defense against chip cloning attacks for anticounterfeiting. The main contribution of our work is to propose FHNN in the form of an analog circuit by utilizing a fractor and the fractional steepest descent approach, construct its Lyapunov function, prove its Lyapunov stability, analyze its attractors, and apply FHNN to the defense against chip cloning attacks for anticounterfeiting. A significant advantage of FHNN is that its attractors essentially relate to the neuron's fractional order. FHNN possesses the fractional-order-stability and fractional-order-sensitivity characteristics.
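    FHNN generalizes the classic first-order Hopfield memory, whose recall dynamics are worth keeping in mind as the baseline. A minimal integer-order recall loop (not the fractional analog circuit) in which the energy E = -0.5 xᵀWx never increases under asynchronous sign updates:

```python
import numpy as np

# Classic integer-order Hopfield recall: store two orthogonal bipolar
# patterns, corrupt one bit, and let asynchronous updates restore it.
p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = np.outer(p1, p1) + np.outer(p2, p2)
np.fill_diagonal(W, 0)                     # no self-connections

x = p1.copy()
x[0] = -x[0]                               # corrupt one bit
for _ in range(3):                         # a few asynchronous sweeps
    for i in range(len(x)):
        x[i] = 1 if W[i] @ x >= 0 else -1

print(np.array_equal(x, p1))               # True: pattern restored
```

    In the fractional generalization, the descent toward such attractors is driven by a fractional-order derivative instead of the first-order one, which is what ties the attractor structure to the neuron's fractional order.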

  1. Chips of Hope: Neuro-Electronic Hybrids for Brain Repair

    NASA Astrophysics Data System (ADS)

    Ben-Jacob, Eshel

    2010-03-01

    The field of Neuro-Electronic Hybrids kicked off 30 years ago when researchers in the US first tweaked the technology of recording and stimulation of networks of live neurons grown in a Petri dish and interfaced with a computer via an array of electrodes. Since then, many researchers have searched for ways to imprint in neural networks new ``memories" without erasing old ones. I will describe our new generation of Neuro-Electronic Hybrids and how we succeeded to turn them into the first learning Neurochips - memory and information processing chips made of live neurons. To imprint multiple memories in our new chip we used chemical stimulation at specific locations that were selected by analyzing the networks activity in real time according to our new information encoding principle. Currently we develop new-generation of neuro chips using special carbon nano tubes (CNT). These electrodes enable to engineer the networks topology and efficient electrical interfacing with the neurons. This advance bears the promise to pave the way for building a new experimental platform for testing new drugs and developing new methods for neural networks repair and regeneration. Looking into the future, the development brings us a step closer towards the dream of Brain Repair by implementable Neuro-Electronic hybrid chips.

  2. Working memory activation of neural networks in the elderly as a function of information processing phase and task complexity.

    PubMed

    Charroud, Céline; Steffener, Jason; Le Bars, Emmanuelle; Deverdun, Jérémy; Bonafe, Alain; Abdennour, Meriem; Portet, Florence; Molino, François; Stern, Yaakov; Ritchie, Karen; Menjot de Champfleur, Nicolas; Akbaraly, Tasnime N

    2015-11-01

    Changes in working memory are sensitive indicators of both normal and pathological brain aging and associated disability. The present study aims to further understanding of working memory in normal aging using a large cohort of healthy elderly in order to examine three separate phases of information processing in relation to changes in task load activation. Using covariance analysis, increasing and decreasing neural activation was observed on fMRI in response to a delayed item recognition task in 337 cognitively healthy elderly persons as part of the CRESCENDO (Cognitive REServe and Clinical ENDOphenotypes) study. During three phases of the task (stimulation, retention, probe), increased activation was observed with increasing task load in bilateral regions of the prefrontal cortex, parietal lobule, cingulate gyrus, insula and in deep gray matter nuclei, suggesting an involvement of central executive and salience networks. Decreased activation associated with increasing task load was observed during the stimulation phase, in bilateral temporal cortex, parietal lobule, cingulate gyrus and prefrontal cortex. This spatial distribution of decreased activation is suggestive of the default mode network. These findings support the hypothesis of an increased activation in salience and central executive networks and a decreased activation in default mode network concomitant to increasing task load. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Research on improving image recognition robustness by combining multiple features with associative memory

    NASA Astrophysics Data System (ADS)

    Guo, Dongwei; Wang, Zhe

    2018-05-01

    Convolutional neural networks (CNNs) have achieved great success in computer vision: they can learn hierarchical representations from raw pixels and show outstanding performance in various image recognition tasks [1]. However, CNNs are easily fooled: it is possible to produce images totally unrecognizable to human eyes that CNNs nevertheless believe, with near certainty, to be familiar objects [2]. In this paper, an associative memory model based on multiple features is proposed. Within this model, feature extraction and classification are carried out by a CNN, t-SNE, and an exponential bidirectional associative memory neural network (EBAM). The geometric features extracted by the CNN and the digital features extracted by t-SNE are associated by the EBAM, so recognition robustness is ensured by a comprehensive assessment of the two features. With our model, we obtain only an 8% error rate on fraudulent data. In systems that require a high safety factor and in other critical areas, strong robustness is extremely important; if image recognition robustness can be ensured, network security will be greatly improved and social production efficiency greatly enhanced.

  4. Neural circuit mechanisms of short-term memory

    NASA Astrophysics Data System (ADS)

    Goldman, Mark

    Memory over time scales of seconds to tens of seconds is thought to be maintained by neural activity that is triggered by a memorized stimulus and persists long after the stimulus is turned off. This presents a challenge to current models of memory-storing mechanisms, because the typical time scales associated with cellular and synaptic dynamics are two orders of magnitude smaller than this. While such long time scales can easily be achieved by bistable processes that toggle like a flip-flop between a baseline and elevated-activity state, many neuronal systems have been observed experimentally to be capable of maintaining a continuum of stable states. For example, in neural integrator networks involved in the accumulation of evidence for decision making and in motor control, individual neurons have been recorded whose activity reflects the mathematical integral of their inputs; in the absence of input, these neurons sustain activity at a level proportional to the running total of their inputs. This represents an analog form of memory whose dynamics can be conceptualized through an energy landscape with a continuum of lowest-energy states. Such continuous attractor landscapes are structurally non-robust, in seeming violation of the relative robustness of biological memory systems. In this talk, I will present and compare different biologically motivated circuit motifs for the accumulation and storage of signals in short-term memory. Challenges to generating robust memory maintenance will be highlighted and potential mechanisms for ameliorating the sensitivity of memory networks to perturbations will be discussed. Funding for this work was provided by NIH R01 MH065034, NSF IIS-1208218, Simons Foundation 324260, and a UC Davis Ophthalmology Research to Prevent Blindness Grant.
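The contrast between a tuned neural integrator and the fragility of a continuous attractor can be illustrated with a minimal scalar rate model. This is a sketch under assumed parameters, not one of the biologically motivated circuit motifs from the talk: a single unit with feedback gain exactly 1 integrates and holds its inputs, while a 2% mistuning turns the memory into a decaying trace.

```python
import numpy as np

def run(leak, pulses, steps=200):
    """Scalar rate model r[t+1] = leak * r[t] + u[t]: a perfect neural
    integrator when leak == 1, a mistuned line attractor otherwise."""
    r = 0.0
    trace = []
    for t in range(steps):
        u = pulses.get(t, 0.0)
        r = leak * r + u
        trace.append(r)
    return np.array(trace)

pulses = {10: 1.0, 30: 1.0}               # two brief input pulses
perfect = run(leak=1.0, pulses=pulses)
mistuned = run(leak=0.98, pulses=pulses)  # 2% mistuning of the feedback

# The perfect integrator holds the running total (2.0) after the input
# ends; the mistuned one decays with time constant 1 / (1 - leak) = 50 steps.
```

The structural non-robustness mentioned in the abstract is exactly this sensitivity: an arbitrarily small change in the feedback gain converts the continuum of stable states into a single decaying fixed point.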

  5. Thalamic structures and associated cognitive functions: Relations with age and aging.

    PubMed

    Fama, Rosemary; Sullivan, Edith V

    2015-07-01

    The thalamus, with its cortical, subcortical, and cerebellar connections, is a critical node in networks supporting cognitive functions known to decline in normal aging, including component processes of memory and executive functions of attention and information processing. The macrostructure, microstructure, and neural connectivity of the thalamus change across the adult lifespan. Structural and functional magnetic resonance imaging (MRI) and diffusion tensor imaging (DTI) have demonstrated regional thalamic volume shrinkage and microstructural degradation, with anterior regions generally more compromised than posterior regions. The integrity of selective thalamic nuclei and projections declines with advancing age, particularly in thalamofrontal, thalamoparietal, and thalamolimbic networks. This review presents studies that assess the relations between age and aging and the structure, function, and connectivity of the thalamus and associated neural networks, and focuses on their relations with processes of attention, speed of information processing, and working and episodic memory. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Prefrontal Cortex Networks Shift from External to Internal Modes during Learning.

    PubMed

    Brincat, Scott L; Miller, Earl K

    2016-09-14

    As we learn about items in our environment, their neural representations become increasingly enriched with our acquired knowledge. But there is little understanding of how network dynamics and neural processing related to external information changes as it becomes laden with "internal" memories. We sampled spiking and local field potential activity simultaneously from multiple sites in the lateral prefrontal cortex (PFC) and the hippocampus (HPC)-regions critical for sensory associations-of monkeys performing an object paired-associate learning task. We found that in the PFC, evoked potentials to, and neural information about, external sensory stimulation decreased while induced beta-band (∼11-27 Hz) oscillatory power and synchrony associated with "top-down" or internal processing increased. By contrast, the HPC showed little evidence of learning-related changes in either spiking activity or network dynamics. The results suggest that during associative learning, PFC networks shift their resources from external to internal processing. As we learn about items in our environment, their representations in our brain become increasingly enriched with our acquired "top-down" knowledge. We found that in the prefrontal cortex, but not the hippocampus, processing of external sensory inputs decreased while internal network dynamics related to top-down processing increased. The results suggest that during learning, prefrontal cortex networks shift their resources from external (sensory) to internal (memory) processing. Copyright © 2016 the authors 0270-6474/16/369739-16$15.00/0.

  7. Prefrontal Cortex Networks Shift from External to Internal Modes during Learning

    PubMed Central

    Brincat, Scott L.

    2016-01-01

    As we learn about items in our environment, their neural representations become increasingly enriched with our acquired knowledge. But there is little understanding of how network dynamics and neural processing related to external information changes as it becomes laden with “internal” memories. We sampled spiking and local field potential activity simultaneously from multiple sites in the lateral prefrontal cortex (PFC) and the hippocampus (HPC)—regions critical for sensory associations—of monkeys performing an object paired-associate learning task. We found that in the PFC, evoked potentials to, and neural information about, external sensory stimulation decreased while induced beta-band (∼11–27 Hz) oscillatory power and synchrony associated with “top-down” or internal processing increased. By contrast, the HPC showed little evidence of learning-related changes in either spiking activity or network dynamics. The results suggest that during associative learning, PFC networks shift their resources from external to internal processing. SIGNIFICANCE STATEMENT As we learn about items in our environment, their representations in our brain become increasingly enriched with our acquired “top-down” knowledge. We found that in the prefrontal cortex, but not the hippocampus, processing of external sensory inputs decreased while internal network dynamics related to top-down processing increased. The results suggest that during learning, prefrontal cortex networks shift their resources from external (sensory) to internal (memory) processing. PMID:27629722

  8. Effects of the BDNF Val66Met polymorphism and met allele load on declarative memory related neural networks.

    PubMed

    Dodds, Chris M; Henson, Richard N; Suckling, John; Miskowiak, Kamilla W; Ooi, Cinly; Tait, Roger; Soltesz, Fruzsina; Lawrence, Phil; Bentley, Graham; Maltby, Kay; Skeggs, Andrew; Miller, Sam R; McHugh, Simon; Bullmore, Edward T; Nathan, Pradeep J

    2013-01-01

    It has been suggested that the BDNF Val66Met polymorphism modulates episodic memory performance via effects on hippocampal neural circuitry. However, fMRI studies have yielded inconsistent results in this respect. Moreover, very few studies have examined the effect of met allele load on activation of memory circuitry. In the present study, we carried out a comprehensive analysis of the effects of the BDNF polymorphism on brain responses during episodic memory encoding and retrieval, including an investigation of the effect of met allele load on memory related activation in the medial temporal lobe. In contrast to previous studies, we found no evidence for an effect of BDNF genotype or met load during episodic memory encoding. Met allele carriers showed increased activation during successful retrieval in right hippocampus but this was contrast-specific and unaffected by met allele load. These results suggest that the BDNF Val66Met polymorphism does not, as previously claimed, exert an observable effect on neural systems underlying encoding of new information into episodic memory but may exert a subtle effect on the efficiency with which such information can be retrieved.

  9. Content-Addressable Memory Storage by Neural Networks: A General Model and Global Liapunov Method

    DTIC Science & Technology

    1988-03-01

    point exists. Liapunov functions were also described for Volterra-Lotka systems whose off-diagonal terms are relatively small (Kilmer, 1972) ... masking field, bidirectional associative memory, Volterra-Lotka, Gilpin-Ayala, and Eigen-Schuster models. The Cohen-Grossberg model thus defines a general ...

  10. Dysfunctional Neural Network of Spatial Working Memory Contributes to Developmental Dyscalculia

    ERIC Educational Resources Information Center

    Rotzer, S.; Loenneker, T.; Kucian, K.; Martin, E.; Klaver, P.; von Aster, M.

    2009-01-01

    The underlying neural mechanisms of developmental dyscalculia (DD) are still far from being clearly understood. Even the behavioral processes that generate or influence this heterogeneous disorder are a matter of controversy. To date, the few studies examining functional brain activation in children with DD mainly focus on number and counting…

  11. Neuroenhancement of Memory for Children with Autism by a Mind–Body Exercise

    PubMed Central

    Chan, Agnes S.; Han, Yvonne M. Y.; Sze, Sophia L.; Lau, Eliza M.

    2015-01-01

    The memory deficits found in individuals with autism spectrum disorder (ASD) may be caused by the lack of an effective strategy to aid memory. The executive control of memory processing is mediated largely by the timely coupling between frontal and posterior brain regions. The present study aimed to explore the potential effect of a Chinese mind–body exercise, namely Nei Gong, for enhancing learning and memory in children with ASD, and the possible neural basis of the improvement. Sixty-six children with ASD were randomly assigned to groups receiving Nei Gong training (NGT), progressive muscle relaxation (PMR) training, or no training for 1 month. Before and after training, the participants were tested individually on a computerized visual memory task while EEG signals were acquired during the memory encoding phase. Children in the NGT group demonstrated significantly enhanced memory performance and more effective use of a memory strategy, which was not observed in the other two groups. Furthermore, the improved memory after NGT was consistent with findings of elevated EEG theta coherence between frontal and posterior brain regions, a measure of functional coupling. The scalp EEG signals were localized by the standardized low resolution brain electromagnetic tomography method and found to originate from a neural network that promotes effective memory processing, including the prefrontal cortex, the parietal cortex, and the medial and inferior temporal cortex. This alteration in neural processing was not found in children receiving PMR or in those who received no training. The present findings suggest that the mind–body exercise program may have the potential effect on modulating neural functional connectivity underlying memory processing and hence enhance memory functions in individuals with autism. PMID:26696946

  12. Neuroenhancement of Memory for Children with Autism by a Mind-Body Exercise.

    PubMed

    Chan, Agnes S; Han, Yvonne M Y; Sze, Sophia L; Lau, Eliza M

    2015-01-01

    The memory deficits found in individuals with autism spectrum disorder (ASD) may be caused by the lack of an effective strategy to aid memory. The executive control of memory processing is mediated largely by the timely coupling between frontal and posterior brain regions. The present study aimed to explore the potential effect of a Chinese mind-body exercise, namely Nei Gong, for enhancing learning and memory in children with ASD, and the possible neural basis of the improvement. Sixty-six children with ASD were randomly assigned to groups receiving Nei Gong training (NGT), progressive muscle relaxation (PMR) training, or no training for 1 month. Before and after training, the participants were tested individually on a computerized visual memory task while EEG signals were acquired during the memory encoding phase. Children in the NGT group demonstrated significantly enhanced memory performance and more effective use of a memory strategy, which was not observed in the other two groups. Furthermore, the improved memory after NGT was consistent with findings of elevated EEG theta coherence between frontal and posterior brain regions, a measure of functional coupling. The scalp EEG signals were localized by the standardized low resolution brain electromagnetic tomography method and found to originate from a neural network that promotes effective memory processing, including the prefrontal cortex, the parietal cortex, and the medial and inferior temporal cortex. This alteration in neural processing was not found in children receiving PMR or in those who received no training. The present findings suggest that the mind-body exercise program may have the potential effect on modulating neural functional connectivity underlying memory processing and hence enhance memory functions in individuals with autism.

  13. What does the functional organization of cortico-hippocampal networks tell us about the functional organization of memory?

    PubMed

    Reagh, Zachariah M; Ranganath, Charan

    2018-04-25

    Historically, research on the cognitive processes that support human memory proceeded, to a large extent, independently of research on the neural basis of memory. Accumulating evidence from neuroimaging, however, has enabled the field to develop a broader and more integrative perspective. Here, we briefly outline how advances in cognitive neuroscience can potentially shed light on concepts and controversies in human memory research. We argue that research on the functional properties of cortico-hippocampal networks informs us about how memories might be organized in the brain, which, in turn, helps to reconcile seemingly disparate perspectives in cognitive psychology. Finally, we discuss several open questions and directions for future research. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. New exponential stability criteria for stochastic BAM neural networks with impulses

    NASA Astrophysics Data System (ADS)

    Sakthivel, R.; Samidurai, R.; Anthoni, S. M.

    2010-10-01

    In this paper, we study the global exponential stability of time-delayed stochastic bidirectional associative memory neural networks with impulses and Markovian jumping parameters. A generalized activation function is considered, and the traditional assumptions on the boundedness, monotonicity and differentiability of activation functions are removed. We obtain a new set of sufficient conditions in terms of linear matrix inequalities, which ensures the global exponential stability of the unique equilibrium point for stochastic BAM neural networks with impulses. The Lyapunov function method with the Itô differential rule is employed to achieve the required result. Moreover, a numerical example is provided to show that the proposed result improves the allowable upper bound of delays over some existing results in the literature.
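For readers unfamiliar with the underlying architecture, here is a minimal Kosko-style BAM, stripped of the delays, impulses, and Markovian jumps analyzed in the paper. Layer sizes, pattern count, and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, P = 100, 80, 3                  # layer sizes and number of stored pairs
A = rng.choice([-1, 1], size=(P, N))  # X-layer patterns
B = rng.choice([-1, 1], size=(P, M))  # Y-layer patterns

# Kosko's correlation (outer-product) weight matrix linking the two layers.
W = sum(np.outer(A[mu], B[mu]) for mu in range(P))

sgn = lambda h: np.where(h >= 0, 1, -1)

def recall(x, iters=5):
    """Bidirectional recall: pass activity back and forth until it settles."""
    for _ in range(iters):
        y = sgn(x @ W)                # X layer drives Y layer
        x = sgn(W @ y)                # Y layer drives X layer back
    return x, y

# Start from pattern A[0] with 8 bits flipped; the pair (A[0], B[0]) is recovered.
noisy = A[0].copy()
noisy[:8] *= -1
x, y = recall(noisy)
```

Stability results such as the paper's LMI conditions concern continuous-time, delayed, stochastic generalizations of this bidirectional dynamics.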

  15. Memristor-based neural networks: Synaptic versus neuronal stochasticity

    NASA Astrophysics Data System (ADS)

    Naous, Rawan; AlShedivat, Maruan; Neftci, Emre; Cauwenberghs, Gert; Salama, Khaled Nabil

    2016-11-01

    In neuromorphic circuits, stochasticity in the cortex can be mapped onto the synaptic or the neuronal components. The hardware emulation of such stochastic neural networks is currently being extensively studied using resistive memories, or memristors. The ionic process involved in the underlying switching behavior of the memristive elements is considered the main source of stochasticity in their operation. Building on this inherent variability, the memristor is incorporated into abstract models of stochastic neurons and synapses, and the two corresponding approaches to stochastic neural networks are investigated. Beyond size and area, the main points of comparison between the two approaches, and of where the memristor best fits, are their impact on system performance in terms of accuracy, recognition rates, and learning.

  16. Global exponential periodicity and stability of discrete-time complex-valued recurrent neural networks with time-delays.

    PubMed

    Hu, Jin; Wang, Jun

    2015-06-01

    In recent years, complex-valued recurrent neural networks have been developed and analysed in depth because they have good modelling performance for some applications involving complex-valued elements. In implementing continuous-time dynamical systems for simulation or computational purposes, it is often necessary to utilize a discrete-time model that is an analogue of the continuous-time system. In this paper, we analyse a discrete-time complex-valued recurrent neural network model and obtain sufficient conditions for its global exponential periodicity and exponential stability. Simulation results for several numerical examples are presented to illustrate the theoretical results, and an application to associative memory is also given. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Pseudo-orthogonalization of memory patterns for associative memory.

    PubMed

    Oku, Makito; Makino, Takaki; Aihara, Kazuyuki

    2013-11-01

    A new method for improving the storage capacity of associative memory models on a neural network is proposed. The storage capacity of the network increases in proportion to the network size in the case of random patterns, but, in general, the capacity suffers from correlation among memory patterns. Numerous solutions to this problem have been proposed so far, but their high computational cost limits their scalability. In this paper, we propose a novel and simple solution that is locally computable without any iteration. Our method involves XNOR masking of the original memory patterns with random patterns, and the masked patterns and masks are concatenated. The resulting decorrelated patterns allow higher storage capacity at the cost of the pattern length. Furthermore, the increase in the pattern length can be reduced through blockwise masking, which results in a small amount of capacity loss. Movie replay and image recognition are presented as examples to demonstrate the scalability of the proposed method.
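The masking step is simple to sketch. For ±1 codes, XNOR is the elementwise product, so masking and concatenation look like the following; the pattern sizes and the shared-base correlation structure are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 1000, 5

# Build strongly correlated memory patterns: each shares most of a common base.
base = rng.choice([-1, 1], size=N)
patterns = np.tile(base, (P, 1))
for p in patterns:
    idx = rng.choice(N, size=N // 5, replace=False)
    p[idx] = rng.choice([-1, 1], size=N // 5)  # re-randomize 20% of each copy

def mean_overlap(X):
    """Mean absolute pairwise overlap |x_i . x_j| / len over distinct pairs."""
    n = X.shape[1]
    vals = [abs(X[i] @ X[j]) / n
            for i in range(len(X)) for j in range(i + 1, len(X))]
    return float(np.mean(vals))

# XNOR masking: for +/-1 codes, XNOR is the elementwise product. Mask each
# pattern with its own random mask, then concatenate masked pattern and mask.
masks = rng.choice([-1, 1], size=(P, N))
decorrelated = np.hstack([patterns * masks, masks])

before = mean_overlap(patterns)       # large: patterns share a base
after = mean_overlap(decorrelated)    # near chance level
```

The decorrelation is local and iteration-free, as the abstract emphasizes, at the cost of doubling the stored pattern length; the blockwise variant described in the paper reduces that cost.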

  18. Destabilizing Effects of Impulse in Delayed BAM Neural Networks

    NASA Astrophysics Data System (ADS)

    Li, Chuandong; Li, Chaojie; Liu, Chao

    This paper further studies the global exponential stability of the equilibrium point of the delayed bidirectional associative memory (DBAM) neural networks with impulse effects. Several results characterizing the aggregated effects of impulse and dynamical property of the impulse-free DBAM on the exponential stability of the considered DBAM have been established. It is shown that the impulsive DBAM will preserve the global exponential stability of the impulse-free DBAM even if the impulses have enlarging effects on the states of neurons.

  19. Hetero-association for pattern translation

    NASA Astrophysics Data System (ADS)

    Yu, Francis T. S.; Lu, Thomas T.; Yang, Xiangyang

    1991-09-01

    A hetero-association neural network using an interpattern association algorithm is presented. By using simple logical rules, a hetero-association memory can be constructed based on the association between the input-output reference patterns. For optical implementation, a compact liquid crystal television neural network is used. Translations between English letters and Chinese characters, as well as between Arabic and Chinese numerals, are demonstrated. The authors show that the hetero-association model performs more effectively than the Hopfield model in retrieving large numbers of similar patterns.

  20. Weighted-outer-product associative neural network

    NASA Astrophysics Data System (ADS)

    Ji, Han-Bing

    1991-11-01

    A weighted outer product learning (WOPL) scheme for an associative memory neural network is presented, in which learning orders are incorporated into the Hopfield model. WOPL can be guaranteed to achieve correct recall of certain stored patterns whether or not they are stable in the Hopfield model, and whether the number of stored patterns is small or large. A sufficient condition is also discussed for how to suitably choose learning orders so as to fully utilize WOPL for correct recall of as many stored patterns as possible.
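A minimal numerical sketch of weighted outer-product storage follows. The pattern sizes and weights are illustrative, and the paper's conditions for choosing learning orders are not reproduced; the sketch only shows that raising a pattern's weight reinforces its stability and recall.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 200, 10
X = rng.choice([-1, 1], size=(P, N))      # patterns to store

def wopl_matrix(X, lams):
    """Weighted outer-product matrix W = sum_mu lam_mu * x_mu x_mu^T,
    with zero diagonal as in the Hopfield model."""
    W = sum(l * np.outer(x, x) for l, x in zip(lams, X))
    np.fill_diagonal(W, 0)
    return W

sgn = lambda h: np.where(h >= 0, 1, -1)

# Give the first pattern a larger learning order than the rest.
lams = np.ones(P)
lams[0] = 3.0
W = wopl_matrix(X, lams)

# The heavily weighted pattern is a fixed point of the synchronous update...
stable = bool((sgn(W @ X[0]) == X[0]).all())

# ...and is recovered from a corrupted probe (15 of 200 bits flipped).
probe = X[0].copy()
probe[:15] *= -1
recalled = sgn(W @ sgn(W @ probe))
```

Setting all weights to 1 recovers the standard Hopfield outer-product rule; unequal weights bias the energy landscape toward the patterns with larger learning orders.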

  1. Localized states in an unbounded neural field equation with smooth firing rate function: a multi-parameter analysis.

    PubMed

    Faye, Grégory; Rankin, James; Chossat, Pascal

    2013-05-01

    The existence of spatially localized solutions in neural networks is an important topic in neuroscience as these solutions are considered to characterize working (short-term) memory. We work with an unbounded neural network represented by the neural field equation with smooth firing rate function and a wizard hat spatial connectivity. Noting that stationary solutions of our neural field equation are equivalent to homoclinic orbits in a related fourth order ordinary differential equation, we apply normal form theory for a reversible Hopf bifurcation to prove the existence of localized solutions; further, we present results concerning their stability. Numerical continuation is used to compute branches of localized solutions that exhibit snaking-type behaviour. We describe in terms of three parameters the exact regions for which localized solutions persist.

  2. Do neural nets learn statistical laws behind natural language?

    PubMed

    Takahashi, Shuntaro; Tanaka-Ishii, Kumiko

    2017-01-01

    The performance of deep learning in natural language processing has been spectacular, but the reasons for this success remain unclear because of the inherent complexity of deep learning. This paper provides empirical evidence of its effectiveness and of a limitation of neural networks for language engineering. Precisely, we demonstrate that a neural language model based on long short-term memory (LSTM) effectively reproduces Zipf's law and Heaps' law, two representative statistical properties underlying natural language. We discuss the quality of reproducibility and the emergence of Zipf's law and Heaps' law as training progresses. We also point out that the neural language model has a limitation in reproducing long-range correlation, another statistical property of natural language. This understanding could provide a direction for improving the architectures of neural networks.
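The measurement procedure behind such claims can be sketched. Below, a toy Zipfian generator stands in for text sampled from an LSTM language model; the corpus size, vocabulary size, and fitting ranges are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy corpus: 50,000 tokens drawn from a Zipfian distribution over 5,000
# word types (probability proportional to 1/rank), standing in for text
# sampled from a trained language model.
V, n_tokens = 5000, 50000
ranks = np.arange(1, V + 1)
probs = (1.0 / ranks) / np.sum(1.0 / ranks)
tokens = rng.choice(V, size=n_tokens, p=probs)

# Zipf's law: in a rank-frequency plot, log(freq) vs log(rank) is roughly
# linear with slope near -1. Fit over the 100 most frequent types.
counts = np.sort(np.bincount(tokens, minlength=V))[::-1]
zipf_slope = np.polyfit(np.log(np.arange(1, 101)), np.log(counts[:100]), 1)[0]

# Heaps' law: vocabulary size grows as V(n) ~ n^beta with 0 < beta < 1.
checkpoints = np.arange(1000, n_tokens + 1, 1000)
vocab_sizes = [len(set(tokens[:n].tolist())) for n in checkpoints]
heaps_beta = np.polyfit(np.log(checkpoints), np.log(vocab_sizes), 1)[0]
```

Running the same two fits on text generated by a trained model, as the paper does, shows how closely the model reproduces both statistical laws.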

  3. Do neural nets learn statistical laws behind natural language?

    PubMed Central

    Takahashi, Shuntaro

    2017-01-01

    The performance of deep learning in natural language processing has been spectacular, but the reasons for this success remain unclear because of the inherent complexity of deep learning. This paper provides empirical evidence of its effectiveness and of a limitation of neural networks for language engineering. Precisely, we demonstrate that a neural language model based on long short-term memory (LSTM) effectively reproduces Zipf’s law and Heaps’ law, two representative statistical properties underlying natural language. We discuss the quality of reproducibility and the emergence of Zipf’s law and Heaps’ law as training progresses. We also point out that the neural language model has a limitation in reproducing long-range correlation, another statistical property of natural language. This understanding could provide a direction for improving the architectures of neural networks. PMID:29287076

  4. Distinguishable memory retrieval networks for collaboratively and non-collaboratively learned information.

    PubMed

    Vanlangendonck, Flora; Takashima, Atsuko; Willems, Roel M; Hagoort, Peter

    2018-03-01

    Learning often occurs in communicative and collaborative settings, yet almost all research into the neural basis of memory relies on participants encoding and retrieving information on their own. We investigated whether learning linguistic labels in a collaborative context at least partly relies on cognitively and neurally distinct representations, as compared to learning in an individual context. Healthy human participants learned labels for sets of abstract shapes in three different tasks. They came up with labels with another person in a collaborative communication task (collaborative condition), by themselves (individual condition), or were given pre-determined unrelated labels to learn by themselves (arbitrary condition). Immediately after learning, participants retrieved and produced the labels aloud during a communicative task in the MRI scanner. The fMRI results show that the retrieval of collaboratively generated labels as compared to individually learned labels engages brain regions involved in understanding others (mentalizing or theory of mind) and autobiographical memory, including the medial prefrontal cortex, the right temporoparietal junction and the precuneus. This study is the first to show that collaboration during encoding affects the neural networks involved in retrieval. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  5. Neural Differentiation Tracks Improved Recall of Competing Memories Following Interleaved Study and Retrieval Practice

    PubMed Central

    Hulbert, J. C.; Norman, K. A.

    2015-01-01

    Selective retrieval of overlapping memories can generate competition. How does the brain adaptively resolve this competition? One possibility is that competing memories are inhibited; in support of this view, numerous studies have found that selective retrieval leads to forgetting of memories that are related to the just-retrieved memory. However, this retrieval-induced forgetting (RIF) effect can be eliminated or even reversed if participants are given opportunities to restudy the materials between retrieval attempts. Here, we outline an explanation for such a reversal, rooted in a neural network model of RIF that predicts representational differentiation when restudy is interleaved with selective retrieval. To test this hypothesis, we measured changes in pattern similarity of the BOLD fMRI signal elicited by related memories after undergoing interleaved competitive retrieval and restudy. Reduced pattern similarity within the hippocampus positively correlated with retrieval-induced facilitation of competing memories. This result is consistent with an adaptive differentiation process that allows individuals to learn to distinguish between once-confusable memories. PMID:25477369

  6. A neural network model of semantic memory linking feature-based object representation and words.

    PubMed

    Cuppini, C; Magosso, E; Ursino, M

    2009-06-01

    Recent theories in cognitive neuroscience suggest that semantic memory is a distributed process, which involves many cortical areas and is based on a multimodal representation of objects. The aim of this work is to extend a previous model of object representation to realize a semantic memory, in which sensory-motor representations of objects are linked with words. The model assumes that each object is described as a collection of features, coded in different cortical areas via a topological organization. Features in different objects are segmented via gamma-band synchronization of neural oscillators. The feature areas are further connected with a lexical area, devoted to the representation of words. Synapses among the feature areas, and among the lexical area and the feature areas are trained via a time-dependent Hebbian rule, during a period in which individual objects are presented together with the corresponding words. Simulation results demonstrate that, during the retrieval phase, the network can deal with the simultaneous presence of objects (from sensory-motor inputs) and words (from acoustic inputs), can correctly associate objects with words and segment objects even in the presence of incomplete information. Moreover, the network can realize some semantic links among words representing objects with shared features. These results support the idea that semantic memory can be described as an integrated process, whose content is retrieved by the co-activation of different multimodal regions. In perspective, extended versions of this model may be used to test conceptual theories, and to provide a quantitative assessment of existing data (for instance concerning patients with neural deficits).

  7. Statistical Description of Associative Memory

    NASA Astrophysics Data System (ADS)

    Samengo, Inés

    2003-03-01

    The storage of memories in the brain induces modifications in the structural and functional properties of a neural network. Here, a few neuropsychological and neurophysiological experiments are reviewed, suggesting that the plastic changes taking place during memory storage are governed, among other things, by the correlations in the activity of a set of neurons. The Hopfield model is briefly described, showing how the methods of statistical physics can be used to describe the storage and retrieval of memories.
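
The Hopfield model mentioned above can be made concrete in a few lines. This minimal sketch uses the standard outer-product (Hebbian) storage rule and synchronous sign updates; the patterns are toy values chosen for the example.

```python
import numpy as np

# Hopfield network sketch: store bipolar (+1/-1) patterns with the Hebbian
# outer-product rule, then retrieve one from a corrupted cue by iterating
# the sign update until a fixed point.
patterns = np.array([
    [1, -1, 1, -1, 1, -1, 1, -1],
    [1, 1, 1, 1, -1, -1, -1, -1],
])
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)            # no self-connections

def recall(state, steps=10):
    s = state.copy()
    for _ in range(steps):        # synchronous updates for simplicity
        s = np.where(W @ s >= 0, 1, -1)
    return s

# Flip one bit of the first pattern and let the dynamics clean it up.
noisy = patterns[0].copy()
noisy[0] *= -1
recovered = recall(noisy)
```

With these two (orthogonal) patterns, one update step already restores the corrupted bit, and each stored pattern is a fixed point of the dynamics.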

  8. Temporal Sequence of Hemispheric Network Activation during Semantic Processing: A Functional Network Connectivity Analysis

    ERIC Educational Resources Information Center

    Assaf, Michal; Jagannathan, Kanchana; Calhoun, Vince; Kraut, Michael; Hart, John, Jr.; Pearlson, Godfrey

    2009-01-01

    To explore the temporal sequence of, and the relationship between, the left and right hemispheres (LH and RH) during semantic memory (SM) processing, we identified the neural networks involved in the performance of a functional MRI semantic object retrieval task (SORT) using group independent component analysis (ICA) in 47 healthy individuals. SORT…

  9. Complex Rotation Quantum Dynamic Neural Networks (CRQDNN) using Complex Quantum Neuron (CQN): Applications to time series prediction.

    PubMed

    Cui, Yiqian; Shi, Junyou; Wang, Zili

    2015-11-01

    Quantum Neural Network (QNN) models have attracted great attention because they introduce a new mode of neural computing based on quantum entanglement. However, existing QNN models are mainly based on real-valued quantum operations, and the potential of quantum entanglement is not fully exploited. In this paper, we propose a novel quantum neuron model called the Complex Quantum Neuron (CQN), which realizes a deeper quantum entanglement. We also propose a novel hybrid network model, Complex Rotation Quantum Dynamic Neural Networks (CRQDNN), based on the CQN. CRQDNN is a three-layer model with both CQN and classical neurons. An infinite impulse response (IIR) filter is embedded in the network to provide the memory needed to process time-series inputs. The Levenberg-Marquardt (LM) algorithm is used for fast parameter learning. The network model is developed for time-series prediction, and two application studies are presented: chaotic time-series prediction and electronic remaining useful life (RUL) prediction. Copyright © 2015 Elsevier Ltd. All rights reserved.
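
The quantum-neuron details of CRQDNN are beyond a short sketch, but the memory mechanism the abstract names, an embedded IIR filter, is easy to illustrate. The sketch below assumes a simple first-order filter with an arbitrary coefficient; the filter order and coefficient are assumptions, not the paper's design.

```python
import numpy as np

# A first-order IIR filter y[t] = a*y[t-1] + (1-a)*x[t] keeps an
# exponentially decaying trace of past inputs; embedding such a filter
# is one way a network gains memory over time-series inputs.
def iir_memory(x, a=0.5):
    y = np.zeros_like(x, dtype=float)
    for t in range(len(x)):
        prev = y[t - 1] if t > 0 else 0.0
        y[t] = a * prev + (1 - a) * x[t]
    return y

signal = np.array([1.0, 0.0, 0.0, 0.0])   # a single impulse
trace = iir_memory(signal)                # response decays geometrically
```

The impulse response `[0.5, 0.25, 0.125, 0.0625]` shows the key property: the output at time t still reflects inputs from earlier steps.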

  10. Recall of patterns using binary and gray-scale autoassociative morphological memories

    NASA Astrophysics Data System (ADS)

    Sussner, Peter

    2005-08-01

    Morphological associative memories (MAMs) belong to a class of artificial neural networks that perform the mathematical-morphology operations of erosion or dilation at each node; hence the name morphological neural networks. Alternatively, the total input effect on a morphological neuron can be expressed in terms of lattice-induced matrix operations from the mathematical theory of minimax algebra. Neural models of associative memories are usually concerned with the storage and retrieval of binary or bipolar patterns. Thus far, the emphasis in research on morphological associative memory systems has been on binary models, although a number of notable features of autoassociative morphological memories (AMMs), such as optimal absolute storage capacity and one-step convergence, have been shown to hold in the general, gray-scale setting. In previous papers, we gained valuable insight into the storage and recall phases of AMMs by analyzing their fixed points and basins of attraction. In particular, we showed that the fixed points of binary AMMs correspond to the lattice polynomials in the original patterns. This paper extends these results in two ways. First, we provide an exact characterization of the fixed points of gray-scale AMMs in terms of combinations of the original patterns. Second, we present an exact expression for the fixed-point attractor that represents the output of either a binary or a gray-scale AMM upon presentation of a given input. The results of this paper are confirmed in several experiments using binary patterns and gray-scale images.
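
The minimax-algebra formulation has a compact numpy sketch: the erosion-type memory stores all patterns in one matrix with entries w_ij = min_k (x_i^k - x_j^k), and recall is the max-plus product (W ⊞ x)_i = max_j (w_ij + x_j). The pattern values below are arbitrary toy data; the sketch illustrates the perfect recall and one-step convergence the abstract cites for stored gray-scale patterns (under uncorrupted input).

```python
import numpy as np

# Three stored gray-scale patterns, one per row (toy values).
X = np.array([[3, 1, 4, 1],
              [2, 7, 1, 8],
              [5, 5, 5, 5]], dtype=float)

# Erosion-type memory W_XX: w_ij = min over patterns k of (x_i^k - x_j^k).
W = np.min(X[:, :, None] - X[:, None, :], axis=0)

def recall(x):
    """Max-plus product: output_i = max_j (w_ij + x_j)."""
    return np.max(W + x[None, :], axis=1)
```

Every stored pattern is recovered exactly in a single step, regardless of how many patterns are stored, which is the "optimal absolute storage capacity" property.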

  11. Gender differences in working memory networks: A BrainMap meta-analysis

    PubMed Central

    Hill, Ashley C.; Laird, Angela R.; Robinson, Jennifer L.

    2014-01-01

    Gender differences in psychological processes have been of great interest in a variety of fields. While the majority of research in this area has focused on specific differences in relation to test performance, this study sought to determine the underlying neurofunctional differences observed during working memory, a pivotal cognitive process shown to be predictive of academic achievement and intelligence. Using the BrainMap database, we performed a meta-analysis and applied activation likelihood estimation to our search set. Our results demonstrate consistent working memory networks across genders, but also provide evidence for gender-specific networks whereby females consistently activate more limbic (e.g., amygdala and hippocampus) and prefrontal structures (e.g., right inferior frontal gyrus), and males activate a distributed network inclusive of more parietal regions. These data provide a framework for future investigation using functional or effective connectivity methods to elucidate the underpinnings of gender differences in neural network recruitment during working memory tasks. PMID:25042764

  12. Gender differences in working memory networks: a BrainMap meta-analysis.

    PubMed

    Hill, Ashley C; Laird, Angela R; Robinson, Jennifer L

    2014-10-01

    Gender differences in psychological processes have been of great interest in a variety of fields. While the majority of research in this area has focused on specific differences in relation to test performance, this study sought to determine the underlying neurofunctional differences observed during working memory, a pivotal cognitive process shown to be predictive of academic achievement and intelligence. Using the BrainMap database, we performed a meta-analysis and applied activation likelihood estimation to our search set. Our results demonstrate consistent working memory networks across genders, but also provide evidence for gender-specific networks whereby females consistently activate more limbic (e.g., amygdala and hippocampus) and prefrontal structures (e.g., right inferior frontal gyrus), and males activate a distributed network inclusive of more parietal regions. These data provide a framework for future investigations using functional or effective connectivity methods to elucidate the underpinnings of gender differences in neural network recruitment during working memory tasks. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. The neural basis of implicit learning and memory: a review of neuropsychological and neuroimaging research.

    PubMed

    Reber, Paul J

    2013-08-01

    Memory systems research has typically described the different types of long-term memory in the brain as either declarative versus non-declarative or implicit versus explicit. These descriptions reflect the difference between declarative, conscious, and explicit memory that is dependent on the medial temporal lobe (MTL) memory system, and all other expressions of learning and memory. The other type of memory is generally defined by an absence: either the lack of dependence on the MTL memory system (nondeclarative) or the lack of conscious awareness of the information acquired (implicit). However, definition by absence is inherently underspecified and leaves open questions of how this type of memory operates, its neural basis, and how it differs from explicit, declarative memory. Drawing on a variety of studies of implicit learning that have attempted to identify the neural correlates of implicit learning using functional neuroimaging and neuropsychology, a theory of implicit memory is presented that describes it as a form of general plasticity within processing networks that adaptively improve function via experience. Under this model, implicit memory will not appear as a single, coherent, alternative memory system but will instead be manifested as a principle of improvement from experience based on widespread mechanisms of cortical plasticity. The implications of this characterization for understanding the role of implicit learning in complex cognitive processes and the effects of interactions between types of memory will be discussed for examples within and outside the psychology laboratory. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Recurrent Neural Networks With Auxiliary Memory Units.

    PubMed

    Wang, Jianyong; Zhang, Lei; Guo, Quan; Yi, Zhang

    2018-05-01

    Memory is one of the most important mechanisms in recurrent neural network (RNN) learning. It plays a crucial role in practical applications such as sequence learning: with a good memory mechanism, long-term history can be fused with current information, improving learning. Developing a suitable memory mechanism is therefore always desirable in the field of RNNs. This paper proposes a novel memory mechanism for RNNs. Its main contributions are: 1) an auxiliary memory unit (AMU), which yields a new RNN model (AMU-RNN) that separates memory and output explicitly, and 2) an efficient learning algorithm developed by employing the technique of error-flow truncation. The proposed AMU-RNN model, together with the developed learning algorithm, can learn and maintain stable memory over a long time range, overcoming both the learning-conflict problem and the vanishing-gradient problem. Unlike the traditional approach, which mixes memory and output in a single neuron within a recurrent unit, the AMU provides an auxiliary memory neuron dedicated to maintaining memory. By separating memory and output in a recurrent unit, learning conflicts can be eliminated easily. Moreover, by using error-flow truncation, each auxiliary memory neuron ensures constant error flow during learning. Experiments demonstrate the good performance of the proposed AMU-RNNs and the developed learning algorithm: the method exhibits efficient learning with stable convergence and outperforms state-of-the-art RNN models on sequence-generation and sequence-classification tasks.
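
The separation idea can be sketched in a toy forward pass. This is not the paper's AMU equations: the additive memory trace, the identity self-connection, and all weights below are illustrative assumptions chosen to show why a dedicated memory neuron decouples "what is remembered" from "what is emitted".

```python
import numpy as np

# Toy recurrent step with an auxiliary memory neuron m kept separate from
# the output neuron y. The identity self-connection on m (m_new = m + ...)
# lets error flow through the memory unattenuated, LSTM-carousel style.
def amu_step(x, m, w_in=1.0, w_out=0.5):
    m_new = m + np.tanh(w_in * x)   # memory neuron: accumulate, don't mix
    y = np.tanh(w_out * m_new)      # output neuron: read the memory out
    return m_new, y

m = 0.0
outputs = []
for x in [1.0, 0.0, 0.0, -1.0]:
    m, y = amu_step(x, m)
    outputs.append(y)
```

After the opposing inputs +1 and -1 the memory returns to zero while the intermediate outputs still reflected the stored trace, which a single mixed memory/output neuron could not keep apart.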

  15. Memory and neural networks on the basis of color centers in solids.

    PubMed

    Winnacker, Albrecht; Osvet, Andres

    2009-11-01

    Optical data recording is one of the most widely used and efficient memory systems in the non-living world. The application of color centers in this context offers not only high speed in writing and read-out, due to a high degree of parallelism in data handling, but also the possibility of setting up models of neural networks. In this way, systems with high potential for image processing, pattern recognition and logical operations can be constructed. Storage density is limited by the diffraction limit of optical data recording. It is shown that this limitation can, at least in principle, be overcome by spectral hole burning, which results in systems with storage capacities close to that of the human brain.

  16. Aberrant neural networks for the recognition memory of socially relevant information in patients with schizophrenia.

    PubMed

    Oh, Jooyoung; Chun, Ji-Won; Kim, Eunseong; Park, Hae-Jeong; Lee, Boreom; Kim, Jae-Jin

    2017-01-01

    Patients with schizophrenia exhibit several cognitive deficits, including memory impairment. Problems with recognition memory can hinder socially adaptive behavior. Previous investigations have suggested that altered activation of the frontotemporal area plays an important role in recognition memory impairment. However, the cerebral networks related to these deficits are not known. The aim of this study was to elucidate the brain networks required for recognizing socially relevant information in patients with schizophrenia performing an old-new recognition task. Sixteen patients with schizophrenia and 16 controls participated in this study. First, the subjects performed the theme-identification task during functional magnetic resonance imaging. In this task, pictures depicting social situations were presented with three words, and the subjects were asked to select the best theme word for each picture. The subjects then performed an old-new recognition task in which they were asked to discriminate whether the presented words were old or new. Task performance and neural responses in the old-new recognition task were compared between the subject groups. An independent component analysis of the functional connectivity was performed. The patients with schizophrenia exhibited decreased discriminability and increased activation of the right superior temporal gyrus compared with the controls during correct responses. Furthermore, aberrant network activities were found in the frontopolar and language comprehension networks in the patients. The functional connectivity analysis showed aberrant connectivity in the frontopolar and language comprehension networks in the patients with schizophrenia, and these aberrations possibly contribute to their low recognition performance and social dysfunction. These results suggest that the frontopolar and language comprehension networks are potential therapeutic targets in patients with schizophrenia.

  17. Computational modeling of neural plasticity for self-organization of neural networks.

    PubMed

    Chrol-Cannon, Joseph; Jin, Yaochu

    2014-11-01

    Self-organization in biological nervous systems during the lifetime is known to occur largely through a process of plasticity that depends on the spike-timing activity of connected neurons. In the field of computational neuroscience, much effort has been dedicated to building computational models of neural plasticity that replicate experimental data. Most recently, increasing attention has been paid to understanding the role of neural plasticity in functional and structural neural self-organization, as well as its influence on the learning performance of neural networks in machine learning tasks such as classification and regression. Although many ideas and hypotheses have been suggested, the relationship between the structure, dynamics and learning performance of neural networks remains elusive. The purpose of this article is to review the most important computational models of neural plasticity and to discuss various ideas about its role. Finally, we suggest a few promising research directions, in particular those that combine findings from computational neuroscience and systems biology and their synergistic roles in understanding learning, memory and cognition, thereby bridging the gap between computational neuroscience, systems biology and computational intelligence. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  18. Hippocampal and ventral medial prefrontal activation during retrieval-mediated learning supports novel inference.

    PubMed

    Zeithamova, Dagmar; Dominick, April L; Preston, Alison R

    2012-07-12

    Memory enables flexible use of past experience to inform new behaviors. Although leading theories hypothesize that this fundamental flexibility results from the formation of integrated memory networks relating multiple experiences, the neural mechanisms that support memory integration are not well understood. Here, we demonstrate that retrieval-mediated learning, whereby prior event details are reinstated during encoding of related experiences, supports participants' ability to infer relationships between distinct events that share content. Furthermore, we show that activation changes in a functionally coupled hippocampal and ventral medial prefrontal cortical circuit track the formation of integrated memories and successful inferential memory performance. These findings characterize the respective roles of these regions in retrieval-mediated learning processes that support relational memory network formation and inferential memory in the human brain. More broadly, these data reveal fundamental mechanisms through which memory representations are constructed into prospectively useful formats. Copyright © 2012 Elsevier Inc. All rights reserved.

  19. Hippocampal and ventral medial prefrontal activation during retrieval-mediated learning supports novel inference

    PubMed Central

    Zeithamova, Dagmar; Dominick, April L.; Preston, Alison R.

    2012-01-01

    SUMMARY Memory enables flexible use of past experience to inform new behaviors. Though leading theories hypothesize that this fundamental flexibility results from the formation of integrated memory networks relating multiple experiences, the neural mechanisms that support memory integration are not well understood. Here, we demonstrate that retrieval-mediated learning, whereby prior event details are reinstated during encoding of related experiences, supports participants’ ability to infer relationships between distinct events that share content. Furthermore, we show that activation changes in a functionally coupled hippocampal and ventral medial prefrontal cortical circuit track the formation of integrated memories and successful inferential memory performance. These findings characterize the respective roles of these regions in retrieval-mediated learning processes that support relational memory network formation and inferential memory in the human brain. More broadly, these data reveal fundamental mechanisms through which memory representations are constructed into prospectively useful formats. PMID:22794270

  20. A Balanced Memory Network

    PubMed Central

    Roudi, Yasser; Latham, Peter E

    2007-01-01

    A fundamental problem in neuroscience is understanding how working memory—the ability to store information at intermediate timescales, like tens of seconds—is implemented in realistic neuronal networks. The most likely candidate mechanism is the attractor network, and a great deal of effort has gone toward investigating it theoretically. Yet, despite almost a quarter century of intense work, attractor networks are not fully understood. In particular, there are still two unanswered questions. First, how is it that attractor networks exhibit irregular firing, as is observed experimentally during working memory tasks? And second, how many memories can be stored under biologically realistic conditions? Here we answer both questions by studying an attractor neural network in which inhibition and excitation balance each other. Using mean-field analysis, we derive a three-variable description of attractor networks. From this description it follows that irregular firing can exist only if the number of neurons involved in a memory is large. The same mean-field analysis also shows that the number of memories that can be stored in a network scales with the number of excitatory connections, a result that has been suggested for simple models but never shown for realistic ones. Both of these predictions are verified using simulations with large networks of spiking neurons. PMID:17845070

  1. Hippocampal Sharp-Wave Ripples Influence Selective Activation of the Default Mode Network

    PubMed Central

    Kaplan, Raphael; Adhikari, Mohit H.; Hindriks, Rikkert; Mantini, Dante; Murayama, Yusuke; Logothetis, Nikos K.; Deco, Gustavo

    2016-01-01

    Summary The default mode network (DMN) is a commonly observed resting-state network (RSN) that includes medial temporal, parietal, and prefrontal regions involved in episodic memory [1, 2, 3]. The behavioral relevance of endogenous DMN activity remains elusive, despite an emerging literature correlating resting fMRI fluctuations with memory performance [4, 5]—particularly in DMN regions [6, 7, 8]. Mechanistic support for the DMN’s role in memory consolidation might come from investigation of large deflections (sharp-waves) in the hippocampal local field potential that co-occur with high-frequency (>80 Hz) oscillations called ripples—both during sleep [9, 10] and awake deliberative periods [11, 12, 13]. Ripples are ideally suited for memory consolidation [14, 15], since the reactivation of hippocampal place cell ensembles occurs during ripples [16, 17, 18, 19]. Moreover, the number of ripples after learning predicts subsequent memory performance in rodents [20, 21, 22] and humans [23], whereas electrical stimulation of the hippocampus after learning interferes with memory consolidation [24, 25, 26]. A recent study in macaques showed diffuse fMRI neocortical activation and subcortical deactivation specifically after ripples [27]. Yet it is unclear whether ripples and other hippocampal neural events influence endogenous fluctuations in specific RSNs—like the DMN—unitarily. Here, we examine fMRI datasets from anesthetized monkeys with simultaneous hippocampal electrophysiology recordings, where we observe a dramatic increase in the DMN fMRI signal following ripples, but not following other hippocampal electrophysiological events. Crucially, we find increases in ongoing DMN activity after ripples, but not in other RSNs. Our results relate endogenous DMN fluctuations to hippocampal ripples, thereby linking network-level resting fMRI fluctuations with behaviorally relevant circuit-level neural dynamics. PMID:26898464

  2. Reorganization of functional brain networks mediates the improvement of cognitive performance following real-time neurofeedback training of working memory.

    PubMed

    Zhang, Gaoyan; Yao, Li; Shen, Jiahui; Yang, Yihong; Zhao, Xiaojie

    2015-05-01

    Working memory (WM) is essential for individuals' cognitive functions. Neuroimaging studies have indicated that WM fundamentally relies on a frontoparietal working memory network (WMN) and a cinguloparietal default mode network (DMN), and behavioral training studies have demonstrated that the two networks can be modulated by WM training. Unlike behavioral training, our recent study used a real-time functional MRI (rtfMRI)-based neurofeedback method to conduct WM training, demonstrating that WM performance can be significantly improved after successfully upregulating the activity of the target region of interest (ROI) in the left dorsolateral prefrontal cortex (Zhang et al., [2013]: PloS One 8:e73735); however, the neural substrate of rtfMRI-based WM training remains unclear. In this work, we assessed the intranetwork and internetwork connectivity changes of the WMN and DMN during training, and their correlations with the change of brain activity in the target ROI as well as with the improvement of post-training behavior. Our analysis revealed an "ROI-network-behavior" correlation relationship underlying the rtfMRI training. Further mediation analysis indicated that the reorganization of functional brain networks mediated the effect of self-regulation of the target brain activity on the improvement of cognitive performance following the neurofeedback training. These results enhance our understanding of the neural basis of real-time neurofeedback and suggest a new direction for improving WM performance by regulating functional connectivity in WM-related networks. © 2014 Wiley Periodicals, Inc.

  3. Changes in Neural Connectivity and Memory Following a Yoga Intervention for Older Adults: A Pilot Study.

    PubMed

    Eyre, Harris A; Acevedo, Bianca; Yang, Hongyu; Siddarth, Prabha; Van Dyk, Kathleen; Ercoli, Linda; Leaver, Amber M; Cyr, Natalie St; Narr, Katherine; Baune, Bernhard T; Khalsa, Dharma S; Lavretsky, Helen

    2016-01-01

    No study has explored the effect of yoga on cognitive decline and resting-state functional connectivity. This study explored the relationship between performance on memory tests and resting-state functional connectivity before and after a yoga intervention versus an active control for subjects with mild cognitive impairment (MCI). Participants (≥55 y) with MCI were randomized to receive a yoga intervention or active "gold-standard" control (i.e., memory enhancement training, MET) for 12 weeks. Resting-state functional magnetic resonance imaging was used to map correlations between brain networks and memory performance changes over time. The default mode network (DMN), language and superior parietal networks were chosen as networks of interest to analyze the association with changes in verbal and visuospatial memory performance. Fourteen yoga and 11 MET participants completed the study. The yoga group demonstrated a statistically significant improvement in depression and visuospatial memory. We observed improved verbal memory performance correlated with increased connectivity between the DMN and the frontal medial cortex, pregenual anterior cingulate cortex, right middle frontal cortex, posterior cingulate cortex, and left lateral occipital cortex. Improved verbal memory performance positively correlated with increased connectivity between the language processing network and the left inferior frontal gyrus. Improved visuospatial memory performance correlated inversely with connectivity between the superior parietal network and the medial parietal cortex. Yoga may be as effective as MET in improving functional connectivity in relation to verbal memory performance. These findings should be confirmed in larger prospective studies.

  4. Influence of aging on the neural correlates of autobiographical, episodic, and semantic memory retrieval.

    PubMed

    St-Laurent, Marie; Abdi, Hervé; Burianová, Hana; Grady, Cheryl L

    2011-12-01

    We used fMRI to assess the neural correlates of autobiographical, semantic, and episodic memory retrieval in healthy young and older adults. Participants were tested with an event-related paradigm in which retrieval demand was the only factor varying between trials. A spatio-temporal partial least square analysis was conducted to identify the main patterns of activity characterizing the groups across conditions. We identified brain regions activated by all three memory conditions relative to a control condition. This pattern was expressed equally in both age groups and replicated previous findings obtained in a separate group of younger adults. We also identified regions whose activity differentiated among the different memory conditions. These patterns of differentiation were expressed less strongly in the older adults than in the young adults, a finding that was further confirmed by a barycentric discriminant analysis. This analysis showed an age-related dedifferentiation in autobiographical and episodic memory tasks but not in the semantic memory task or the control condition. These findings suggest that the activation of a common memory retrieval network is maintained with age, whereas the specific aspects of brain activity that differ with memory content are more vulnerable and less selectively engaged in older adults. Our results provide a potential neural mechanism for the well-known age differences in episodic/autobiographical memory, and preserved semantic memory, observed when older adults are compared with younger adults.

  5. LiteNet: Lightweight Neural Network for Detecting Arrhythmias at Resource-Constrained Mobile Devices.

    PubMed

    He, Ziyang; Zhang, Xiaoqing; Cao, Yangjie; Liu, Zhi; Zhang, Bo; Wang, Xiaoyan

    2018-04-17

    By running applications and services closer to the user, edge processing provides many advantages, such as short response times and reduced network traffic. Deep-learning-based algorithms perform significantly better than traditional algorithms in many fields but demand more resources, such as higher computational power and more memory. Hence, designing deep-learning algorithms that are more suitable for resource-constrained mobile devices is vital. In this paper, we build a lightweight neural network, termed LiteNet, that uses a deep-learning algorithm design to diagnose arrhythmias, as an example of how to design deep-learning schemes for resource-constrained mobile devices. Compared to other deep-learning models of equivalent accuracy, LiteNet has several advantages: it requires less memory, incurs lower computational cost, and is more feasible for deployment on resource-constrained mobile devices. It can be trained faster than other neural network algorithms and requires less communication across processing units during distributed training. It uses filters of heterogeneous size in a convolutional layer, which contributes to the generation of varied feature maps. The algorithm was tested on the MIT-BIH electrocardiogram (ECG) arrhythmia database; the results showed that LiteNet outperforms comparable schemes in diagnosing arrhythmias and in its feasibility for use on mobile devices.
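
The heterogeneous-filter idea can be sketched directly: filters of different widths scan the same signal and their feature maps are concatenated, giving the layer views at several temporal scales. The widths (3 and 5) and averaging kernels below are arbitrary illustrative choices, not LiteNet's actual filters.

```python
import numpy as np

# 1D convolution with 'same' zero-padding so all feature maps have the
# input's length and can be stacked.
def conv1d_same(x, kernel):
    pad = len(kernel) // 2
    xp = np.pad(x, pad)
    return np.array([xp[i:i + len(kernel)] @ kernel for i in range(len(x))])

signal = np.arange(10, dtype=float)
small = np.ones(3) / 3.0          # width-3 filter: fine temporal detail
large = np.ones(5) / 5.0          # width-5 filter: coarser context
features = np.stack([conv1d_same(signal, small),
                     conv1d_same(signal, large)])   # shape (2, 10)
```

Each row of `features` is one filter's map over the same signal; a real layer would learn many such filters per width and stack them all.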

  6. LiteNet: Lightweight Neural Network for Detecting Arrhythmias at Resource-Constrained Mobile Devices

    PubMed Central

    Zhang, Xiaoqing; Cao, Yangjie; Liu, Zhi; Zhang, Bo; Wang, Xiaoyan

    2018-01-01

    By running applications and services closer to the user, edge processing provides many advantages, such as short response times and reduced network traffic. Deep-learning-based algorithms perform significantly better than traditional algorithms in many fields but demand more resources, such as higher computational power and more memory. Hence, designing deep-learning algorithms that are more suitable for resource-constrained mobile devices is vital. In this paper, we build a lightweight neural network, termed LiteNet, that uses a deep-learning algorithm design to diagnose arrhythmias, as an example of how to design deep-learning schemes for resource-constrained mobile devices. Compared to other deep-learning models of equivalent accuracy, LiteNet has several advantages: it requires less memory, incurs lower computational cost, and is more feasible for deployment on resource-constrained mobile devices. It can be trained faster than other neural network algorithms and requires less communication across processing units during distributed training. It uses filters of heterogeneous size in a convolutional layer, which contributes to the generation of varied feature maps. The algorithm was tested on the MIT-BIH electrocardiogram (ECG) arrhythmia database; the results showed that LiteNet outperforms comparable schemes in diagnosing arrhythmias and in its feasibility for use on mobile devices. PMID:29673171

  7. An introduction to deep learning on biological sequence data: examples and solutions.

    PubMed

    Jurtz, Vanessa Isabell; Johansen, Alexander Rosenberg; Nielsen, Morten; Almagro Armenteros, Jose Juan; Nielsen, Henrik; Sønderby, Casper Kaae; Winther, Ole; Sønderby, Søren Kaae

    2017-11-15

    Deep neural network architectures such as convolutional and long short-term memory networks have become increasingly popular machine learning tools in recent years. The availability of greater computational resources, more data, new algorithms for training deep models, and easy-to-use libraries for implementing and training neural networks are the drivers of this development. The use of deep learning has been especially successful in image recognition, and the development of tools, applications and code examples is in most cases centered within this field rather than within biology. Here, we aim to further the development of deep learning methods within biology by providing application examples and ready-to-apply, adaptable code templates. Given such examples, we illustrate how architectures consisting of convolutional and long short-term memory neural networks can relatively easily be designed and trained to state-of-the-art performance on three biological sequence problems: prediction of subcellular localization, protein secondary structure, and the binding of peptides to MHC Class II molecules. All implementations and datasets are available online to the scientific community at https://github.com/vanessajurtz/lasagne4bio. skaaesonderby@gmail.com. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
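
The conv-plus-LSTM pattern on one-hot sequence input can be sketched end-to-end in plain numpy. All shapes, random weights, and the toy DNA alphabet below are illustrative assumptions, not the architectures from the paper's templates.

```python
import numpy as np

rng = np.random.default_rng(1)

def one_hot(seq, alphabet="ACGT"):
    return np.eye(len(alphabet))[[alphabet.index(c) for c in seq]]

def conv1d(x, filters):
    # x: (L, in_ch), filters: (width, in_ch, out_ch) -> (L - width + 1, out_ch)
    w = filters.shape[0]
    return np.stack([np.tensordot(x[i:i + w], filters, axes=([0, 1], [0, 1]))
                     for i in range(x.shape[0] - w + 1)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_last_hidden(xs, Wx, Wh, b, n_hidden):
    # Standard LSTM cell; returns the final hidden state.
    h = c = np.zeros(n_hidden)
    for x in xs:
        z = Wx @ x + Wh @ h + b
        i, f, o, g = np.split(z, 4)            # input, forget, output, candidate
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
    return h

seq = one_hot("ACGTACGTAC")                               # (10, 4)
feats = np.maximum(conv1d(seq, rng.standard_normal((3, 4, 8))), 0)  # ReLU, (8, 8)
n_hidden = 6
h = lstm_last_hidden(feats,
                     rng.standard_normal((4 * n_hidden, 8)) * 0.1,
                     rng.standard_normal((4 * n_hidden, n_hidden)) * 0.1,
                     np.zeros(4 * n_hidden), n_hidden)
# h (length 6) would feed a dense classification layer.
```

The convolution extracts local motifs and the LSTM summarizes their order, which is the division of labor these architectures exploit on sequence problems.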

  8. A Neural Network Model of Individual Differences in Task Switching Abilities (Author’s Manuscript)

    DTIC Science & Technology

    2014-04-30

    risk for schizophrenia. Proceedings of the National Academy of Sciences of the United States of America. 2001; 98:6917–6922. [PubMed: 11381111] Engle, RW...interference in verbal working memory. Neuropsychology. 2006; 20:511–528. [PubMed: 16938014] Herd SA, Banich MT, O'Reilly RC. Neural mechanisms of cognitive

  9. Holographic neural networks versus conventional neural networks: a comparative evaluation for the classification of landmine targets in ground-penetrating radar images

    NASA Astrophysics Data System (ADS)

    Mudigonda, Naga R.; Kacelenga, Ray; Edwards, Mark

    2004-09-01

    This paper evaluates the performance of a holographic neural network in comparison with a conventional feedforward backpropagation neural network for the classification of landmine targets in ground penetrating radar images. The data used in the study were acquired from four different test sites using the landmine detection system developed by General Dynamics Canada Ltd., in collaboration with the Defense Research and Development Canada, Suffield. A set of seven features extracted for each detected alarm is used as stimulus inputs for the networks. The recall responses of the networks are then evaluated against the ground truth to declare true or false detections. The area under the receiver operating characteristic curve is used for comparative purposes. With a large dataset comprising data from multiple sites, both the holographic and conventional networks showed comparable trends in recall accuracies, with area values of 0.88 and 0.87, respectively. On independent validation datasets, the holographic network's generalization performance was observed to be better (mean area = 0.86) than that of the conventional network (mean area = 0.82). Despite the widely publicized theoretical advantages of the holographic technology, use of more than the required number of cortical memory elements resulted in over-fitting of the holographic network.
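    The comparison metric above, area under the ROC curve, can be computed directly from scored alarms via the Mann-Whitney rank statistic: the probability that a randomly chosen true target outscores a randomly chosen false alarm. The scores and labels below are made up for illustration.

```python
# Area under the ROC curve via the Mann-Whitney U statistic:
# AUC = P(score of a random positive > score of a random negative),
# with ties counted as half. Data below are illustrative only.

def roc_auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3]  # detector confidences (toy)
labels = [1,   1,   0,   1,   0,   0]    # ground truth (toy)
auc = roc_auc(scores, labels)            # equals 8/9 for these labels
```

A perfect detector gives AUC = 1.0 and chance performance gives 0.5, which is why the paper's 0.88 vs. 0.87 values indicate closely matched classifiers.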

  10. Natural Language Video Description using Deep Recurrent Neural Networks

    DTIC Science & Technology

    2015-11-23

    records who says what, but lacks timing information. Movie scripts typically include names of all characters and most movies loosely follow the...and Jürgen Schmidhuber. A novel approach to on-line handwriting recognition based on bidirectional long short-term memory networks. In Proc. 9th Int

  11. Unipolar Terminal-Attractor Based Neural Associative Memory with Adaptive Threshold

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang (Inventor); Barhen, Jacob (Inventor); Farhat, Nabil H. (Inventor); Wu, Chwan-Hwa (Inventor)

    1996-01-01

    A unipolar terminal-attractor based neural associative memory (TABAM) system with adaptive threshold for perfect convergence is presented. By adaptively setting the threshold values for the dynamic iteration of the unipolar binary neuron states with terminal attractors, so as to reduce the spurious states of a Hopfield associative-memory network, and by using the inner-product approach, perfect convergence and correct retrieval are achieved. Simulation is completed with a small number of stored states (M) and a small number of neurons (N) but a large M/N ratio. An experiment with optical exclusive-OR logic operation using LCTV SLMs shows the feasibility of optoelectronic implementation of the models. A complete inner-product TABAM is implemented using a PC for calculation of adaptive threshold values to achieve a unipolar TABAM (UIT) in the case where there is no crosstalk, and a crosstalk model (CRIT) in the case where crosstalk corrupts the desired state.
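    The thresholded inner-product recall at the heart of this record can be sketched for unipolar (0/1) patterns. The sketch below is a simplification: it keeps the inner-product weighting and an adaptively chosen threshold, but does not model the patented terminal-attractor dynamics, and the threshold rule (midpoint of the field's range) is an illustrative assumption.

```python
# Minimal sketch of inner-product associative recall for unipolar (0/1)
# patterns with an adaptive threshold. Terminal-attractor dynamics from
# the TABAM patent are NOT modeled -- this only illustrates the
# thresholded inner-product idea.

def recall(probe, stored_patterns, steps=5):
    n = len(probe)
    state = list(probe)
    for _ in range(steps):
        # Inner products weight each stored pattern's contribution.
        overlaps = [sum(p_i * s_i for p_i, s_i in zip(p, state))
                    for p in stored_patterns]
        field = [sum(o * p[i] for o, p in zip(overlaps, stored_patterns))
                 for i in range(n)]
        # Adaptive threshold: midpoint of this iteration's field range.
        theta = (max(field) + min(field)) / 2.0
        state = [1 if f > theta else 0 for f in field]
    return state

patterns = [[1, 0, 1, 0, 1, 0], [0, 1, 1, 0, 0, 1]]
noisy = [1, 0, 1, 1, 1, 0]  # first pattern with one bit flipped
restored = recall(noisy, patterns)
```

With the toy patterns above, the corrupted probe settles back onto the first stored pattern in a single iteration, illustrating why adapting the threshold per iteration helps suppress spurious states.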

  12. Unipolar terminal-attractor based neural associative memory with adaptive threshold

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang (Inventor); Barhen, Jacob (Inventor); Farhat, Nabil H. (Inventor); Wu, Chwan-Hwa (Inventor)

    1993-01-01

    A unipolar terminal-attractor based neural associative memory (TABAM) system with adaptive threshold for perfect convergence is presented. By adaptively setting the threshold values for the dynamic iteration of the unipolar binary neuron states with terminal attractors, so as to reduce the spurious states of a Hopfield associative-memory network, and by using the inner-product approach, perfect convergence and correct retrieval are achieved. Simulation is completed with a small number of stored states (M) and a small number of neurons (N) but a large M/N ratio. An experiment with optical exclusive-OR logic operation using LCTV SLMs shows the feasibility of optoelectronic implementation of the models. A complete inner-product TABAM is implemented using a PC for calculation of adaptive threshold values to achieve a unipolar TABAM (UIT) in the case where there is no crosstalk, and a crosstalk model (CRIT) in the case where crosstalk corrupts the desired state.

  13. Hippocampal-cortical interaction in decision making

    PubMed Central

    Yu, Jai Y.; Frank, Loren M.

    2014-01-01

    When making a decision it is often necessary to consider the available alternatives in order to choose the most appropriate option. This deliberative process, in which the pros and cons of each option are weighed, relies on memories of past actions and outcomes. The hippocampus and prefrontal cortex are required for memory encoding, memory retrieval and decision making, but it is unclear how these areas support deliberation. Here we examine the potential neural substrates of these processes in the rat. The rat is a powerful model for investigating the network mechanisms underlying deliberation in the mammalian brain, given the anatomical and functional conservation of its hippocampus and prefrontal cortex relative to other mammalian systems. Importantly, it is amenable to large-scale neural recording while performing laboratory tasks that exploit its natural decision-making behavior. Focusing on findings in the rat, we discuss how hippocampal-cortical interactions could provide a neural substrate for deliberative decision making. PMID:24530374

  14. Functional model of biological neural networks.

    PubMed

    Lo, James Ting-Ho

    2010-12-01

    A functional model of biological neural networks, called temporal hierarchical probabilistic associative memory (THPAM), is proposed in this paper. THPAM comprises functional models of dendritic trees for encoding inputs to neurons, a first type of neuron for generating spike trains, a second type of neuron for generating graded signals to modulate neurons of the first type, supervised and unsupervised Hebbian learning mechanisms for easy learning and retrieval, an arrangement of dendritic trees for maximizing generalization, hardwiring for rotation-translation-scaling invariance, and feedback connections with different delay durations for neurons to make full use of present and past information generated by neurons in the same and higher layers. These functional models and their processing operations have many functions of biological neural networks that have not been achieved by other models in the open literature and provide logically coherent answers to many long-standing neuroscientific questions. However, biological justifications of these functional models and their processing operations are required for THPAM to qualify as a macroscopic model (or low-order approximation) of biological neural networks.

  15. The functional neuroanatomy of multitasking: combining dual tasking with a short term memory task.

    PubMed

    Deprez, Sabine; Vandenbulcke, Mathieu; Peeters, Ron; Emsell, Louise; Amant, Frederic; Sunaert, Stefan

    2013-09-01

    Insight into the neural architecture of multitasking is crucial when investigating the pathophysiology of multitasking deficits in clinical populations. Presently, little is known about how the brain combines dual-tasking with a concurrent short-term memory task, despite the relevance of this mental operation in daily life and the frequency of complaints related to this process in disease. In this study we aimed to examine how the brain responds when a memory task is added to dual-tasking. Thirty-three right-handed healthy volunteers (20 females, mean age 39.9 ± 5.8) were examined with functional brain imaging (fMRI). The paradigm consisted of two cross-modal single tasks (a visual and an auditory temporal same-different task with short delay), a dual-task combining both single tasks simultaneously, and a multi-task condition combining the dual-task with an additional short-term memory task (a temporal same-different visual task with long delay). Dual-tasking compared to both the visual and the auditory single task activated a predominantly right-sided fronto-parietal network and the cerebellum. When the additional short-term memory task was added, a larger and more bilateral frontoparietal network was recruited. We found enhanced activity during multitasking in components of the network that were already involved in dual-tasking, suggesting increased working memory demands, as well as recruitment of multitask-specific components, including areas likely to be involved in online holding of visual stimuli in short-term memory, such as occipito-temporal cortex. These results confirm concurrent neural processing of a visual short-term memory task during dual-tasking and provide evidence for an effective fMRI multitasking paradigm. © 2013 Elsevier Ltd. All rights reserved.

  16. Working Memory-Related Effective Connectivity in Huntington's Disease Patients.

    PubMed

    Lahr, Jacob; Minkova, Lora; Tabrizi, Sarah J; Stout, Julie C; Klöppel, Stefan; Scheller, Elisa

    2018-01-01

    Huntington's disease (HD) is a genetically caused neurodegenerative disorder characterized by heterogeneous motor, psychiatric, and cognitive symptoms. Although motor symptoms may be the most prominent presentation, cognitive symptoms such as memory deficits and executive dysfunction typically co-occur. We used functional magnetic resonance imaging (fMRI) and task fMRI-based dynamic causal modeling (DCM) to evaluate HD-related changes in the neural network underlying working memory (WM). Sixty-four pre-symptomatic HD mutation carriers (preHD), 20 patients with early manifest HD symptoms (earlyHD), and 83 healthy control subjects performed an n-back fMRI task with two levels of WM load. Effective connectivity was assessed in five predefined regions of interest, comprising bilateral inferior parietal cortex, left anterior cingulate cortex, and bilateral dorsolateral prefrontal cortex. HD mutation carriers performed less accurately and more slowly at high WM load compared with the control group. While between-group comparisons of brain activation did not reveal differential recruitment of the cortical WM network in mutation carriers, comparisons of brain connectivity as identified with DCM revealed a number of group differences across the whole WM network. Most strikingly, we observed decreasing connectivity from several regions toward right dorsolateral prefrontal cortex (rDLPFC) in preHD and even more so in earlyHD. The deterioration in rDLPFC connectivity complements results from previous studies and might mirror beginning cortical neural decline at premanifest and early manifest stages of HD. We were able to characterize effective connectivity in a WM network of HD mutation carriers, yielding further insight into patterns of cognitive decline and accompanying neural deterioration.

  17. A spiking neural network model of model-free reinforcement learning with high-dimensional sensory input and perceptual ambiguity.

    PubMed

    Nakano, Takashi; Otsuka, Makoto; Yoshimoto, Junichiro; Doya, Kenji

    2015-01-01

    A theoretical framework of reinforcement learning plays an important role in understanding action selection in animals. Spiking neural networks provide a theoretically grounded means to test computational hypotheses on neurally plausible algorithms of reinforcement learning through numerical simulation. However, most of these models cannot handle observations that are noisy or that occurred in the past, even though these are inevitable and constraining features of learning in real environments. Such problems are formally known as partially observable reinforcement learning (PORL) problems, a generalization of reinforcement learning to partially observable domains. In addition, observations in the real world tend to be rich and high-dimensional. In this work, we use a spiking neural network model to approximate the free energy of a restricted Boltzmann machine and apply it to the solution of PORL problems with high-dimensional observations. Our spiking network model solves maze tasks with perceptually ambiguous high-dimensional observations without knowledge of the true environment. An extended model with working memory also solves history-dependent tasks. The way spiking neural networks handle PORL problems may provide a glimpse into the underlying laws of neural information processing which can only be discovered through such a top-down approach.

  18. A Spiking Neural Network Model of Model-Free Reinforcement Learning with High-Dimensional Sensory Input and Perceptual Ambiguity

    PubMed Central

    Nakano, Takashi; Otsuka, Makoto; Yoshimoto, Junichiro; Doya, Kenji

    2015-01-01

    A theoretical framework of reinforcement learning plays an important role in understanding action selection in animals. Spiking neural networks provide a theoretically grounded means to test computational hypotheses on neurally plausible algorithms of reinforcement learning through numerical simulation. However, most of these models cannot handle observations that are noisy or that occurred in the past, even though these are inevitable and constraining features of learning in real environments. Such problems are formally known as partially observable reinforcement learning (PORL) problems, a generalization of reinforcement learning to partially observable domains. In addition, observations in the real world tend to be rich and high-dimensional. In this work, we use a spiking neural network model to approximate the free energy of a restricted Boltzmann machine and apply it to the solution of PORL problems with high-dimensional observations. Our spiking network model solves maze tasks with perceptually ambiguous high-dimensional observations without knowledge of the true environment. An extended model with working memory also solves history-dependent tasks. The way spiking neural networks handle PORL problems may provide a glimpse into the underlying laws of neural information processing which can only be discovered through such a top-down approach. PMID:25734662

  19. Low-complexity object detection with deep convolutional neural network for embedded systems

    NASA Astrophysics Data System (ADS)

    Tripathi, Subarna; Kang, Byeongkeun; Dane, Gokce; Nguyen, Truong

    2017-09-01

    We investigate low-complexity convolutional neural networks (CNNs) for object detection in embedded vision applications. It is well known that implementing CNN-based object detection on an embedded system is more challenging than problems like image classification because of its computation and memory requirements. To meet these requirements, we design and develop an end-to-end, TensorFlow (TF)-based, fully-convolutional deep neural network for generic object detection, inspired by YOLO, one of the fastest frameworks. The proposed network predicts the localization of every object by regressing the coordinates of the corresponding bounding box, as in YOLO. Hence, the network can detect objects without any limitation on their size. However, unlike YOLO, all layers in the proposed network are convolutional, so it can take input images of any size. We pick face detection as a use case and evaluate the proposed model on the FDDB and Widerface datasets. As another use case of generic object detection, we evaluate its performance on the PASCAL VOC dataset. The experimental results demonstrate that the proposed network can predict object instances of different sizes and poses in a single frame. Moreover, the results show that the proposed method achieves accuracy comparable to state-of-the-art CNN-based object detection methods while reducing the model size by 3× and memory bandwidth by 3-4× compared with YOLO, one of the best real-time CNN-based object detectors. Our 8-bit fixed-point TF model provides an additional 4× memory reduction while keeping accuracy nearly as good as that of the floating-point model, and it achieves 20× faster inference than the floating-point model. Thus, the proposed method is promising for embedded implementations.
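    The 4× memory saving of an 8-bit fixed-point model over float32 weights comes from storing one byte per weight plus a shared scale. The paper does not spell out its quantization scheme, so the sketch below uses generic symmetric linear quantization as an illustration.

```python
# Symmetric linear 8-bit quantization of float weights -- a generic sketch
# of how a fixed-point model cuts weight storage 4x versus float32.
# (The paper's exact quantization scheme is not specified; this is an
# illustrative assumption.)

def quantize_int8(weights):
    """Map floats to int8 codes plus one shared scale factor."""
    scale = max(abs(w) for w in weights) / 127.0 or 1.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from the int8 codes."""
    return [v * scale for v in q]

w = [0.5, -1.27, 0.003, 1.0]       # toy weights
q, scale = quantize_int8(w)        # one byte each instead of four
w_hat = dequantize(q, scale)       # reconstruction error <= scale / 2
```

Each reconstructed weight is within half a quantization step of the original, which is why accuracy stays close to the floating-point model when the weight range is well behaved.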

  20. Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks.

    PubMed

    Zazo, Ruben; Lozano-Diez, Alicia; Gonzalez-Dominguez, Javier; Toledano, Doroteo T; Gonzalez-Rodriguez, Joaquin

    2016-01-01

    Long Short Term Memory (LSTM) Recurrent Neural Networks (RNNs) have recently outperformed other state-of-the-art approaches, such as i-vector and Deep Neural Networks (DNNs), in automatic Language Identification (LID), particularly when dealing with very short utterances (∼3s). In this contribution we present an open-source, end-to-end LSTM RNN system running on limited computational resources (a single GPU) that outperforms a reference i-vector system on a subset of the NIST Language Recognition Evaluation (8 target languages, 3s task) by up to 26%. This result is in line with previously published research using proprietary LSTM implementations and huge computational resources, which made those results hardly reproducible. Further, we extend those experiments to model unseen languages (out-of-set, OOS, modeling), which is crucial in real applications. Results show that an LSTM RNN with OOS modeling is able to detect these languages and generalizes robustly to unseen OOS languages. Finally, we analyze the effect of even more limited test data (from 2.25s down to 0.1s), showing that with as little as 0.5s an accuracy of over 50% can be achieved.

  1. Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks

    PubMed Central

    Zazo, Ruben; Lozano-Diez, Alicia; Gonzalez-Dominguez, Javier; T. Toledano, Doroteo; Gonzalez-Rodriguez, Joaquin

    2016-01-01

    Long Short Term Memory (LSTM) Recurrent Neural Networks (RNNs) have recently outperformed other state-of-the-art approaches, such as i-vector and Deep Neural Networks (DNNs), in automatic Language Identification (LID), particularly when dealing with very short utterances (∼3s). In this contribution we present an open-source, end-to-end LSTM RNN system running on limited computational resources (a single GPU) that outperforms a reference i-vector system on a subset of the NIST Language Recognition Evaluation (8 target languages, 3s task) by up to 26%. This result is in line with previously published research using proprietary LSTM implementations and huge computational resources, which made those results hardly reproducible. Further, we extend those experiments to model unseen languages (out-of-set, OOS, modeling), which is crucial in real applications. Results show that an LSTM RNN with OOS modeling is able to detect these languages and generalizes robustly to unseen OOS languages. Finally, we analyze the effect of even more limited test data (from 2.25s down to 0.1s), showing that with as little as 0.5s an accuracy of over 50% can be achieved. PMID:26824467

  2. Recruitment and Consolidation of Cell Assemblies for Words by Way of Hebbian Learning and Competition in a Multi-Layer Neural Network

    PubMed Central

    Garagnani, Max; Wennekers, Thomas; Pulvermüller, Friedemann

    2009-01-01

    Current cognitive theories postulate either localist representations of knowledge or fully overlapping, distributed ones. We use a connectionist model that closely replicates known anatomical properties of the cerebral cortex and neurophysiological principles to show that Hebbian learning in a multi-layer neural network leads to memory traces (cell assemblies) that are both distributed and anatomically distinct. Taking the example of word learning based on action-perception correlation, we document mechanisms underlying the emergence of these assemblies, especially (i) the recruitment of neurons and consolidation of connections defining the kernel of the assembly along with (ii) the pruning of the cell assembly's halo (consisting of very weakly connected cells). We found that, whereas a learning rule mapping covariance led to significant overlap and merging of assemblies, a neurobiologically grounded synaptic plasticity rule with fixed LTP/LTD thresholds produced minimal overlap and prevented merging, exhibiting competitive learning behaviour. Our results are discussed in light of current theories of language and memory. As the simulations with neurobiologically realistic neural networks presented here demonstrate the spontaneous emergence of lexical representations that are both cortically dispersed and anatomically distinct, both localist and distributed cognitive accounts receive partial support. PMID:20396612

  3. Recruitment and Consolidation of Cell Assemblies for Words by Way of Hebbian Learning and Competition in a Multi-Layer Neural Network.

    PubMed

    Garagnani, Max; Wennekers, Thomas; Pulvermüller, Friedemann

    2009-06-01

    Current cognitive theories postulate either localist representations of knowledge or fully overlapping, distributed ones. We use a connectionist model that closely replicates known anatomical properties of the cerebral cortex and neurophysiological principles to show that Hebbian learning in a multi-layer neural network leads to memory traces (cell assemblies) that are both distributed and anatomically distinct. Taking the example of word learning based on action-perception correlation, we document mechanisms underlying the emergence of these assemblies, especially (i) the recruitment of neurons and consolidation of connections defining the kernel of the assembly along with (ii) the pruning of the cell assembly's halo (consisting of very weakly connected cells). We found that, whereas a learning rule mapping covariance led to significant overlap and merging of assemblies, a neurobiologically grounded synaptic plasticity rule with fixed LTP/LTD thresholds produced minimal overlap and prevented merging, exhibiting competitive learning behaviour. Our results are discussed in light of current theories of language and memory. As the simulations with neurobiologically realistic neural networks presented here demonstrate the spontaneous emergence of lexical representations that are both cortically dispersed and anatomically distinct, both localist and distributed cognitive accounts receive partial support.

  4. Optoelectronic fuzzy associative memory with controllable attraction basin sizes

    NASA Astrophysics Data System (ADS)

    Wen, Zhiqing; Campbell, Scott; Wu, Weishu; Yeh, Pochi

    1995-10-01

    We propose and demonstrate a new fuzzy associative memory model that provides an option to control the sizes of the attraction basins in neural networks. In our optoelectronic implementation we use spatial/polarization encoding to represent the fuzzy variables. Shadow casting of the encoded patterns is employed to yield the fuzzy-absolute difference between fuzzy variables.

  5. The C. elegans Connectome Consists of Homogenous Circuits with Defined Functional Roles

    PubMed Central

    Azulay, Aharon; Zaslaver, Alon

    2016-01-01

    A major goal of systems neuroscience is to decipher the structure-function relationship in neural networks. Here we study network functionality in light of the common-neighbor-rule (CNR) in which a pair of neurons is more likely to be connected the more common neighbors it shares. Focusing on the fully-mapped neural network of C. elegans worms, we establish that the CNR is an emerging property in this connectome. Moreover, sets of common neighbors form homogenous structures that appear in defined layers of the network. Simulations of signal propagation reveal their potential functional roles: signal amplification and short-term memory at the sensory/inter-neuron layer, and synchronized activity at the motoneuron layer supporting coordinated movement. A coarse-grained view of the neural network based on homogenous connected sets alone reveals a simple modular network architecture that is intuitive to understand. These findings provide a novel framework for analyzing larger, more complex, connectomes once these become available. PMID:27606684
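    The common-neighbor rule above reduces to a simple count: for each pair of nodes, how many neighbors do they share? The sketch below computes that quantity on a toy undirected graph; the adjacency data is illustrative, not C. elegans connectome data.

```python
# Counting common neighbors for every node pair in a small undirected
# graph -- the quantity behind the common-neighbor rule (CNR), under which
# pairs sharing more neighbors are more likely to be connected.
# The toy adjacency sets below are illustrative only.

from itertools import combinations

graph = {
    "A": {"B", "C", "D", "E"},
    "B": {"A", "C", "D"},
    "C": {"A", "B"},
    "D": {"A", "B"},
    "E": {"A"},
}

def common_neighbors(graph):
    """Number of shared neighbors for every unordered node pair."""
    return {(u, v): len(graph[u] & graph[v])
            for u, v in combinations(sorted(graph), 2)}

counts = common_neighbors(graph)
```

In this toy graph, C and D share two neighbors (A and B) but are not yet connected, so the CNR would flag the C-D pair as a likely link; peripheral E shares no neighbors with anyone.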

  6. Learning Universal Computations with Spikes

    PubMed Central

    Thalmeier, Dominik; Uhlmann, Marvin; Kappen, Hilbert J.; Memmesheimer, Raoul-Martin

    2016-01-01

    Providing the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed reactions to inputs and the self-sustained generation of complex activity patterns, e.g. for locomotion. Many such computations require previous building of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. Firstly, we derive constraints under which classes of spiking neural networks lend themselves to substrates of powerful general purpose computing. The networks contain dendritic or synaptic nonlinearities and have a constrained connectivity. We then combine such networks with learning rules for outputs or recurrent connections. We show that this allows the networks to learn even difficult benchmark tasks such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them. PMID:27309381

  7. A biased competition account of attention and memory in Alzheimer's disease

    PubMed Central

    Finke, Kathrin; Myers, Nicholas; Bublak, Peter; Sorg, Christian

    2013-01-01

    The common view of Alzheimer's disease (AD) is that of an age-related memory disorder, i.e. declarative memory deficits are the first signs of the disease and associated with progressive brain changes in the medial temporal lobes and the default mode network. However, two findings challenge this view. First, new model-based tools of attention research have revealed that impaired selective attention accompanies memory deficits from early pre-dementia AD stages on. Second, very early distributed lesions of lateral parietal networks may cause these attention deficits by disrupting brain mechanisms underlying attentional biased competition. We suggest that memory and attention impairments might indicate disturbances of a common underlying neurocognitive mechanism. We propose a unifying account of impaired neural interactions within and across brain networks involved in attention and memory inspired by the biased competition principle. We specify this account at two levels of analysis: at the computational level, the selective competition of representations during both perception and memory is biased by AD-induced lesions; at the large-scale brain level, integration within and across intrinsic brain networks, which overlap in parietal and temporal lobes, is disrupted. This account integrates a large amount of previously unrelated findings of changed behaviour and brain networks and favours a brain mechanism-centred view on AD. PMID:24018724

  8. A biased competition account of attention and memory in Alzheimer's disease.

    PubMed

    Finke, Kathrin; Myers, Nicholas; Bublak, Peter; Sorg, Christian

    2013-10-19

    The common view of Alzheimer's disease (AD) is that of an age-related memory disorder, i.e. declarative memory deficits are the first signs of the disease and associated with progressive brain changes in the medial temporal lobes and the default mode network. However, two findings challenge this view. First, new model-based tools of attention research have revealed that impaired selective attention accompanies memory deficits from early pre-dementia AD stages on. Second, very early distributed lesions of lateral parietal networks may cause these attention deficits by disrupting brain mechanisms underlying attentional biased competition. We suggest that memory and attention impairments might indicate disturbances of a common underlying neurocognitive mechanism. We propose a unifying account of impaired neural interactions within and across brain networks involved in attention and memory inspired by the biased competition principle. We specify this account at two levels of analysis: at the computational level, the selective competition of representations during both perception and memory is biased by AD-induced lesions; at the large-scale brain level, integration within and across intrinsic brain networks, which overlap in parietal and temporal lobes, is disrupted. This account integrates a large amount of previously unrelated findings of changed behaviour and brain networks and favours a brain mechanism-centred view on AD.

  9. Estimation of effective connectivity using multi-layer perceptron artificial neural network.

    PubMed

    Talebi, Nasibeh; Nasrabadi, Ali Motie; Mohammad-Rezazadeh, Iman

    2018-02-01

    Studies on interactions between brain regions estimate effective connectivity, usually based on causality inferences made from temporal precedence. In this study, the causal relationship is modeled by a multi-layer perceptron feed-forward artificial neural network, because of the ANN's ability to generate an appropriate input-output mapping and to learn from training examples without detailed knowledge of the underlying system. At any time instant, past samples of the data are placed at the network input, and the subsequent values are predicted at its output. To estimate the strength of interactions, a "causality coefficient" measure is defined based on the network structure, the connecting weights, and the parameters of the hidden-layer activation function. Simulation analysis demonstrates that the method, called CREANN (Causal Relationship Estimation by Artificial Neural Network), can estimate time-invariant and time-varying effective connectivity in terms of MVAR coefficients. The method is robust with respect to the noise level of the data. Furthermore, the estimates are not significantly influenced by the model order (the considered time lag) or by different initial conditions (initial random weights and parameters of the network). CREANN is also applied to EEG data collected during a memory recognition task. The results indicate that it can reveal changes in information flow between brain regions involved in the episodic memory retrieval process. These convincing results emphasize that CREANN can be used as an appropriate method to estimate causal relationships among brain signals.
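    Before CREANN's network can be trained, past samples of both series must be arranged into input/target pairs: samples of x and y at lags 1..p go in, the current y comes out. The sketch below shows only that lag-embedding step; the lag order p=2 and the toy series are illustrative assumptions.

```python
# Building lag-embedded input/target pairs from two time series, the
# preprocessing step before training a feed-forward network to predict
# one series from the past of both (as in CREANN-style causality
# estimation). Lag order p=2 and the toy data are illustrative.

def lag_embed(x, y, p):
    """Return (inputs, targets): inputs[t] holds x and y at lags 1..p,
    targets[t] is y at time t."""
    inputs, targets = [], []
    for t in range(p, len(y)):
        past = ([x[t - k] for k in range(1, p + 1)] +
                [y[t - k] for k in range(1, p + 1)])
        inputs.append(past)
        targets.append(y[t])
    return inputs, targets

x = [1, 2, 3, 4, 5]   # candidate driver series (toy)
y = [0, 1, 0, 1, 0]   # predicted series (toy)
ins, outs = lag_embed(x, y, p=2)
```

If a trained network's prediction of y degrades when the x lags are withheld, the temporal-precedence logic in the abstract attributes a causal influence of x on y.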

  10. Spatiotemporal Recurrent Convolutional Networks for Traffic Prediction in Transportation Networks

    PubMed Central

    Yu, Haiyang; Wu, Zhihai; Wang, Shuqin; Wang, Yunpeng; Ma, Xiaolei

    2017-01-01

    Predicting large-scale transportation network traffic has become an important and challenging topic in recent decades. Inspired by the domain knowledge of motion prediction, in which the future motion of an object can be predicted based on previous scenes, we propose a network grid representation method that can retain the fine-scale structure of a transportation network. Network-wide traffic speeds are converted into a series of static images and input into a novel deep architecture, namely, spatiotemporal recurrent convolutional networks (SRCNs), for traffic forecasting. The proposed SRCNs inherit the advantages of deep convolutional neural networks (DCNNs) and long short-term memory (LSTM) neural networks. The spatial dependencies of network-wide traffic can be captured by DCNNs, and the temporal dynamics can be learned by LSTMs. An experiment on a Beijing transportation network with 278 links demonstrates that SRCNs outperform other deep learning-based algorithms in both short-term and long-term traffic prediction. PMID:28672867

  11. Spatiotemporal Recurrent Convolutional Networks for Traffic Prediction in Transportation Networks.

    PubMed

    Yu, Haiyang; Wu, Zhihai; Wang, Shuqin; Wang, Yunpeng; Ma, Xiaolei

    2017-06-26

    Predicting large-scale transportation network traffic has become an important and challenging topic in recent decades. Inspired by the domain knowledge of motion prediction, in which the future motion of an object can be predicted based on previous scenes, we propose a network grid representation method that can retain the fine-scale structure of a transportation network. Network-wide traffic speeds are converted into a series of static images and input into a novel deep architecture, namely, spatiotemporal recurrent convolutional networks (SRCNs), for traffic forecasting. The proposed SRCNs inherit the advantages of deep convolutional neural networks (DCNNs) and long short-term memory (LSTM) neural networks. The spatial dependencies of network-wide traffic can be captured by DCNNs, and the temporal dynamics can be learned by LSTMs. An experiment on a Beijing transportation network with 278 links demonstrates that SRCNs outperform other deep learning-based algorithms in both short-term and long-term traffic prediction.
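
    The network-grid representation step — converting network-wide link speeds into a static "image" — can be sketched as follows. This is a minimal sketch under assumed inputs (hypothetical link records with coordinates already normalized to [0, 1)); the paper's actual rasterization may differ.

```python
# Hypothetical link records: (x, y, speed_kmh) with coordinates in [0, 1).
links = [(0.2, 0.7, 45.0), (0.9, 0.1, 30.0), (0.5, 0.5, 60.0), (0.52, 0.51, 20.0)]

ROWS, COLS = 4, 4

def to_grid(links, rows=ROWS, cols=COLS):
    """Rasterize link speeds into a rows x cols 'image': cells covering
    several links keep the mean speed, empty cells stay at 0."""
    total = [[0.0] * cols for _ in range(rows)]
    count = [[0] * cols for _ in range(rows)]
    for x, y, v in links:
        r = min(int(y * rows), rows - 1)
        c = min(int(x * cols), cols - 1)
        total[r][c] += v
        count[r][c] += 1
    return [[total[r][c] / count[r][c] if count[r][c] else 0.0
             for c in range(cols)] for r in range(rows)]

grid = to_grid(links)
```

    A sequence of such grids over successive time steps forms the image series that a DCNN+LSTM stack like the SRCN would consume.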

  12. Modulating Hippocampal Plasticity with In Vivo Brain Stimulation

    DTIC Science & Technology

    2015-09-16

    persists in the Schaffer collateral–CA1 region of the hippocampus. NMDA-dependent LTP has been shown to be essential for learning and memory... S114–S121. CrossRef Medline Neves G, Cooke SF, Bliss TV (2008) Synaptic plasticity, memory and the hippocampus: a neural network approach to causality... and memory. Understanding such molecular effects will lead to a better understanding of the mechanisms by which brain stimulation produces its effects

  13. Minimal perceptrons for memorizing complex patterns

    NASA Astrophysics Data System (ADS)

    Pastor, Marissa; Song, Juyong; Hoang, Danh-Tai; Jo, Junghyo

    2016-11-01

    Feedforward neural networks have been investigated to understand learning and memory, as well as applied to numerous practical problems in pattern classification. It is a rule of thumb that more complex tasks require larger networks. However, the design of optimal network architectures for specific tasks is still an unsolved fundamental problem. In this study, we consider three-layered neural networks for memorizing binary patterns. We developed a new complexity measure of binary patterns, and estimated the minimal network size for memorizing them as a function of their complexity. We formulated the minimal network size for regular, random, and complex patterns. In particular, the minimal size for complex patterns, which are neither ordered nor disordered, was predicted by measuring their Hamming distances from known ordered patterns. Our predictions agree with simulations based on the back-propagation algorithm.
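
    The idea of scoring a binary pattern by its Hamming distance from known ordered patterns can be sketched in a few lines. The reference set here (all-zeros, all-ones, and the two alternating patterns) is an assumption for illustration, not the paper's actual complexity measure.

```python
def hamming(p, q):
    """Number of positions where two equal-length patterns differ."""
    return sum(a != b for a, b in zip(p, q))

def complexity(pattern):
    """Toy stand-in for a pattern-complexity score: distance to the
    nearest 'ordered' reference pattern."""
    n = len(pattern)
    ordered = [
        [0] * n,                       # constant patterns
        [1] * n,
        [i % 2 for i in range(n)],     # alternating patterns
        [(i + 1) % 2 for i in range(n)],
    ]
    return min(hamming(pattern, ref) for ref in ordered)
```

    Perfectly ordered patterns score 0; fully random patterns land near the maximum distance from every reference, mirroring the ordered/disordered/complex distinction the abstract draws.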

  14. Feedback control stabilization of critical dynamics via resource transport on multilayer networks: How glia enable learning dynamics in the brain

    NASA Astrophysics Data System (ADS)

    Virkar, Yogesh S.; Shew, Woodrow L.; Restrepo, Juan G.; Ott, Edward

    2016-10-01

    Learning and memory are acquired through long-lasting changes in synapses. In the simplest models, such synaptic potentiation typically leads to runaway excitation, but in reality there must exist processes that robustly preserve overall stability of the neural system dynamics. How is this accomplished? Various approaches to this basic question have been considered. Here we propose a particularly compelling and natural mechanism for preserving stability of learning neural systems. This mechanism is based on the global processes by which metabolic resources are distributed to the neurons by glial cells. Specifically, we introduce and study a model composed of two interacting networks: a model neural network interconnected by synapses that undergo spike-timing-dependent plasticity; and a model glial network interconnected by gap junctions that diffusively transport metabolic resources among the glia and, ultimately, to neural synapses where they are consumed. Our main result is that the biophysical constraints imposed by diffusive transport of metabolic resources through the glial network can prevent runaway growth of synaptic strength, both during ongoing activity and during learning. Our findings suggest a previously unappreciated role for glial transport of metabolites in the feedback control stabilization of neural network dynamics during learning.
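
    The stabilization mechanism can be illustrated with a toy simulation (assumed dynamics, not the paper's spiking model): potentiation consumes a metabolic resource that glia resupply at a fixed rate, so synaptic growth saturates instead of running away.

```python
# Toy resource-limited potentiation. All constants are illustrative.
STEPS, N_SYN = 200, 5
SUPPLY = 1.0      # resource units delivered by the glial network per step
COST = 4.0        # resource units consumed per unit of potentiation
weights = [1.0] * N_SYN
resource = 10.0   # initial metabolic reserve

for _ in range(STEPS):
    resource += SUPPLY
    for i in range(N_SYN):
        dw = 0.1                          # desired potentiation this step
        dw = min(dw, resource / COST)     # rationed by available resource
        weights[i] += dw
        resource -= dw * COST

unconstrained = N_SYN * (1.0 + 0.1 * STEPS)   # growth with no resource limit
total = sum(weights)
```

    Once the reserve is exhausted, total potentiation per step is capped at SUPPLY/COST, which is the feedback-control effect the paper attributes to diffusive glial transport.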

  15. A Long Short-Term Memory deep learning network for the prediction of epileptic seizures using EEG signals.

    PubMed

    Tsiouris, Kostas M; Pezoulas, Vasileios C; Zervakis, Michalis; Konitsiotis, Spiros; Koutsouris, Dimitrios D; Fotiadis, Dimitrios I

    2018-05-17

    The electroencephalogram (EEG) is the most prominent means to study epilepsy and capture changes in electrical brain activity that could declare an imminent seizure. In this work, Long Short-Term Memory (LSTM) networks are introduced for epileptic seizure prediction using EEG signals, expanding the use of deep learning algorithms beyond convolutional neural networks (CNNs). A pre-analysis is initially performed to find the optimal architecture of the LSTM network by testing several modules and layers of memory units. Based on these results, a two-layer LSTM network is selected to evaluate seizure prediction performance using four different lengths of preictal windows, ranging from 15 min to 2 h. The LSTM model exploits a wide range of features extracted prior to classification, including time- and frequency-domain features, cross-correlation between EEG channels, and graph-theoretic features. The evaluation, performed using long-term EEG recordings from the open CHB-MIT Scalp EEG database, suggests that the proposed methodology is able to predict all 185 seizures, providing high seizure prediction sensitivity and low false prediction rates (FPR) of 0.11-0.02 false alarms per hour, depending on the duration of the preictal window. The proposed LSTM-based methodology delivers a significant increase in seizure prediction performance compared to both traditional machine learning techniques and convolutional neural networks that have been previously evaluated in the literature. Copyright © 2018 Elsevier Ltd. All rights reserved.
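
    One of the listed feature families — cross-correlation between EEG channels — can be sketched as a maximum normalized cross-correlation over a small lag range. This is a generic formulation; the paper's exact feature definitions are not reproduced here.

```python
def xcorr_max(a, b, max_lag=3):
    """Maximum normalized cross-correlation between two channels over
    lags in [-max_lag, max_lag]; 1.0 means perfectly aligned signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    a = [v - ma for v in a]               # remove channel means
    b = [v - mb for v in b]
    norm = (sum(v * v for v in a) * sum(v * v for v in b)) ** 0.5
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        s = sum(a[i] * b[i + lag] for i in range(n) if 0 <= i + lag < n)
        best = max(best, s / norm)
    return best

sig = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]
score = xcorr_max(sig, sig)
```

    Computed over every channel pair within a preictal window, such scores form one slice of the feature vector fed to the LSTM classifier.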

  16. The mammillary bodies and memory: more than a hippocampal relay

    PubMed Central

    Vann, Seralynne D.; Nelson, Andrew J.D.

    2015-01-01

    Although the mammillary bodies were one of the first neural structures to be implicated in memory, it has long been assumed that their main function was to act primarily as a hippocampal relay, passing information on to the anterior thalamic nuclei and from there to the cingulate cortex. This view not only afforded the mammillary bodies no independent role in memory, it also neglected the potential significance of other, nonhippocampal, inputs to the mammillary bodies. Recent advances have transformed the picture, revealing that projections from the tegmental nuclei of Gudden, and not the hippocampal formation, are critical for sustaining mammillary body function. By uncovering a role for the mammillary bodies that is independent of its subicular inputs, this work signals the need to consider a wider network of structures that form the neural bases of episodic memory. PMID:26072239

  17. A loop-based neural architecture for structured behavior encoding and decoding.

    PubMed

    Gisiger, Thomas; Boukadoum, Mounir

    2018-02-01

    We present a new type of artificial neural network that generalizes on anatomical and dynamical aspects of the mammalian brain. Its main novelty lies in its topological structure, which is built as an array of interacting elementary motifs shaped like loops. These loops come in various types and can implement functions such as gating, inhibitory or executive control, or encoding of task elements, to name a few. Each loop features two sets of neurons and a control region, linked together by non-recurrent projections. The two neural sets do the bulk of the loop's computations while the control unit specifies the timing and the conditions under which the computations implemented by the loop are to be performed. By functionally linking many such loops together, a neural network is obtained that may perform complex cognitive computations. To demonstrate the potential offered by such a system, we present two neural network simulations. The first illustrates the structure and dynamics of a single loop implementing a simple gating mechanism. The second simulation shows how connecting four loops in series can produce neural activity patterns that are sufficient to pass a simplified delayed-response task. We also show that this network reproduces electrophysiological measurements gathered in various regions of the brain of monkeys performing similar tasks. We further demonstrate connections between this type of neural network and recurrent or long short-term memory network models, and suggest ways to generalize them for future artificial intelligence research. Copyright © 2017 Elsevier Ltd. All rights reserved.
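
    The elementary loop motif — two neural sets doing the computation while a control region decides whether the result passes — can be caricatured in a few lines. The weights and ReLU units here are hypothetical and purely illustrative.

```python
def loop_step(inp, control_open, w_a=0.9, w_b=1.1):
    """One pass through a toy 'loop': two neural sets transform the input,
    and the control region gates whether the result is emitted."""
    a = max(0.0, w_a * inp)               # first neural set (ReLU unit)
    b = max(0.0, w_b * a)                 # second neural set
    return b if control_open else 0.0     # control region gates the output

open_out = loop_step(1.0, True)
closed_out = loop_step(1.0, False)
```

    Chaining several such calls, with each loop's control input driven by another loop's output, is the kind of composition the paper uses to pass its delayed-response task.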

  18. Impact of Linearity and Write Noise of Analog Resistive Memory Devices in a Neural Algorithm Accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobs-Gedrim, Robin B.; Agarwal, Sapan; Knisely, Kathrine E.

    Resistive memory (ReRAM) shows promise for use as an analog synapse element in energy-efficient neural network algorithm accelerators. A particularly important application is the training of neural networks, as this is the most computationally-intensive procedure in using a neural algorithm. However, training a network with analog ReRAM synapses can significantly reduce the accuracy at the algorithm level. In order to assess this degradation, analog properties of ReRAM devices were measured and hand-written digit recognition accuracy was modeled for the training using backpropagation. Bipolar filamentary devices utilizing three material systems were measured and compared: one oxygen vacancy system, Ta-TaOx, and two conducting metallization systems, Cu-SiO2 and Ag/chalcogenide. Analog properties and conductance ranges of the devices are optimized by measuring the response to varying voltage pulse characteristics. Key analog device properties which degrade the accuracy are update linearity and write noise. Write noise may improve as a function of device manufacturing maturity, but write nonlinearity appears relatively consistent among the different device material systems and is found to be the most significant factor affecting accuracy. As a result, this suggests that new materials and/or fundamentally different resistive switching mechanisms may be required to improve device linearity and achieve higher algorithm training accuracy.

  19. Impact of Linearity and Write Noise of Analog Resistive Memory Devices in a Neural Algorithm Accelerator

    DOE PAGES

    Jacobs-Gedrim, Robin B.; Agarwal, Sapan; Knisely, Kathrine E.; ...

    2017-12-01

    Resistive memory (ReRAM) shows promise for use as an analog synapse element in energy-efficient neural network algorithm accelerators. A particularly important application is the training of neural networks, as this is the most computationally-intensive procedure in using a neural algorithm. However, training a network with analog ReRAM synapses can significantly reduce the accuracy at the algorithm level. In order to assess this degradation, analog properties of ReRAM devices were measured and hand-written digit recognition accuracy was modeled for the training using backpropagation. Bipolar filamentary devices utilizing three material systems were measured and compared: one oxygen vacancy system, Ta-TaOx, and two conducting metallization systems, Cu-SiO2 and Ag/chalcogenide. Analog properties and conductance ranges of the devices are optimized by measuring the response to varying voltage pulse characteristics. Key analog device properties which degrade the accuracy are update linearity and write noise. Write noise may improve as a function of device manufacturing maturity, but write nonlinearity appears relatively consistent among the different device material systems and is found to be the most significant factor affecting accuracy. As a result, this suggests that new materials and/or fundamentally different resistive switching mechanisms may be required to improve device linearity and achieve higher algorithm training accuracy.
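
    The update nonlinearity the study identifies can be modeled with a commonly assumed form: each potentiating pulse moves the conductance a fraction of the remaining distance to Gmax, so early pulses overshoot and late pulses undershoot an ideal linear ramp. This is an illustrative model, not measured device data.

```python
GMIN, GMAX, PULSES = 0.0, 1.0, 50

def nonlinear_updates(alpha):
    """Assumed device response: step size shrinks as g approaches GMAX."""
    g, trace = GMIN, []
    for _ in range(PULSES):
        g += alpha * (GMAX - g)
        trace.append(g)
    return trace

def linear_updates():
    """Ideal synapse: every pulse adds the same conductance step."""
    g, trace = GMIN, []
    step = (GMAX - GMIN) / PULSES
    for _ in range(PULSES):
        g += step
        trace.append(g)
    return trace

nl = nonlinear_updates(0.15)
lin = linear_updates()
```

    Because backpropagation assumes each programmed weight change lands where requested, the gap between the two traces is what degrades training accuracy in the paper's modeling.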

  20. Visual Working Memory Capacity: From Psychophysics and Neurobiology to Individual Differences

    PubMed Central

    Luck, Steven J.; Vogel, Edward K.

    2013-01-01

    Visual working memory capacity is of great interest because it is strongly correlated with overall cognitive ability, can be understood at the level of neural circuits, and is easily measured. Recent studies have shown that capacity influences tasks ranging from saccade targeting to analogical reasoning. A debate has arisen over whether capacity is constrained by a limited number of discrete representations or by an infinitely divisible resource, but the empirical evidence and neural network models currently favor a discrete item limit. Capacity differs markedly across individuals and groups, and recent research indicates that some of these differences reflect true differences in storage capacity whereas others reflect variations in the ability to use memory capacity efficiently. PMID:23850263

  1. A genetic algorithm for optimization of neural network capable of learning to search for food in a maze

    NASA Astrophysics Data System (ADS)

    Budilova, E. V.; Terekhin, A. T.; Chepurnov, S. A.

    1994-09-01

    A hypothetical neural scheme is proposed that ensures efficient decision making by an animal searching for food in a maze. Only the general structure of the network is fixed; its quantitative characteristics are found by numerical optimization that simulates the process of natural selection. Selection is aimed at maximization of the expected number of descendants, which is directly related to the energy stored during the reproductive cycle. The main parameters to be optimized are the increments of the interneuronal links and the working-memory constants.
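
    The optimization loop — selection on a fitness standing in for the expected number of descendants, acting on a small set of network parameters — can be sketched generically. The two-parameter genome, target values, and mutation scale below are hypothetical.

```python
import random

random.seed(2)

TARGET = (0.3, 0.7)   # hypothetical optimal link increment and memory constant

def fitness(genome):
    """Stand-in for 'expected number of descendants': closer to TARGET
    means more stored energy, hence higher fitness."""
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

# Initial population of candidate parameter pairs.
pop = [[random.uniform(0, 1), random.uniform(0, 1)] for _ in range(30)]

for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                    # truncation selection
    pop = [list(p) for p in parents]      # elitism: keep parents
    while len(pop) < 30:
        p = random.choice(parents)
        child = [min(1.0, max(0.0, g + random.gauss(0, 0.05))) for g in p]
        pop.append(child)

best = max(pop, key=fitness)
```

    In the paper the fitness evaluation would itself run the maze-searching network; here a known optimum replaces that inner simulation so the selection mechanics stay visible.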

  2. The cognitive structural approach for image restoration

    NASA Astrophysics Data System (ADS)

    Mardare, Igor; Perju, Veacheslav; Casasent, David

    2008-03-01

    The important and timely problem of restoring defective images of scenes is analyzed. The proposed approach restores scenes with a system that reproduces phenomena of human intelligence used in the restoration and recognition of images. Cognitive models of the restoration process are elaborated. The models are realized by intellectual processors built from neural networks and associative memory, using the neural network simulator NNToolbox from MATLAB 7.0. The models provide restoration and semantic design of scene images from defective images of the separate objects.

  3. Cognitive Control Network Contributions to Memory-Guided Visual Attention.

    PubMed

    Rosen, Maya L; Stern, Chantal E; Michalka, Samantha W; Devaney, Kathryn J; Somers, David C

    2016-05-01

    Visual attentional capacity is severely limited, but humans excel in familiar visual contexts, in part because long-term memories guide efficient deployment of attention. To investigate the neural substrates that support memory-guided visual attention, we performed a set of functional MRI experiments that contrast long-term, memory-guided visuospatial attention with stimulus-guided visuospatial attention in a change detection task. Whereas the dorsal attention network was activated for both forms of attention, the cognitive control network (CCN) was preferentially activated during memory-guided attention. Three posterior nodes in the CCN (posterior precuneus, posterior callosal sulcus/mid-cingulate, and lateral intraparietal sulcus) exhibited the greatest specificity for memory-guided attention. These 3 regions exhibit functional connectivity at rest, and we propose that they form a subnetwork within the broader CCN. Based on the task activation patterns, we conclude that the nodes of this subnetwork are preferentially recruited for long-term memory guidance of visuospatial attention. Published by Oxford University Press 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  4. Comparison between sparsely distributed memory and Hopfield-type neural network models

    NASA Technical Reports Server (NTRS)

    Keeler, James D.

    1986-01-01

    The Sparsely Distributed Memory (SDM) model (Kanerva, 1984) is compared to Hopfield-type neural-network models. A mathematical framework for comparing the two is developed, and the capacity of each model is investigated. The capacity of the SDM can be increased independently of the dimension of the stored vectors, whereas the Hopfield capacity is limited to a fraction of this dimension. However, the total number of stored bits per matrix element is the same in the two models, as well as for extended models with higher order interactions. The models are also compared in their ability to store sequences of patterns. The SDM is extended to include time delays so that contextual information can be used to cover sequences. Finally, it is shown how a generalization of the SDM allows storage of correlated input pattern vectors.
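
    The Hopfield side of the comparison can be demonstrated in a few lines: with the Hebbian outer-product rule, a network of N units reliably corrects noisy probes as long as the number of stored patterns stays well below the roughly 0.14N capacity limit the abstract alludes to. This is the standard construction; the parameters are chosen for illustration.

```python
import random

random.seed(1)

N = 64  # neurons, states in {-1, +1}

def train(patterns):
    """Hebbian outer-product rule with zero diagonal."""
    W = [[0.0] * N for _ in range(N)]
    for p in patterns:
        for i in range(N):
            for j in range(N):
                if i != j:
                    W[i][j] += p[i] * p[j] / N
    return W

def recall(W, state, steps=5):
    """Synchronous sign updates until (hopefully) a stored fixed point."""
    for _ in range(steps):
        state = [1 if sum(W[i][j] * state[j] for j in range(N)) >= 0 else -1
                 for i in range(N)]
    return state

patterns = [[random.choice([-1, 1]) for _ in range(N)] for _ in range(3)]
W = train(patterns)

probe = list(patterns[0])
for k in range(5):        # corrupt 5 of 64 bits
    probe[k] = -probe[k]
recovered = recall(W, probe)
```

    Three patterns in 64 units is a load of about 0.05N, comfortably inside capacity; pushing the count toward 0.14N makes recall increasingly unreliable, which is the limit the SDM comparison is made against.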

  5. How Does the Sparse Memory “Engram” Neurons Encode the Memory of a Spatial–Temporal Event?

    PubMed Central

    Guan, Ji-Song; Jiang, Jun; Xie, Hong; Liu, Kai-Yuan

    2016-01-01

    Episodic memory in the human brain is not a fixed 2-D picture but a highly dynamic movie series, integrating information in both the temporal and the spatial domains. Recent studies in neuroscience reveal that memory storage and recall are closely related to the activities in discrete memory engram (trace) neurons within the dentate gyrus region of the hippocampus and layer 2/3 of the neocortex. More strikingly, optogenetic reactivation of those memory trace neurons is able to trigger the recall of naturally encoded memory. It is still unknown how the discrete memory traces encode and reactivate the memory. Considering that a particular memory normally represents a natural event, which consists of information in both the temporal and spatial domains, it is unknown how the discrete trace neurons could reconstitute such enriched information in the brain. Furthermore, as recall of memory induced by optogenetic stimuli did not depend on the firing pattern of the memory traces, it is most likely the spatial activation pattern, not the temporal activation pattern, of the discrete memory trace neurons that encodes the memory in the brain. How does the neural circuit convert the activities in the spatial domain into the temporal domain to reconstitute the memory of a natural event? By reviewing the literature, here we present how the memory engram (trace) neurons are selected and consolidated in the brain. Then, we will discuss the main challenges in the memory trace theory. In the end, we will provide a plausible model of a memory trace cell network, underlying the conversion of neural activities between the spatial domain and the temporal domain. We will also discuss how the activation of sparse memory trace neurons might trigger the replay of neural activities in specific temporal patterns. PMID:27601979

  6. How Does the Sparse Memory "Engram" Neurons Encode the Memory of a Spatial-Temporal Event?

    PubMed

    Guan, Ji-Song; Jiang, Jun; Xie, Hong; Liu, Kai-Yuan

    2016-01-01

    Episodic memory in the human brain is not a fixed 2-D picture but a highly dynamic movie series, integrating information in both the temporal and the spatial domains. Recent studies in neuroscience reveal that memory storage and recall are closely related to the activities in discrete memory engram (trace) neurons within the dentate gyrus region of the hippocampus and layer 2/3 of the neocortex. More strikingly, optogenetic reactivation of those memory trace neurons is able to trigger the recall of naturally encoded memory. It is still unknown how the discrete memory traces encode and reactivate the memory. Considering that a particular memory normally represents a natural event, which consists of information in both the temporal and spatial domains, it is unknown how the discrete trace neurons could reconstitute such enriched information in the brain. Furthermore, as recall of memory induced by optogenetic stimuli did not depend on the firing pattern of the memory traces, it is most likely the spatial activation pattern, not the temporal activation pattern, of the discrete memory trace neurons that encodes the memory in the brain. How does the neural circuit convert the activities in the spatial domain into the temporal domain to reconstitute the memory of a natural event? By reviewing the literature, here we present how the memory engram (trace) neurons are selected and consolidated in the brain. Then, we will discuss the main challenges in the memory trace theory. In the end, we will provide a plausible model of a memory trace cell network, underlying the conversion of neural activities between the spatial domain and the temporal domain. We will also discuss how the activation of sparse memory trace neurons might trigger the replay of neural activities in specific temporal patterns.

  7. Kanerva's sparse distributed memory: An associative memory algorithm well-suited to the Connection Machine

    NASA Technical Reports Server (NTRS)

    Rogers, David

    1988-01-01

    The advent of the Connection Machine profoundly changes the world of supercomputers. The highly nontraditional architecture makes possible the exploration of algorithms that were impractical for standard Von Neumann architectures. Sparse distributed memory (SDM) is an example of such an algorithm. Sparse distributed memory is a particularly simple and elegant formulation for an associative memory. The foundations for sparse distributed memory are described, and some simple examples of using the memory are presented. The relationship of sparse distributed memory to three important computational systems is shown: random-access memory, neural networks, and the cerebellum of the brain. Finally, the implementation of the algorithm for sparse distributed memory on the Connection Machine is discussed.
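
    The core SDM read/write cycle described above is simple enough to sketch directly: a write distributes the word into counters at every hard location within a Hamming radius of the address, and a read sums and thresholds those counters. This is the standard formulation; the sizes and radius below are illustrative choices.

```python
import random

random.seed(0)

N = 32    # address/word length in bits
M = 300   # hard locations
R = 13    # activation radius (Hamming distance)

addresses = [[random.randint(0, 1) for _ in range(N)] for _ in range(M)]
counters = [[0] * N for _ in range(M)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def write(addr, word):
    """Add the word into every hard location within radius R of addr."""
    for m in range(M):
        if hamming(addresses[m], addr) <= R:
            for i in range(N):
                counters[m][i] += 1 if word[i] else -1

def read(addr):
    """Sum counters over activated locations and threshold bitwise."""
    sums = [0] * N
    for m in range(M):
        if hamming(addresses[m], addr) <= R:
            for i in range(N):
                sums[i] += counters[m][i]
    return [1 if s > 0 else 0 for s in sums]

word = [random.randint(0, 1) for _ in range(N)]
write(word, word)          # autoassociative store
noisy = list(word)
for k in range(2):         # corrupt 2 address bits
    noisy[k] ^= 1
recovered = read(noisy)
```

    Because every hard location is visited independently, both loops map naturally onto one-processor-per-location hardware, which is the Connection Machine fit the abstract highlights.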

  8. Spreading Activation in an Attractor Network with Latching Dynamics: Automatic Semantic Priming Revisited

    ERIC Educational Resources Information Center

    Lerner, Itamar; Bentin, Shlomo; Shriki, Oren

    2012-01-01

    Localist models of spreading activation (SA) and models assuming distributed representations offer very different takes on semantic priming, a widely investigated paradigm in word recognition and semantic memory research. In this study, we implemented SA in an attractor neural network model with distributed representations and created a unified…

  9. Combining neural networks and signed particles to simulate quantum systems more efficiently

    NASA Astrophysics Data System (ADS)

    Sellier, Jean Michel

    2018-04-01

    Recently a new formulation of quantum mechanics has been suggested which describes systems by means of ensembles of classical particles provided with a sign. This novel approach mainly consists of two steps: the computation of the Wigner kernel, a multi-dimensional function describing the effects of the potential over the system, and the field-less evolution of the particles, which eventually create new signed particles in the process. Although this method has proved to be extremely advantageous in terms of computational resources - as a matter of fact it is able to simulate in a time-dependent fashion many-body systems on relatively small machines - the Wigner kernel can represent the bottleneck of simulations of certain systems. Moreover, storing the kernel can be another issue as the amount of memory needed is cursed by the dimensionality of the system. In this work, we introduce a new technique which drastically reduces the computation time and memory requirement to simulate time-dependent quantum systems, based on the use of an appropriately tailored neural network combined with the signed particle formalism. In particular, the suggested neural network is able to compute efficiently and reliably the Wigner kernel without any training, as its entire set of weights and biases is specified by analytical formulas. As a consequence, the amount of memory for quantum simulations radically drops since the kernel does not need to be stored anymore: it is now computed by the neural network itself, only on the cells of the (discretized) phase-space which are occupied by particles. As is clearly shown in the final part of this paper, this novel approach not only drastically reduces the computational time, it also remains accurate. The author believes this work opens the way towards effective design of quantum devices, with incredible practical implications.

  10. Robust state estimation for uncertain fuzzy bidirectional associative memory networks with time-varying delays

    NASA Astrophysics Data System (ADS)

    Vadivel, P.; Sakthivel, R.; Mathiyalagan, K.; Arunkumar, A.

    2013-09-01

    This paper addresses the issue of robust state estimation for a class of fuzzy bidirectional associative memory (BAM) neural networks with time-varying delays and parameter uncertainties. By constructing the Lyapunov-Krasovskii functional, which contains the triple-integral term and using the free-weighting matrix technique, a set of sufficient conditions are derived in terms of linear matrix inequalities (LMIs) to estimate the neuron states through available output measurements such that the dynamics of the estimation error system is robustly asymptotically stable. In particular, we consider a generalized activation function in which the traditional assumptions on the boundedness, monotony and differentiability of the activation functions are removed. More precisely, the design of the state estimator for such BAM neural networks can be obtained by solving some LMIs, which are dependent on the size of the time derivative of the time-varying delays. Finally, a numerical example with simulation result is given to illustrate the obtained theoretical results.

  11. Changes in Neural Connectivity and Memory Following a Yoga Intervention for Older Adults: A Pilot Study

    PubMed Central

    Eyre, Harris A.; Acevedo, Bianca; Yang, Hongyu; Siddarth, Prabha; Van Dyk, Kathleen; Ercoli, Linda; Leaver, Amber M.; Cyr, Natalie St.; Narr, Katherine; Baune, Bernhard T.; Khalsa, Dharma S.; Lavretsky, Helen

    2016-01-01

    Background: No study has explored the effect of yoga on cognitive decline and resting-state functional connectivity. Objectives: This study explored the relationship between performance on memory tests and resting-state functional connectivity before and after a yoga intervention versus active control for subjects with mild cognitive impairment (MCI). Methods: Participants (≥55 y) with MCI were randomized to receive a yoga intervention or active “gold-standard” control (i.e., memory enhancement training (MET)) for 12 weeks. Resting-state functional magnetic resonance imaging was used to map correlations between brain networks and memory performance changes over time. Default mode networks (DMN), language and superior parietal networks were chosen as networks of interest to analyze the association with changes in verbal and visuospatial memory performance. Results: Fourteen yoga and 11 MET participants completed the study. The yoga group demonstrated a statistically significant improvement in depression and visuospatial memory. We observed improved verbal memory performance correlated with increased connectivity between the DMN and frontal medial cortex, pregenual anterior cingulate cortex, right middle frontal cortex, posterior cingulate cortex, and left lateral occipital cortex. Improved verbal memory performance positively correlated with increased connectivity between the language processing network and the left inferior frontal gyrus. Improved visuospatial memory performance correlated inversely with connectivity between the superior parietal network and the medial parietal cortex. Conclusion: Yoga may be as effective as MET in improving functional connectivity in relation to verbal memory performance. These findings should be confirmed in larger prospective studies. PMID:27060939

  12. Review On Applications Of Neural Network To Computer Vision

    NASA Astrophysics Data System (ADS)

    Li, Wei; Nasrabadi, Nasser M.

    1989-03-01

    Neural network models have many potential applications to computer vision due to their parallel structures, learnability, implicit representation of domain knowledge, fault tolerance, and ability to handle statistical data. This paper demonstrates the basic principles, typical models and their applications in this field. A variety of neural models, such as associative memory, multilayer back-propagation perceptron, self-stabilized adaptive resonance network, hierarchical structured neocognitron, high order correlator, network with gating control and other models, can be applied to visual signal recognition, reinforcement, recall, stereo vision, motion, object tracking and other vision processes. Most of the algorithms have been simulated on computers. Some have been implemented with special hardware. Some systems use features, such as edges and profiles, of images as the data form for input. Other systems use raw data as input signals to the networks. We will present some novel ideas contained in these approaches and provide a comparison of these methods. Some unsolved problems are mentioned, such as extracting the intrinsic properties of the input information, integrating those low level functions to a high-level cognitive system, achieving invariances and other problems. Perspectives of applications of some human vision models and neural network models are analyzed.

  13. Visual Working Memory Load-Related Changes in Neural Activity and Functional Connectivity

    PubMed Central

    Li, Ling; Zhang, Jin-Xiang; Jiang, Tao

    2011-01-01

    Background Visual working memory (VWM) helps us store visual information to prepare for subsequent behavior. The neuronal mechanisms for sustaining coherent visual information and the mechanisms for limited VWM capacity have remained uncharacterized. Although numerous studies have utilized behavioral accuracy, neural activity, and connectivity to explore the mechanism of VWM retention, little is known about the load-related changes in functional connectivity for hemi-field VWM retention. Methodology/Principal Findings In this study, we recorded electroencephalography (EEG) from 14 normal young adults while they performed a bilateral visual field memory task. Subjects had more rapid and accurate responses to the left visual field (LVF) memory condition. The difference in mean amplitude between the ipsilateral and contralateral event-related potential (ERP) at parietal-occipital electrodes during the retention interval was obtained with six different memory loads. Functional connectivity between 128 scalp regions was measured by EEG phase synchronization in the theta- (4–8 Hz), alpha- (8–12 Hz), beta- (12–32 Hz), and gamma- (32–40 Hz) frequency bands. The resulting matrices were converted to graphs, and mean degree, clustering coefficient, and shortest path length were computed as a function of memory load. The results showed that brain networks of the theta-, alpha-, beta-, and gamma-frequency bands were load-dependent and visual-field dependent. Theta- and alpha-band phase-synchrony networks were more predominant during the retention period for right visual field (RVF) WM than for LVF WM. Furthermore, only for the RVF memory condition, theta-band network density during the retention interval was linked to delayed behavioral reaction times, and the topological properties of the alpha-band network were negatively correlated with behavioral accuracy. 
Conclusions/Significance We suggest that the differences between the LVF and RVF conditions in theta- and alpha-band functional connectivity and topological properties during the retention period may underlie the decline in behavioral performance in the RVF task. PMID:21789253
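The analysis pipeline this abstract describes (band-limited phase synchronization, a thresholded connectivity matrix, then graph statistics such as mean degree and clustering coefficient) can be sketched in a few lines. Everything below is illustrative only: the synthetic phase signals, the 6-channel layout, and the 0.5 phase-locking threshold are invented for the sketch, not taken from the study.

```python
import numpy as np

def plv(pa, pb):
    """Phase-locking value between two phase time series (0 = none, 1 = perfect)."""
    return np.abs(np.mean(np.exp(1j * (pa - pb))))

def graph_metrics(adj):
    """Mean degree and mean clustering coefficient of a binary undirected graph."""
    n = adj.shape[0]
    mean_degree = adj.sum(axis=1).mean()
    cc = np.zeros(n)
    for i in range(n):
        nbrs = np.flatnonzero(adj[i])
        k = len(nbrs)
        if k >= 2:
            triangles = adj[np.ix_(nbrs, nbrs)].sum() / 2   # edges among neighbours
            cc[i] = triangles / (k * (k - 1) / 2)
    return mean_degree, cc.mean()

rng = np.random.default_rng(0)
n_ch, n_t = 6, 500
# toy phases: channels 0-2 follow a common driver; channels 3-5 drift freely
driver = np.cumsum(rng.normal(0.1, 0.01, n_t))
phases = np.vstack(
    [driver + rng.normal(0, 0.2, n_t) for _ in range(3)]
    + [np.cumsum(rng.normal(0.1, 0.2, n_t)) for _ in range(3)])
P = np.array([[plv(phases[i], phases[j]) for j in range(n_ch)] for i in range(n_ch)])
adj = ((P > 0.5) & ~np.eye(n_ch, dtype=bool)).astype(int)   # threshold is arbitrary
print("mean degree, mean clustering:", graph_metrics(adj))
```

In a real EEG analysis the phases would come from band-pass filtering and an analytic-signal transform of the recorded channels rather than being simulated.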

  14. Working Memory after Traumatic Brain Injury: The Neural Basis of Improved Performance with Methylphenidate.

    PubMed

    Manktelow, Anne E; Menon, David K; Sahakian, Barbara J; Stamatakis, Emmanuel A

    2017-01-01

    Traumatic brain injury (TBI) often results in cognitive impairments for patients. The aim of this proof of concept study was to establish the nature of abnormalities, in terms of activity and connectivity, in the working memory network of TBI patients and how these relate to compromised behavioral outcomes. Further, this study examined the neural correlates of working memory improvement following the administration of methylphenidate. We report behavioral, functional and structural MRI data from a group of 15 Healthy Controls (HC) and a group of 15 TBI patients, acquired during the execution of the N-back task. The patients were studied on two occasions after the administration of either placebo or 30 mg of methylphenidate. Between-group tests revealed a significant difference in performance when HCs were compared to TBI patients on placebo [F(1, 28) = 4.426, p < 0.05, ηp² = 0.136]. This difference disappeared when the patients took methylphenidate [F(1, 28) = 3.665, p = 0.066]. Patients in the middle range of baseline performance demonstrated the most benefit from methylphenidate. Changes in the TBI patient activation levels in the Left Cerebellum significantly and positively correlated with changes in performance (r = 0.509, df = 13, p = 0.05). Whole-brain connectivity analysis using the Left Cerebellum as a seed revealed widespread negative interactions between the Left Cerebellum and parietal and frontal cortices as well as subcortical areas. Neither the TBI group on methylphenidate nor the HC group demonstrated any significant negative interactions. Our findings indicate that (a) TBI significantly reduces the levels of activation and connectivity strength between key areas of the working memory network and (b) methylphenidate improves the cognitive outcomes on a working memory task. Therefore, we conclude that methylphenidate may render the working memory network in a TBI group more consistent with that of an intact working memory network.

  15. Cross-Modal Decoding of Neural Patterns Associated with Working Memory: Evidence for Attention-Based Accounts of Working Memory

    PubMed Central

    Majerus, Steve; Cowan, Nelson; Péters, Frédéric; Van Calster, Laurens; Phillips, Christophe; Schrouff, Jessica

    2016-01-01

    Recent studies suggest common neural substrates involved in verbal and visual working memory (WM), interpreted as reflecting shared attention-based, short-term retention mechanisms. We used a machine-learning approach to determine more directly the extent to which common neural patterns characterize retention in verbal WM and visual WM. Verbal WM was assessed via a standard delayed probe recognition task for letter sequences of variable length. Visual WM was assessed via a visual array WM task involving the maintenance of variable amounts of visual information in the focus of attention. We trained a classifier to distinguish neural activation patterns associated with high- and low-visual WM load and tested the ability of this classifier to predict verbal WM load (high–low) from their associated neural activation patterns, and vice versa. We observed significant between-task prediction of load effects during WM maintenance, in posterior parietal and superior frontal regions of the dorsal attention network; in contrast, between-task prediction in sensory processing cortices was restricted to the encoding stage. Furthermore, between-task prediction of load effects was strongest in those participants presenting the highest capacity for the visual WM task. This study provides novel evidence for common, attention-based neural patterns supporting verbal and visual WM. PMID:25146374
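The between-task prediction logic of this study (train a classifier on one task's load conditions, test it on the other's) can be illustrated with a toy decoder. The nearest-centroid classifier and the simulated "shared load axis" below are stand-ins for the study's machine-learning pipeline; voxel counts, effect sizes, and noise levels are all invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n_vox = 40
# hypothetical shared "load axis": high load shifts activity along w in BOTH tasks
w = rng.normal(size=n_vox)
w /= np.linalg.norm(w)

def make_patterns(n, load):
    """Simulated activation patterns for one task at a given load (0 low, 1 high)."""
    return 2.0 * load * w + rng.normal(0.0, 1.0, size=(n, n_vox))

X_vis, y_vis = np.vstack([make_patterns(30, 0), make_patterns(30, 1)]), np.repeat([0, 1], 30)
X_verb, y_verb = np.vstack([make_patterns(30, 0), make_patterns(30, 1)]), np.repeat([0, 1], 30)

# train a nearest-centroid decoder on visual WM load...
centroids = np.array([X_vis[y_vis == c].mean(axis=0) for c in (0, 1)])
# ...and test it on verbal WM load (between-task prediction; chance = 0.5)
pred = ((X_verb[:, None, :] - centroids[None]) ** 2).sum(-1).argmin(axis=1)
acc = (pred == y_verb).mean()
print(f"cross-task decoding accuracy: {acc:.2f}")
```

Above-chance cross-task accuracy in this toy setup arises only because both tasks share the load axis w, which is the intuition behind interpreting such results as evidence for a common retention mechanism.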

  16. Top-down and bottom-up attention-to-memory: mapping functional connectivity in two distinct networks that underlie cued and uncued recognition memory.

    PubMed

    Burianová, Hana; Ciaramelli, Elisa; Grady, Cheryl L; Moscovitch, Morris

    2012-11-15

    The objective of this study was to examine the functional connectivity of brain regions active during cued and uncued recognition memory to test the idea that distinct networks would underlie these memory processes, as predicted by the attention-to-memory (AtoM) hypothesis. The AtoM hypothesis suggests that dorsal parietal cortex (DPC) allocates effortful top-down attention to memory retrieval during cued retrieval, whereas ventral parietal cortex (VPC) mediates spontaneous bottom-up capture of attention by memory during uncued retrieval. To identify networks associated with these two processes, we conducted a functional connectivity analysis of a left DPC and a left VPC region, both identified by a previous analysis of task-related regional activations. We hypothesized that the two parietal regions would be functionally connected with distinct neural networks, reflecting their engagement in the differential mnemonic processes. We found two spatially dissociated networks that overlapped only in the precuneus. During cued trials, DPC was functionally connected with dorsal attention areas, including the superior parietal lobules, right precuneus, and premotor cortex, as well as relevant memory areas, such as the left hippocampus and the middle frontal gyri. During uncued trials, VPC was functionally connected with ventral attention areas, including the supramarginal gyrus, cuneus, and right fusiform gyrus, as well as the parahippocampal gyrus. In addition, activity in the DPC network was associated with faster response times for cued retrieval. This is the first study to show a dissociation of the functional connectivity of posterior parietal regions during episodic memory retrieval, characterized by a top-down AtoM network involving DPC and a bottom-up AtoM network involving VPC. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Learning, memory, and the role of neural network architecture.

    PubMed

    Hermundstad, Ann M; Brown, Kevin S; Bassett, Danielle S; Carlson, Jean M

    2011-06-01

    The performance of information processing systems, from artificial neural networks to natural neuronal ensembles, depends heavily on the underlying system architecture. In this study, we compare the performance of parallel and layered network architectures during sequential tasks that require both acquisition and retention of information, thereby identifying tradeoffs between learning and memory processes. During the task of supervised, sequential function approximation, networks produce and adapt representations of external information. Performance is evaluated by statistically analyzing the error in these representations while varying the initial network state, the structure of the external information, and the time given to learn the information. We link performance to complexity in network architecture by characterizing local error landscape curvature. We find that variations in error landscape structure give rise to tradeoffs in performance; these include the ability of the network to maximize accuracy versus minimize inaccuracy and produce specific versus generalizable representations of information. Parallel networks generate smooth error landscapes with deep, narrow minima, enabling them to find highly specific representations given sufficient time. While accurate, however, these representations are difficult to generalize. In contrast, layered networks generate rough error landscapes with a variety of local minima, allowing them to quickly find coarse representations. Although less accurate, these representations are easily adaptable. The presence of measurable performance tradeoffs in both layered and parallel networks has implications for understanding the behavior of a wide variety of natural and artificial learning systems.

  18. [Advances in Acupuncture Mechanism Research on the Changes of Synaptic Plasticity: "Pain Memory" for Chronic Pain].

    PubMed

    Yang, Yi-Ling; Huang, Jian-Peng; Jiang, Li; Liu, Jian-Hua

    2017-12-25

    Previous studies have shown that there are many common structures between the neural networks of pain and memory, and that the main structures in the pain network are also part of the memory network. Chronic pain is characterized by recurrent attacks and is associated with persistent ectopic impulses, which cause changes in synaptic structure and function based on nerve activity. These changes may induce long-term potentiation of synaptic transmission, and ultimately lead to changes in the central nervous system that produce "pain memory". Acupuncture is an effective method for treating chronic pain. It has been shown that acupuncture can affect the spinal cord dorsal horn, hippocampus, cingulate gyrus, and other related areas. The possible mechanisms of action include opioid-induced analgesia, activation of glial cells, and the expression of brain-derived neurotrophic factor (BDNF). In this study, we systematically review the brain structures and stages of "pain memory", and the mechanisms by which acupuncture affects synaptic plasticity in chronic pain.

  19. Cellular computational platform and neurally inspired elements thereof

    DOEpatents

    Okandan, Murat

    2016-11-22

    A cellular computational platform is disclosed that includes a multiplicity of functionally identical, repeating computational hardware units that are interconnected electrically and optically. Each computational hardware unit includes a reprogrammable local memory and has interconnections to other such units that have reconfigurable weights. Each computational hardware unit is configured to transmit signals into the network for broadcast in a protocol-less manner to other such units in the network, and to respond to protocol-less broadcast messages that it receives from the network. Each computational hardware unit is further configured to reprogram the local memory in response to incoming electrical and/or optical signals.

  20. The Role of Episodic Memory in Controlled Evaluative Judgments about Attitudes: An Event-Related Potential Study

    ERIC Educational Resources Information Center

    Johnson, Ray, Jr.; Simon, Elizabeth J.; Henkell, Heather; Zhu, John

    2011-01-01

    Event-related potentials (ERPs) are unique in their ability to provide information about the timing of activity in the neural networks that perform complex cognitive processes. Given the dearth of extant data from normal controls on the question of whether attitude representations are stored in episodic or semantic memory, the goal here was to…

  1. A Neural Network Model of the Effects of Entrenchment and Memory Development on Grammatical Gender Learning

    ERIC Educational Resources Information Center

    Monner, Derek; Vatz, Karen; Morini, Giovanna; Hwang, So-One; DeKeyser, Robert

    2013-01-01

    To investigate potential causes of L2 performance deficits that correlate with age of onset, we use a computational model to explore the individual contributions of L1 entrenchment and aspects of memory development. Since development and L1 entrenchment almost invariably coincide, studying them independently is seldom possible in humans. To avoid…

  2. Algorithm for optimizing bipolar interconnection weights with applications in associative memories and multitarget classification.

    PubMed

    Chang, S; Wong, K W; Zhang, W; Zhang, Y

    1999-08-10

    An algorithm for optimizing a bipolar interconnection weight matrix with the Hopfield network is proposed. The effectiveness of this algorithm is demonstrated by computer simulation and optical implementation. In the optical implementation of the neural network the interconnection weights are biased to yield a nonnegative weight matrix. Moreover, a threshold subchannel is added so that the system can realize, in real time, the bipolar weighted summation in a single channel. Preliminary experimental results obtained from the applications in associative memories and multitarget classification with rotation invariance are shown.
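The nonnegative biasing trick described in this abstract can be sketched directly. A plain Hebbian rule stands in for the paper's optimization algorithm; the point of the sketch is the identity that the "threshold subchannel" realizes optically, namely that a nonnegative matrix W + c plus subtraction of c·Σx reproduces the bipolar weighted sum W·x:

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 64, 4
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian bipolar interconnection weights (a stand-in for the paper's
# optimized matrix), with zero diagonal
W = (patterns.T @ patterns).astype(float) / N
np.fill_diagonal(W, 0.0)

# bias the weights to be nonnegative, as required by an intensity-based
# optical implementation
c = -W.min()                 # offset making W_pos = W + c >= 0
W_pos = W + c

def bipolar_sum(x):
    return W_pos @ x - c * x.sum()   # algebraically equal to W @ x

x = patterns[0].copy()
x[:6] *= -1                  # corrupt 6 of 64 bits
for _ in range(10):          # synchronous Hopfield recall
    x = np.where(bipolar_sum(x) >= 0, 1, -1)
print("fraction of bits recovered:", (x == patterns[0]).mean())
```

The subtraction term c·Σx is what the separate threshold subchannel computes in real time in the optical system, so that only nonnegative transmittances are ever needed.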

  4. Past makes future: role of pFC in prediction.

    PubMed

    Fuster, Joaquín M; Bressler, Steven L

    2015-04-01

    The pFC enables the essential human capacities for predicting future events and preadapting to them. These capacities rest on both the structure and dynamics of the human pFC. Structurally, pFC, together with posterior association cortex, is at the highest hierarchical level of cortical organization, harboring neural networks that represent complex goal-directed actions. Dynamically, pFC is at the highest level of the perception-action cycle, the circular processing loop through the cortex that interfaces the organism with the environment in the pursuit of goals. In its predictive and preadaptive roles, pFC supports cognitive functions that are critical for the temporal organization of future behavior, including planning, attentional set, working memory, decision-making, and error monitoring. These functions have a common future perspective and are dynamically intertwined in goal-directed action. They all utilize the same neural infrastructure: a vast array of widely distributed, overlapping, and interactive cortical networks of personal memory and semantic knowledge, named cognits, which are formed by synaptic reinforcement in learning and memory acquisition. From this cortex-wide reservoir of memory and knowledge, pFC generates purposeful, goal-directed actions that are preadapted to predicted future events.

  5. Optimal Design for Hetero-Associative Memory: Hippocampal CA1 Phase Response Curve and Spike-Timing-Dependent Plasticity

    PubMed Central

    Miyata, Ryota; Ota, Keisuke; Aonishi, Toru

    2013-01-01

    Recently reported experimental findings suggest that the hippocampal CA1 network stores spatio-temporal spike patterns and retrieves temporally reversed and spread-out patterns. In this paper, we explore the idea that the properties of the neural interactions and the synaptic plasticity rule in the CA1 network enable it to function as a hetero-associative memory recalling such reversed and spread-out spike patterns. In line with Lengyel’s speculation (Lengyel et al., 2005), we first derive optimally designed spike-timing-dependent plasticity (STDP) rules that are matched to neural interactions formalized in terms of phase response curves (PRCs) for performing the hetero-associative memory function. By maximizing objective functions formulated in terms of mutual information for evaluating memory retrieval performance, we search for STDP window functions that are optimal for the retrieval of normal and doubly spread-out patterns under the constraint that the PRCs are those of CA1 pyramidal neurons. The system, which can retrieve normal and doubly spread-out patterns, can also retrieve reversed patterns with the same quality. Finally, we demonstrate that the purposely designed STDP window functions qualitatively conform to typical ones found in CA1 pyramidal neurons. PMID:24204822

  6. Verbal working memory-related neural network communication in schizophrenia.

    PubMed

    Kustermann, Thomas; Popov, Tzvetan; Miller, Gregory A; Rockstroh, Brigitte

    2018-04-19

    Impaired working memory (WM) in schizophrenia is associated with reduced hemodynamic and electromagnetic activity and altered network connectivity within and between memory-associated neural networks. The present study sought to determine whether schizophrenia involves disruption of a frontal-parietal network normally supporting WM and/or involvement of another brain network. Nineteen schizophrenia patients (SZ) and 19 healthy comparison subjects (HC) participated in a cued visual-verbal Sternberg task while dense-array EEG was recorded. A pair of item arrays each consisting of 2-4 consonants was presented bilaterally for 200 ms with a prior cue signaling the hemifield of the task-relevant WM set. A central probe letter 2,000 ms later prompted a choice reaction time decision about match/mismatch with the target WM set. Group and WM load effects on time domain and time-frequency domain 11-15 Hz alpha power were assessed for the cue-to-probe time window, and posterior 11-15 Hz alpha power and frontal 4-8 Hz theta power were assessed during the retention period. Directional connectivity was estimated via Granger causality, evaluating group differences in communication. SZ showed slower responding, lower accuracy, smaller overall time-domain alpha power increase, and less load-dependent alpha power increase. Midline frontal theta power increases did not vary by group or load. Network communication in SZ was characterized by temporal-to-posterior information flow, in contrast to bidirectional temporal-posterior communication in HC. Results indicate aberrant WM network activity supporting WM in SZ that might facilitate normal load-dependent and only marginally less accurate task performance, despite generally slower responding. © 2018 Society for Psychophysiological Research.

  7. Intrusive Images in Psychological Disorders

    PubMed Central

    Brewin, Chris R.; Gregory, James D.; Lipton, Michelle; Burgess, Neil

    2010-01-01

    Involuntary images and visual memories are prominent in many types of psychopathology. Patients with posttraumatic stress disorder, other anxiety disorders, depression, eating disorders, and psychosis frequently report repeated visual intrusions corresponding to a small number of real or imaginary events, usually extremely vivid, detailed, and with highly distressing content. Both memory and imagery appear to rely on common networks involving medial prefrontal regions, posterior regions in the medial and lateral parietal cortices, the lateral temporal cortex, and the medial temporal lobe. Evidence from cognitive psychology and neuroscience implies distinct neural bases to abstract, flexible, contextualized representations (C-reps) and to inflexible, sensory-bound representations (S-reps). We revise our previous dual representation theory of posttraumatic stress disorder to place it within a neural systems model of healthy memory and imagery. The revised model is used to explain how the different types of distressing visual intrusions associated with clinical disorders arise, in terms of the need for correct interaction between the neural systems supporting S-reps and C-reps via visuospatial working memory. Finally, we discuss the treatment implications of the new model and relate it to existing forms of psychological therapy. PMID:20063969

  8. Critical branching neural networks.

    PubMed

    Kello, Christopher T

    2013-01-01

    It is now well-established that intrinsic variations in human neural and behavioral activity tend to exhibit scaling laws in their fluctuations and distributions. The meaning of these scaling laws is an ongoing matter of debate between isolable causes versus pervasive causes. A spiking neural network model is presented that self-tunes to critical branching and, in doing so, simulates observed scaling laws as pervasive to neural and behavioral activity. These scaling laws are related to neural and cognitive functions, in that critical branching is shown to yield spiking activity with maximal memory and encoding capacities when analyzed using reservoir computing techniques. The model is also shown to account for findings of pervasive 1/f scaling in speech and cued response behaviors that are difficult to explain by isolable causes. Issues and questions raised by the model and its results are discussed from the perspectives of physics, neuroscience, computer and information sciences, and psychological and cognitive sciences.
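The notion of critical branching, where each spike triggers on average exactly one descendant spike, can be illustrated with a simple Galton-Watson branching process. This is a generic sketch of the criticality idea (subcritical activity dies out; critical activity yields heavy-tailed avalanche sizes), not the paper's self-tuning spiking network model; the offspring distribution, trial count, and size cap are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(4)

def avalanche_sizes(sigma, n_trials=2000, cap=10_000):
    """Total spikes per avalanche in a Galton-Watson branching process with
    Poisson(sigma) offspring per spike; sigma = 1 is the critical point."""
    sizes = []
    for _ in range(n_trials):
        active, total = 1, 1
        while active and total < cap:
            active = int(rng.poisson(sigma, size=active).sum())
            total += active
        sizes.append(total)
    return np.array(sizes)

sub = avalanche_sizes(0.5)    # subcritical: activity dies out quickly
crit = avalanche_sizes(1.0)   # critical: heavy-tailed avalanche sizes
print("mean avalanche sizes (sub, crit):", sub.mean(), crit.mean())
```

For a subcritical branching ratio σ the expected avalanche size is 1/(1 − σ) (here 2), while at σ = 1 the size distribution develops a power-law tail, which is the signature associated with the scaling laws discussed in the abstract.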

  9. Real-time determination of fringe pattern frequencies: An application to pressure measurement

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Piroozan, Parham

    2007-05-01

    Retrieving information in real time from fringe patterns is a topic of great interest in scientific and engineering applications of optical methods. This paper presents a method for fringe frequency determination based on the capability of neural networks to recognize signals that are similar but not identical to the signals used to train the network. Sampled patterns are generated by calibration and stored in memory. Incoming patterns are analyzed by a back-propagation neural network at the speed of the recording device, a CCD camera. This method of information retrieval is utilized to measure pressures in a boundary layer flow. The sensor combines optics and electronics to analyze dynamic pressure distributions and to feed information to a control system capable of preserving the stability of the flow.

  10. Recognition of abstract objects via neural oscillators: interaction among topological organization, associative memory and gamma band synchronization.

    PubMed

    Ursino, Mauro; Magosso, Elisa; Cuppini, Cristiano

    2009-02-01

    Synchronization of neural activity in the gamma band is assumed to play a significant role not only in perceptual processing, but also in higher cognitive functions. Here, we propose a neural network of Wilson-Cowan oscillators to simulate recognition of abstract objects, each represented as a collection of four features. Features are ordered in topological maps of oscillators connected via excitatory lateral synapses, to implement a similarity principle. Experience with previous objects is stored in long-range synapses connecting the different topological maps, trained via timing-dependent Hebbian learning (previous-knowledge principle). Finally, a downstream decision network detects the presence of a reliable object representation when all features are oscillating in synchrony. Simulations performed by presenting various numbers of objects to the network simultaneously (from one to four), some with missing and/or modified properties, suggest that the network can reconstruct objects and segment them from the other simultaneously present objects, even in the case of deteriorated information, noise, and moderate correlation among the inputs (one common feature). The balance between sensitivity and specificity depends on the strength of the Hebbian learning. Achieving a correct reconstruction in all cases, however, requires ad hoc selection of the oscillation frequency. The model represents an attempt to investigate the interactions among topological maps, autoassociative memory, and gamma-band synchronization for the recognition of abstract objects.
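A minimal sketch of two coupled Wilson-Cowan excitatory-inhibitory units follows, using the classic oscillatory parameter set from Wilson and Cowan (1972). The cross-coupling scheme and its strength are invented for illustration and are far simpler than the model's topological maps and Hebbian long-range synapses:

```python
import numpy as np

def sigmoid(x, a, theta):
    # logistic response, shifted so that S(0) = 0, as in Wilson & Cowan (1972)
    return 1 / (1 + np.exp(-a * (x - theta))) - 1 / (1 + np.exp(a * theta))

# classic oscillatory parameter set (Wilson & Cowan, 1972)
c1, c2, c3, c4 = 16.0, 12.0, 15.0, 3.0
aE, thE, aI, thI = 1.3, 4.0, 2.0, 3.7
P_drive, coupling = 1.25, 0.5      # coupling strength is a made-up value
dt, steps = 0.05, 4000

E = np.array([0.1, 0.4])           # excitatory activity of units 0 and 1
I = np.array([0.2, 0.1])           # inhibitory activity
trace = np.zeros((steps, 2))
for t in range(steps):
    cross = coupling * E[::-1]     # each unit also receives the other's E activity
    dE = -E + sigmoid(c1 * E - c2 * I + P_drive + cross, aE, thE)
    dI = -I + sigmoid(c3 * E - c4 * I, aI, thI)
    E, I = E + dt * dE, I + dt * dI    # forward Euler step
    trace[t] = E
print("E1 activity range after transient:",
      trace[2000:, 0].min(), trace[2000:, 0].max())
```

Because each Euler step is a convex combination of the current activity and a bounded sigmoid output, the activities remain in a bounded range by construction; detecting gamma-band synchrony between units would then amount to comparing the phases of the two E traces.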

  11. Self-consistent signal-to-noise analysis of the statistical behavior of analog neural networks and enhancement of the storage capacity

    NASA Astrophysics Data System (ADS)

    Shiino, Masatoshi; Fukai, Tomoki

    1993-08-01

    Based on the self-consistent signal-to-noise analysis (SCSNA), capable of dealing with analog neural networks with a wide class of transfer functions, enhancement of the storage capacity of associative memory and the related statistical properties of neural networks are studied for random memory patterns. Two types of transfer functions with threshold parameter θ are considered, both derived from the sigmoidal one to represent the output of three-state neurons. Neural networks having a monotonically increasing transfer function FM, with FM(u) = sgn(u) for |u| > θ and FM(u) = 0 for |u| ≤ θ, are shown to make it impossible for the spin-glass state to coexist with retrieval states in a certain parameter region of θ and α (loading rate of memory patterns), implying a reduction of the number of spurious states. The behavior of the storage capacity with changing θ is qualitatively the same as that of Ising-spin neural networks with varying temperature. On the other hand, the nonmonotonic transfer function FNM, with FNM(u) = sgn(u) for |u| < θ and FNM(u) = 0 for |u| ≥ θ, gives rise to remarkable features in several respects. First, it yields a large enhancement of the storage capacity compared with the Amit-Gutfreund-Sompolinsky (AGS) value: with decreasing θ from θ = ∞, the storage capacity αc of such a network increases from the AGS value (≈0.14) to attain its maximum value of ≈0.42 at θ ≈ 0.7, and afterwards decreases to vanish at θ = 0. Whereas for θ ≳ 1 the storage capacity αc coincides with the value α̃c determined by the SCSNA as the upper bound of α ensuring the existence of retrieval solutions, for θ ≲ 1 the αc is shown to differ from α̃c, with the result that the retrieval solutions claimed by the SCSNA are unstable for αc < α < α̃c.
Second, in the case of θ<1 the network can exhibit a new type of phase which appears as a result of a phase transition with respect to the non-Gaussian distribution of the local fields of neurons: the standard type of retrieval state with r≠0 (i.e., finite width of the local field distribution), which is implied by the order-parameter equations of the SCSNA, disappears at a certain critical loading rate α0, and for α<=α0 a qualitatively different type of retrieval state comes into existence in which the width of the local field distribution vanishes (i.e., r=0+). As a consequence, memory retrieval without errors becomes possible even in the saturation limit α≠0. Results of the computer simulations on the statistical properties of the novel phase with α<=α0 are shown to be in satisfactory agreement with the theoretical results. The effect of introducing self-couplings on the storage capacity is also analyzed for the two types of networks. It is conspicuous for the networks with FNM, where the self-couplings increase the stability of the retrieval solutions of the SCSNA with small values of θ, leading to a remarkable enhancement of the storage capacity.
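The two three-state transfer functions can be written directly from their definitions. The toy recall below uses the monotonic unit FM with a plain Hebbian matrix and arbitrary parameters (N, P, θ); it illustrates the dynamics only and does not reproduce the paper's capacity analysis:

```python
import numpy as np

def F_M(u, theta):
    # monotonic three-state unit: output sgn(u) when |u| > theta, else 0
    return np.sign(u) * (np.abs(u) > theta)

def F_NM(u, theta):
    # nonmonotonic unit: output sgn(u) when |u| < theta, else 0
    return np.sign(u) * (np.abs(u) < theta)

# toy Hebbian recall with the monotonic units; alpha = P/N = 0.05 is far
# below capacity, and theta = 0.2 silences only weak (ambiguous) fields
rng = np.random.default_rng(3)
N, P, theta = 200, 10, 0.2
xi = rng.choice([-1, 1], size=(P, N))
W = (xi.T @ xi) / N
np.fill_diagonal(W, 0.0)

x = xi[0] * np.where(rng.random(N) < 0.1, -1, 1)   # 10% of bits flipped
for _ in range(10):
    x = F_M(W @ x, theta)
print("overlap with stored pattern:", (x * xi[0]).mean())
```

With FNM the interesting regime discussed in the abstract (θ near 0.7, where the nonmonotonicity suppresses large fields and raises the capacity) involves qualitatively different retrieval states, which this sketch does not attempt to reproduce.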

  12. Streaming parallel GPU acceleration of large-scale filter-based spiking neural networks.

    PubMed

    Slażyński, Leszek; Bohte, Sander

    2012-01-01

    The arrival of graphics processing unit (GPU) cards suitable for massively parallel computing promises affordable large-scale neural network simulation previously only available at supercomputing facilities. While the raw numbers suggest that GPUs may outperform CPUs by at least an order of magnitude, the challenge is to develop fine-grained parallel algorithms that fully exploit the particulars of GPUs. Computation in a neural network is inherently parallel and thus a natural match for GPU architectures: given the inputs, the internal state of each neuron can be updated in parallel. We show that for filter-based spiking neurons, like the Spike Response Model, the additive nature of membrane potential dynamics enables additional update parallelism. This also reduces the accumulation of numerical errors when using single-precision computation, the native precision of GPUs. We further show that optimizing simulation algorithms and data structures for the GPU's architecture has a large pay-off: for example, matching iterative neural updating to the memory architecture of the GPU speeds up this simulation step by a factor of three to five. With such optimizations, we can simulate plausible spiking neural networks of up to 50,000 neurons in better than real time, processing over 35 million spiking events per second.

  13. Variable-Resistivity Material For Memory Circuits

    NASA Technical Reports Server (NTRS)

    Nagasubramanian, Ganesan; Distefano, Salvador; Moacanin, Jovan

    1989-01-01

    Nonvolatile memory elements packed densely. Electrically-erasable, programmable, read-only memory matrices made with newly-synthesized organic material of variable electrical resistivity. Material, polypyrrole doped with tetracyanoquinodimethane (TCNQ), changes reversibly between insulating or higher-resistivity state and conducting or lower-resistivity state. Thin film of conductive polymer separates layer of row conductors from layer of column conductors. Resistivity of film at each intersection, and therefore resistance of memory element defined by row and column, increased or decreased by application of suitable switching voltage. Matrix circuits made with this material useful for experiments in associative electronic memories based on models of neural networks.

  14. Short-term memory in networks of dissociated cortical neurons.

    PubMed

    Dranias, Mark R; Ju, Han; Rajaram, Ezhilarasan; VanDongen, Antonius M J

    2013-01-30

    Short-term memory refers to the ability to store small amounts of stimulus-specific information for a short period of time. It is supported by both fading and hidden memory processes. Fading memory relies on recurrent activity patterns in a neuronal network, whereas hidden memory is encoded using synaptic mechanisms, such as facilitation, which persist even when neurons fall silent. We have used a novel computational and optogenetic approach to investigate whether these same memory processes hypothesized to support pattern recognition and short-term memory in vivo, exist in vitro. Electrophysiological activity was recorded from primary cultures of dissociated rat cortical neurons plated on multielectrode arrays. Cultures were transfected with ChannelRhodopsin-2 and optically stimulated using random dot stimuli. The pattern of neuronal activity resulting from this stimulation was analyzed using classification algorithms that enabled the identification of stimulus-specific memories. Fading memories for different stimuli, encoded in ongoing neural activity, persisted and could be distinguished from each other for as long as 1 s after stimulation was terminated. Hidden memories were detected by altered responses of neurons to additional stimulation, and this effect persisted longer than 1 s. Interestingly, network bursts seem to eliminate hidden memories. These results are similar to those that have been reported from similar experiments in vivo and demonstrate that mechanisms of information processing and short-term memory can be studied using cultured neuronal networks, thereby setting the stage for therapeutic applications using this platform.

  15. Functional Brain Network Modularity Captures Inter- and Intra-Individual Variation in Working Memory Capacity

    PubMed Central

    Stevens, Alexander A.; Tappon, Sarah C.; Garg, Arun; Fair, Damien A.

    2012-01-01

    Background Cognitive abilities, such as working memory, differ among people; however, individuals also vary in their own day-to-day cognitive performance. One potential source of cognitive variability may be fluctuations in the functional organization of neural systems. The degree to which the organization of these functional networks is optimized may relate to the effective cognitive functioning of the individual. Here we specifically examine how changes in the organization of large-scale networks measured via resting state functional connectivity MRI and graph theory track changes in working memory capacity. Methodology/Principal Findings Twenty-two participants performed a test of working memory capacity and then underwent resting-state fMRI. Seventeen subjects repeated the protocol three weeks later. We applied graph theoretic techniques to measure network organization on 34 brain regions of interest (ROI). Network modularity, which measures the level of integration and segregation across sub-networks, and small-worldness, which measures global network connection efficiency, both predicted individual differences in memory capacity; however, only modularity predicted intra-individual variation across the two sessions. Partial correlations controlling for the component of working memory that was stable across sessions revealed that modularity was almost entirely associated with the variability of working memory at each session. Analyses of specific sub-networks and individual circuits were unable to consistently account for working memory capacity variability. Conclusions/Significance The results suggest that the intrinsic functional organization of an a priori defined cognitive control network measured at rest provides substantial information about actual cognitive performance. 
The association of network modularity with the variability in an individual's working memory capacity suggests that the organization of this network into high connectivity within modules and sparse connections between modules may reflect effective signaling across brain regions, perhaps through the modulation of signal or the suppression of the propagation of noise. PMID:22276205
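
    The modularity measure this record relies on is Newman's Q, computable directly from an adjacency matrix and a community assignment. A minimal sketch on a toy graph (the two-triangle graph and two-module partition are illustrative assumptions, not the study's 34-ROI data):

    ```python
    # Newman modularity Q for a toy graph and partition:
    # Q = (1/2m) * sum_ij [A_ij - k_i*k_j/(2m)] * delta(c_i, c_j)

    def modularity(adj, communities):
        n = len(adj)
        degree = [sum(row) for row in adj]
        two_m = sum(degree)          # 2m = total degree = twice the edge count
        q = 0.0
        for i in range(n):
            for j in range(n):
                if communities[i] == communities[j]:
                    q += adj[i][j] - degree[i] * degree[j] / two_m
        return q / two_m

    # Two triangles (nodes 0-2 and 3-5) joined by a single edge 2-3.
    adj = [[0, 1, 1, 0, 0, 0],
           [1, 0, 1, 0, 0, 0],
           [1, 1, 0, 1, 0, 0],
           [0, 0, 1, 0, 1, 1],
           [0, 0, 0, 1, 0, 1],
           [0, 0, 0, 1, 1, 0]]
    print(modularity(adj, [0, 0, 0, 1, 1, 1]))  # dense within, sparse between: Q = 5/14
    ```

    A partition that cuts across the triangles scores lower, which is the sense in which high Q captures "high connectivity within modules and sparse connections between modules".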

  16. Exploring Neural Network Models with Hierarchical Memories and Their Use in Modeling Biological Systems

    NASA Astrophysics Data System (ADS)

    Pusuluri, Sai Teja

Energy landscapes are often used as metaphors for phenomena in biology, the social sciences and finance. Different methods have been implemented in the past for the construction of energy landscapes. Neural network models based on spin glass physics provide an excellent mathematical framework for the construction of energy landscapes. This framework uses a minimal number of parameters and constructs the landscape using data from the actual phenomena. In the past, neural network models were used to mimic the storage and retrieval of memories (patterns) in the brain. With recent advances in the field, these models are now used in machine learning, deep learning and the modeling of complex phenomena. Most of the past literature focuses on increasing the storage capacity and stability of stored patterns in the network, but does not study these models from a modeling perspective or an energy landscape perspective. This dissertation focuses on neural network models from both perspectives. I first show how the cellular interconversion phenomenon can be modeled as a transition between attractor states on an epigenetic landscape constructed using neural network models. The model allows the identification of a reaction coordinate of cellular interconversion by analyzing experimental and simulation time course data. Monte Carlo simulations of the model show that the initial phase of cellular interconversion is a Poisson process and the later phase is a deterministic process. Second, I explore the static features of landscapes generated using neural network models, such as the sizes of basins of attraction and the densities of metastable states. The simulation results show that the static landscape features are strongly dependent on the correlation strength and correlation structure between patterns. Using different hierarchical structures for the correlation between patterns changes the landscape features, showing that the static landscape features can be controlled by adjusting the correlations between patterns. Finally, I explore the dynamical features of the landscapes, such as the stability of minima and the transition rates between minima. The results show that stability depends on the correlations between patterns, and that the transition rates between minima depend strongly on the type of bias applied and on the correlation between patterns. These results can be useful for engineering an energy landscape even without complete information about its minima.
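
    The storage-and-retrieval mechanism this dissertation builds on is the classical Hopfield attractor network: Hebbian weights carve basins of attraction around stored patterns. A minimal sketch (patterns and network size are illustrative choices, not the dissertation's models):

    ```python
    # Minimal Hopfield associative memory: Hebbian storage, then retrieval
    # of a stored pattern from a corrupted cue falling into its basin.

    def sign(x):
        return 1 if x >= 0 else -1

    def train(patterns):
        """Hebbian weights W_ij = (1/N) * sum_mu p_i * p_j, zero diagonal."""
        n = len(patterns[0])
        w = [[0.0] * n for _ in range(n)]
        for p in patterns:
            for i in range(n):
                for j in range(n):
                    if i != j:
                        w[i][j] += p[i] * p[j] / n
        return w

    def recall(w, state, steps=10):
        """Synchronous sign updates until a fixed point (or step limit)."""
        for _ in range(steps):
            new = [sign(sum(w[i][j] * state[j] for j in range(len(state))))
                   for i in range(len(state))]
            if new == state:
                break
            state = new
        return state

    p1 = [1, 1, 1, 1, -1, -1, -1, -1]
    p2 = [1, -1, 1, -1, 1, -1, 1, -1]   # orthogonal to p1
    w = train([p1, p2])
    cue = [-1] + p1[1:]                  # p1 with its first bit flipped
    print(recall(w, cue))                # the noisy cue relaxes back to p1
    ```

    Each stored pattern is a minimum of the network's energy function, which is how these models double as constructed energy landscapes.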

  17. Mittag-Leffler synchronization of fractional neural networks with time-varying delays and reaction-diffusion terms using impulsive and linear controllers.

    PubMed

    Stamova, Ivanka; Stamov, Gani

    2017-12-01

In this paper, we propose a fractional-order neural network system with time-varying delays and reaction-diffusion terms. We first develop a new Mittag-Leffler synchronization strategy for the controlled nodes via impulsive controllers. Using the fractional Lyapunov method, sufficient conditions are given. We also study the global Mittag-Leffler synchronization of two identical fractional impulsive reaction-diffusion neural networks using linear controllers, which was an open problem even for integer-order models. Since the Mittag-Leffler stability notion is a generalization of the exponential stability concept for fractional-order systems, our results extend and improve the exponential impulsive control theory of neural network systems with time-varying delays and reaction-diffusion terms to the fractional-order case. The fractional-order derivatives allow us to model the long-term memory in the neural networks, and thus the present research provides a conceptually straightforward mathematical representation of rather complex processes. Illustrative examples are presented to show the validity of the obtained results. We show that by means of appropriate impulsive controllers we can achieve the stability goal and control the qualitative behavior of the states. An image encryption scheme is extended using fractional derivatives. Copyright © 2017 Elsevier Ltd. All rights reserved.
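
    The Mittag-Leffler function behind "Mittag-Leffler stability" is the fractional-order generalization of the exponential, E_alpha(z) = sum_{k>=0} z^k / Gamma(alpha*k + 1). A truncated-series sketch (the truncation depth is an illustrative choice, adequate only for moderate |z|):

    ```python
    import math

    # Truncated series for the Mittag-Leffler function E_alpha(z).
    # alpha = 1 recovers exp(z); alpha < 1 gives the slower-than-exponential
    # relaxation profiles characteristic of fractional-order dynamics.

    def mittag_leffler(alpha, z, terms=60):
        return sum(z ** k / math.gamma(alpha * k + 1) for k in range(terms))

    print(mittag_leffler(1.0, 1.0))   # E_1(1) = e ≈ 2.71828
    print(mittag_leffler(0.5, -1.0))  # decays more slowly than exp(-1)
    ```

    Two sanity checks: E_1(z) = exp(z), and E_2(z) = cosh(sqrt(z)), both of which the series reproduces to machine precision at these arguments.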

  18. Neural correlates of confidence during item recognition and source memory retrieval: evidence for both dual-process and strength memory theories.

    PubMed

    Hayes, Scott M; Buchler, Norbou; Stokes, Jared; Kragel, James; Cabeza, Roberto

    2011-12-01

    Although the medial-temporal lobes (MTL), PFC, and parietal cortex are considered primary nodes in the episodic memory network, there is much debate regarding the contributions of MTL, PFC, and parietal subregions to recollection versus familiarity (dual-process theory) and the feasibility of accounts on the basis of a single memory strength process (strength theory). To investigate these issues, the current fMRI study measured activity during retrieval of memories that differed quantitatively in terms of strength (high vs. low-confidence trials) and qualitatively in terms of recollection versus familiarity (source vs. item memory tasks). Support for each theory varied depending on which node of the episodic memory network was considered. Results from MTL best fit a dual-process account, as a dissociation was found between a right hippocampal region showing high-confidence activity during the source memory task and bilateral rhinal regions showing high-confidence activity during the item memory task. Within PFC, several left-lateralized regions showed greater activity for source than item memory, consistent with recollective orienting, whereas a right-lateralized ventrolateral area showed low-confidence activity in both tasks, consistent with monitoring processes. Parietal findings were generally consistent with strength theory, with dorsal areas showing low-confidence activity and ventral areas showing high-confidence activity in both tasks. This dissociation fits with an attentional account of parietal functions during episodic retrieval. The results suggest that both dual-process and strength theories are partly correct, highlighting the need for an integrated model that links to more general cognitive theories to account for observed neural activity during episodic memory retrieval.

  19. Neural Correlates of Confidence during Item Recognition and Source Memory Retrieval: Evidence for Both Dual-process and Strength Memory Theories

    PubMed Central

    Hayes, Scott M.; Buchler, Norbou; Stokes, Jared; Kragel, James; Cabeza, Roberto

    2012-01-01

    Although the medial-temporal lobes (MTL), PFC, and parietal cortex are considered primary nodes in the episodic memory network, there is much debate regarding the contributions of MTL, PFC, and parietal subregions to recollection versus familiarity (dual-process theory) and the feasibility of accounts on the basis of a single memory strength process (strength theory). To investigate these issues, the current fMRI study measured activity during retrieval of memories that differed quantitatively in terms of strength (high vs. low-confidence trials) and qualitatively in terms of recollection versus familiarity (source vs. item memory tasks). Support for each theory varied depending on which node of the episodic memory network was considered. Results from MTL best fit a dual-process account, as a dissociation was found between a right hippocampal region showing high-confidence activity during the source memory task and bilateral rhinal regions showing high-confidence activity during the item memory task. Within PFC, several left-lateralized regions showed greater activity for source than item memory, consistent with recollective orienting, whereas a right-lateralized ventrolateral area showed low-confidence activity in both tasks, consistent with monitoring processes. Parietal findings were generally consistent with strength theory, with dorsal areas showing low-confidence activity and ventral areas showing high-confidence activity in both tasks. This dissociation fits with an attentional account of parietal functions during episodic retrieval. The results suggest that both dual-process and strength theories are partly correct, highlighting the need for an integrated model that links to more general cognitive theories to account for observed neural activity during episodic memory retrieval. PMID:21736454

  20. The influence of age and mild cognitive impairment on associative memory performance and underlying brain networks.

    PubMed

    Oedekoven, Christiane S H; Jansen, Andreas; Keidel, James L; Kircher, Tilo; Leube, Dirk

    2015-12-01

    Associative memory is essential to everyday activities, such as the binding of faces and corresponding names to form single bits of information. However, this ability often becomes impaired with increasing age. The most important neural substrate of associative memory is the hippocampus, a structure crucially implicated in the pathogenesis of Alzheimer's disease (AD). The main aim of this study was to compare neural correlates of associative memory in healthy aging and mild cognitive impairment (MCI), an at-risk state for AD. We used fMRI to investigate differences in brain activation and connectivity between young controls (n = 20), elderly controls (n = 32) and MCI patients (n = 21) during associative memory retrieval. We observed lower hippocampal activation in MCI patients than control groups during a face-name recognition task, and the magnitude of this decrement was correlated with lower associative memory performance. Further, increased activation in precentral regions in all older adults indicated a stronger involvement of the task positive network (TPN) with age. Finally, functional connectivity analysis revealed a stronger link of hippocampal and striatal components in older adults in comparison to young controls, regardless of memory impairment. In elderly controls, this went hand-in-hand with a stronger activation of striatal areas. Increased TPN activation may be linked to greater reliance on cognitive control in both older groups, while increased functional connectivity between the hippocampus and the striatum may suggest dedifferentiation, especially in elderly controls.

  1. Optimal Recall from Bounded Metaplastic Synapses: Predicting Functional Adaptations in Hippocampal Area CA3

    PubMed Central

    Savin, Cristina; Dayan, Peter; Lengyel, Máté

    2014-01-01

    A venerable history of classical work on autoassociative memory has significantly shaped our understanding of several features of the hippocampus, and most prominently of its CA3 area, in relation to memory storage and retrieval. However, existing theories of hippocampal memory processing ignore a key biological constraint affecting memory storage in neural circuits: the bounded dynamical range of synapses. Recent treatments based on the notion of metaplasticity provide a powerful model for individual bounded synapses; however, their implications for the ability of the hippocampus to retrieve memories well and the dynamics of neurons associated with that retrieval are both unknown. Here, we develop a theoretical framework for memory storage and recall with bounded synapses. We formulate the recall of a previously stored pattern from a noisy recall cue and limited-capacity (and therefore lossy) synapses as a probabilistic inference problem, and derive neural dynamics that implement approximate inference algorithms to solve this problem efficiently. In particular, for binary synapses with metaplastic states, we demonstrate for the first time that memories can be efficiently read out with biologically plausible network dynamics that are completely constrained by the synaptic plasticity rule, and the statistics of the stored patterns and of the recall cue. Our theory organises into a coherent framework a wide range of existing data about the regulation of excitability, feedback inhibition, and network oscillations in area CA3, and makes novel and directly testable predictions that can guide future experiments. PMID:24586137
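
    The "binary synapses with metaplastic states" the abstract refers to can be caricatured as an efficacy bit with hidden depth levels, so repeated potentiation entrenches the strong state while the dynamic range stays bounded. This is only a sketch in the spirit of such models; the number of levels and the update rule are illustrative assumptions:

    ```python
    import random

    # Toy bounded synapse with metaplastic states: efficacy is binary
    # (weak/strong), but each side has DEPTH hidden levels. Levels occupy
    # [-DEPTH, -1] (weak) and [1, DEPTH] (strong); zero is unused.

    DEPTH = 3

    def update(level, potentiate):
        """Potentiation pushes toward +DEPTH, depression toward -DEPTH;
        crossing zero flips efficacy to the shallowest opposite level."""
        if potentiate:
            return 1 if level == -1 else min(level + 1, DEPTH)
        return -1 if level == 1 else max(level - 1, -DEPTH)

    def efficacy(level):
        return 1 if level > 0 else 0

    random.seed(0)
    level = -1
    for _ in range(1000):                  # random plasticity events
        level = update(level, random.random() < 0.5)
        assert -DEPTH <= level <= DEPTH and level != 0
    print(level, efficacy(level))
    ```

    The bounded state space is exactly the "limited-capacity (and therefore lossy)" constraint that makes recall a probabilistic inference problem in the paper.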

  2. Feedforward-Feedback Hybrid Control for Magnetic Shape Memory Alloy Actuators Based on the Krasnosel'skii-Pokrovskii Model

    PubMed Central

    Zhou, Miaolei; Zhang, Qi; Wang, Jingyuan

    2014-01-01

As a new type of smart material, magnetic shape memory alloy has the advantages of a fast response frequency and outstanding strain capability in the field of microdrive and microposition actuators. The hysteresis nonlinearity in magnetic shape memory alloy actuators, however, limits system performance and further application. Here we propose a feedforward-feedback hybrid control method to improve control precision and mitigate the effects of the hysteresis nonlinearity of magnetic shape memory alloy actuators. First, hysteresis nonlinearity compensation for the magnetic shape memory alloy actuator is implemented by establishing a feedforward controller, an inverse hysteresis model based on the Krasnosel'skii-Pokrovskii operator. Second, classical Proportion Integration Differentiation feedback control is combined with the feedforward control to form the hybrid control system; to further enhance the adaptive performance of the system and improve the control accuracy, Radial Basis Function neural network self-tuning Proportion Integration Differentiation feedback control replaces the classical Proportion Integration Differentiation feedback control. The self-learning ability of the Radial Basis Function neural network is used to obtain Jacobian information of the magnetic shape memory alloy actuator for the on-line adjustment of the parameters in the Proportion Integration Differentiation controller. Finally, simulation results show that the hybrid control method proposed in this paper can greatly improve the control precision of the magnetic shape memory alloy actuator: the maximum tracking error is reduced from 1.1% in the open-loop system to 0.43% in the hybrid control system. PMID:24828010

  3. Feedforward-feedback hybrid control for magnetic shape memory alloy actuators based on the Krasnosel'skii-Pokrovskii model.

    PubMed

    Zhou, Miaolei; Zhang, Qi; Wang, Jingyuan

    2014-01-01

As a new type of smart material, magnetic shape memory alloy has the advantages of a fast response frequency and outstanding strain capability in the field of microdrive and microposition actuators. The hysteresis nonlinearity in magnetic shape memory alloy actuators, however, limits system performance and further application. Here we propose a feedforward-feedback hybrid control method to improve control precision and mitigate the effects of the hysteresis nonlinearity of magnetic shape memory alloy actuators. First, hysteresis nonlinearity compensation for the magnetic shape memory alloy actuator is implemented by establishing a feedforward controller, an inverse hysteresis model based on the Krasnosel'skii-Pokrovskii operator. Second, classical Proportion Integration Differentiation feedback control is combined with the feedforward control to form the hybrid control system; to further enhance the adaptive performance of the system and improve the control accuracy, Radial Basis Function neural network self-tuning Proportion Integration Differentiation feedback control replaces the classical Proportion Integration Differentiation feedback control. The self-learning ability of the Radial Basis Function neural network is used to obtain Jacobian information of the magnetic shape memory alloy actuator for the on-line adjustment of the parameters in the Proportion Integration Differentiation controller. Finally, simulation results show that the hybrid control method proposed in this paper can greatly improve the control precision of the magnetic shape memory alloy actuator: the maximum tracking error is reduced from 1.1% in the open-loop system to 0.43% in the hybrid control system.
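
    The feedforward-plus-feedback division of labor in these two records can be sketched on a toy first-order plant, y[k+1] = 0.9*y[k] + 0.1*u[k]: the feedforward term supplies the plant's steady-state inverse, and feedback (here PI, with gains chosen for critical damping) removes the residual error. The plant and gains are illustrative assumptions, not the paper's hysteretic actuator model:

    ```python
    # Feedforward + PI feedback on a toy linear plant. With Kp = 1.0 and
    # Ki = 0.1 the closed loop has a double pole at 0.9 (no oscillation).

    def simulate(steps=300, target=1.0, kp=1.0, ki=0.1, feedforward=True):
        y, integral = 0.0, 0.0
        for _ in range(steps):
            error = target - y
            integral += error
            u = kp * error + ki * integral
            if feedforward:
                u += target * (1 - 0.9) / 0.1   # inverse steady-state gain
            y = 0.9 * y + 0.1 * u
        return y

    print(simulate(feedforward=True))    # settles at the target
    print(simulate(feedforward=False))   # PI alone also settles, with a larger transient
    ```

    Both runs reach the setpoint (integral action removes steady-state error), but the feedforward term does the bulk of the work up front, which is why the hybrid scheme tracks better during transients.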

  4. Efficient reinforcement learning of a reservoir network model of parametric working memory achieved with a cluster population winner-take-all readout mechanism.

    PubMed

    Cheng, Zhenbo; Deng, Zhidong; Hu, Xiaolin; Zhang, Bo; Yang, Tianming

    2015-12-01

    The brain often has to make decisions based on information stored in working memory, but the neural circuitry underlying working memory is not fully understood. Many theoretical efforts have been focused on modeling the persistent delay period activity in the prefrontal areas that is believed to represent working memory. Recent experiments reveal that the delay period activity in the prefrontal cortex is neither static nor homogeneous as previously assumed. Models based on reservoir networks have been proposed to model such a dynamical activity pattern. The connections between neurons within a reservoir are random and do not require explicit tuning. Information storage does not depend on the stable states of the network. However, it is not clear how the encoded information can be retrieved for decision making with a biologically realistic algorithm. We therefore built a reservoir-based neural network to model the neuronal responses of the prefrontal cortex in a somatosensory delayed discrimination task. We first illustrate that the neurons in the reservoir exhibit a heterogeneous and dynamical delay period activity observed in previous experiments. Then we show that a cluster population circuit decodes the information from the reservoir with a winner-take-all mechanism and contributes to the decision making. Finally, we show that the model achieves a good performance rapidly by shaping only the readout with reinforcement learning. Our model reproduces important features of previous behavior and neurophysiology data. We illustrate for the first time how task-specific information stored in a reservoir network can be retrieved with a biologically plausible reinforcement learning training scheme. Copyright © 2015 the American Physiological Society.
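
    The winner-take-all mechanism at the heart of the readout can be illustrated with two mutually inhibiting leaky populations: the one receiving more evidence suppresses its rival. This is a sketch of the competitive-readout idea only; the paper's reservoir and cluster populations are not modeled, and the constants are illustrative:

    ```python
    # Toy winner-take-all: two leaky rate units with mutual inhibition,
    # integrated by forward Euler. The better-supported unit wins.

    def winner_take_all(evidence_a, evidence_b, steps=200, dt=0.1, w_inh=2.0):
        a = b = 0.0
        for _ in range(steps):
            a += dt * (-a + max(0.0, evidence_a - w_inh * b))
            b += dt * (-b + max(0.0, evidence_b - w_inh * a))
        return a, b

    a, b = winner_take_all(1.0, 0.6)
    print(a, b)   # the population with stronger evidence wins
    ```

    Swapping the evidence swaps the winner, which is all a downstream decision stage needs: the readout reports a discrete choice from graded, heterogeneous inputs.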

  5. Neural correlates of verbal associative memory and mnemonic strategy use following childhood traumatic brain injury

    PubMed Central

    Kramer, Megan E.; Chiu, C.-Y. Peter; Shear, Paula K.; Wade, Shari L.

    2010-01-01

    Children with traumatic brain injury (TBI) often experience memory deficits, although the nature, functional implication, and recovery trajectory of such difficulties are poorly understood. The present fMRI study examined the neural activation patterns in a group of young children who sustained moderate TBI in early childhood (n = 7), and a group of healthy control children (n = 13) during a verbal paired associate learning (PAL) task that promoted the use of two mnemonic strategies differing in efficacy. The children with TBI demonstrated intact memory performance and were able to successfully utilize the mnemonic strategies. However, the TBI group also demonstrated altered brain activation patterns during the task compared to the control children. These findings suggest early childhood TBI may alter activation within the network of brain regions supporting associative memory even in children who show good behavioral performance. PMID:21188286

  6. Architecture of fluid intelligence and working memory revealed by lesion mapping.

    PubMed

    Barbey, Aron K; Colom, Roberto; Paul, Erick J; Grafman, Jordan

    2014-03-01

    Although cognitive neuroscience has made valuable progress in understanding the role of the prefrontal cortex in human intelligence, the functional networks that support adaptive behavior and novel problem solving remain to be well characterized. Here, we studied 158 human brain lesion patients to investigate the cognitive and neural foundations of key competencies for fluid intelligence and working memory. We administered a battery of neuropsychological tests, including the Wechsler Adult Intelligence Scale (WAIS) and the N-Back task. Latent variable modeling was applied to obtain error-free scores of fluid intelligence and working memory, followed by voxel-based lesion-symptom mapping to elucidate their neural substrates. The observed latent variable modeling and lesion results support an integrative framework for understanding the architecture of fluid intelligence and working memory and make specific recommendations for the interpretation and application of the WAIS and N-Back task to the study of fluid intelligence in health and disease.
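
    The N-Back task administered in this battery has a simple scoring rule: a trial is a target when the current stimulus matches the one presented n trials earlier. A sketch (the letter stimuli are illustrative):

    ```python
    # N-Back target detection: trial i is a target iff stimuli[i] equals
    # stimuli[i - n]. The first n trials can never be targets.

    def nback_targets(stimuli, n):
        return [i >= n and stimuli[i] == stimuli[i - n]
                for i in range(len(stimuli))]

    print(nback_targets("ABABC", 2))  # [False, False, True, True, False]
    ```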

  7. Data systems and computer science programs: Overview

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.; Hunter, Paul

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. The topics are presented in viewgraph form and include the following: onboard memory and storage technology; advanced flight computers; special purpose flight processors; onboard networking and testbeds; information archive, access, and retrieval; visualization; neural networks; software engineering; and flight control and operations.

  8. Oscillations in Spurious States of the Associative Memory Model with Synaptic Depression

    NASA Astrophysics Data System (ADS)

    Murata, Shin; Otsubo, Yosuke; Nagata, Kenji; Okada, Masato

    2014-12-01

    The associative memory model is a typical neural network model that can store discretely distributed fixed-point attractors as memory patterns. When the network stores the memory patterns extensively, however, the model has other attractors besides the memory patterns. These attractors are called spurious memories. Both spurious states and memory states are in equilibrium, so there is little difference between their dynamics. Recent physiological experiments have shown that the short-term dynamic synapse called synaptic depression decreases its efficacy of transmission to postsynaptic neurons according to the activities of presynaptic neurons. Previous studies revealed that synaptic depression destabilizes the memory states when the number of memory patterns is finite. However, it is very difficult to study the dynamical properties of the spurious states if the number of memory patterns is proportional to the number of neurons. We investigate the effect of synaptic depression on spurious states by Monte Carlo simulation. The results demonstrate that synaptic depression does not affect the memory states but mainly destabilizes the spurious states and induces periodic oscillations.
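
    The spurious memories described above can be exhibited in a miniature Hopfield network: with three orthogonal stored patterns, the symmetric mixture sign(p1 + p2 + p3) is itself a stable state even though it was never stored. Patterns and network size are illustrative:

    ```python
    # Hebbian network storing three orthogonal patterns; the symmetric
    # three-pattern mixture is a fixed point of the sign update -- a
    # spurious attractor alongside the genuine memory states.

    p1 = [1, 1, 1, 1, -1, -1, -1, -1]
    p2 = [1, 1, -1, -1, 1, 1, -1, -1]
    p3 = [1, -1, 1, -1, 1, -1, 1, -1]
    n = len(p1)

    # Hebbian weights with zero diagonal
    w = [[0.0 if i == j else sum(p[i] * p[j] for p in (p1, p2, p3)) / n
          for j in range(n)] for i in range(n)]

    def step(state):
        return [1 if sum(w[i][j] * state[j] for j in range(n)) >= 0 else -1
                for i in range(n)]

    mixture = [1 if p1[i] + p2[i] + p3[i] > 0 else -1 for i in range(n)]
    print(step(mixture) == mixture)   # True: a fixed point...
    print(mixture in (p1, p2, p3))    # False: ...that was never stored
    ```

    In the paper's setting, synaptic depression selectively destabilizes exactly such mixture states while leaving the stored patterns intact.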

  9. An analysis of image storage systems for scalable training of deep neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Young, Steven R; Patton, Robert M

This study presents a principled empirical evaluation of image storage systems for training deep neural networks. We employ the Caffe deep learning framework to train neural network models for three different data sets, MNIST, CIFAR-10, and ImageNet. While training the models, we evaluate five different options to retrieve training image data: (1) PNG-formatted image files on local file system; (2) pushing pixel arrays from image files into a single HDF5 file on local file system; (3) in-memory arrays to hold the pixel arrays in Python and C++; (4) loading the training data into LevelDB, a log-structured merge tree based key-value storage; and (5) loading the training data into LMDB, a B+tree based key-value storage. The experimental results quantitatively highlight the disadvantage of using normal image files on local file systems to train deep neural networks and demonstrate reliable performance with key-value storage based storage systems. When training a model on the ImageNet dataset, the image file option was more than 17 times slower than the key-value storage option. Along with measurements on training time, this study provides in-depth analysis on the cause of performance advantages/disadvantages of each back-end to train deep neural networks. We envision the provided measurements and analysis will shed light on the optimal way to architect systems for training neural networks in a scalable manner.
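
    The two back-end styles compared above (many small image files versus one key-value store) can be sketched with the standard library, using sqlite3 as a stand-in for LMDB/LevelDB. This substitution and the blob sizes are illustrative assumptions, not the study's setup:

    ```python
    import os
    import sqlite3
    import tempfile

    # Store the same training blobs two ways and read them back:
    # (a) one file per image on the local file system,
    # (b) a single key-value database (sqlite3 standing in for LMDB).

    records = {f"img_{i:05d}": bytes([i % 256]) * 1024 for i in range(100)}

    with tempfile.TemporaryDirectory() as root:
        # option (a): one file per training image
        for key, blob in records.items():
            with open(os.path.join(root, key + ".bin"), "wb") as f:
                f.write(blob)

        # option (b): a single key-value database
        db = sqlite3.connect(os.path.join(root, "train.db"))
        db.execute("CREATE TABLE kv (k TEXT PRIMARY KEY, v BLOB)")
        db.executemany("INSERT INTO kv VALUES (?, ?)", list(records.items()))
        db.commit()

        # both back ends return identical training data
        from_files = {k: open(os.path.join(root, k + ".bin"), "rb").read()
                      for k in records}
        from_db = {k: bytes(v) for k, v in db.execute("SELECT k, v FROM kv")}
        db.close()

    print(from_files == records, from_db == records)
    ```

    The data are identical either way; the study's point is that the file-per-image layout pays per-file metadata and open/close costs that the consolidated key-value layout avoids.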

  10. Holography and optical information processing; Proceedings of the Soviet-Chinese Joint Seminar, Bishkek, Kyrgyzstan, Sept. 21-26, 1991

    NASA Astrophysics Data System (ADS)

    Mikaelian, Andrei L.

    Attention is given to data storage, devices, architectures, and implementations of optical memory and neural networks; holographic optical elements and computer-generated holograms; holographic display and materials; systems, pattern recognition, interferometry, and applications in optical information processing; and special measurements and devices. Topics discussed include optical immersion as a new way to increase information recording density, systems for data reading from optical disks on the basis of diffractive lenses, a new real-time optical associative memory system, an optical pattern recognition system based on a WTA model of neural networks, phase diffraction grating for the integral transforms of coherent light fields, holographic recording with operated sensitivity and stability in chalcogenide glass layers, a compact optical logic processor, a hybrid optical system for computing invariant moments of images, optical fiber holographic inteferometry, and image transmission through random media in single pass via optical phase conjugation.

  11. Urdu Nasta'liq text recognition using implicit segmentation based on multi-dimensional long short term memory neural networks.

    PubMed

    Naz, Saeeda; Umar, Arif Iqbal; Ahmed, Riaz; Razzak, Muhammad Imran; Rashid, Sheikh Faisal; Shafait, Faisal

    2016-01-01

The recognition of Arabic script and its derivatives such as Urdu, Persian, Pashto etc. is a difficult task due to the complexity of this script. In particular, Urdu text recognition is more difficult due to its Nasta'liq writing style. The Nasta'liq writing style inherits a complex calligraphic nature, which presents major issues for the recognition of Urdu text owing to diagonality in writing, high cursiveness, context sensitivity and overlapping of characters. Therefore, the work done for recognition of Arabic script cannot be directly applied to Urdu recognition. We present Multi-dimensional Long Short Term Memory (MDLSTM) Recurrent Neural Networks with an output layer designed for sequence labeling for recognition of printed Urdu text-lines written in the Nasta'liq writing style. Experiments show that MDLSTM attained a recognition accuracy of 98% for the unconstrained Urdu Nasta'liq printed text, which significantly outperforms the state-of-the-art techniques.
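
    Sequence-labeling output layers of this kind are typically decoded CTC-style: take the best label per frame, then collapse repeats and drop the blank symbol. A sketch of that decoding step (the label alphabet here is an illustrative assumption):

    ```python
    # CTC-style greedy decoding: collapse consecutive repeats, drop blanks.
    # e.g. frame labels [0,1,1,0,1,2,2,0] with blank=0 decode to [1, 1, 2]
    # (the blank between the two 1s keeps them as separate characters).

    BLANK = 0

    def ctc_decode(frame_labels):
        out, prev = [], None
        for label in frame_labels:
            if label != BLANK and label != prev:
                out.append(label)
            prev = label
        return out

    print(ctc_decode([0, 1, 1, 0, 1, 2, 2, 0]))  # [1, 1, 2]
    ```

    This is what lets the network label an unsegmented text-line: no explicit character segmentation is needed, only per-frame label scores.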

  12. Discrete-state phasor neural networks

    NASA Astrophysics Data System (ADS)

    Noest, André J.

    1988-08-01

An associative memory network with local variables assuming one of q equidistant positions on the unit circle (q-state phasors) is introduced, and its recall behavior is solved exactly for any q when the interactions are sparse and asymmetric. Such models can describe natural or artificial networks of (neuro-)biological, chemical, or electronic limit-cycle oscillators with q-fold instead of circular symmetry, or similar optical computing devices using a phase-encoded data representation.
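
    A toy version of such a q-state phasor memory: each unit holds a phase restricted to q equidistant points on the unit circle, complex Hebbian-like weights store a pattern, and a quantized update snaps the local field back onto the allowed states. The choice of q, network size, and pattern are illustrative:

    ```python
    import cmath

    # q-state phasor associative memory: a single stored phase pattern is
    # recovered from a cue with one unit knocked to the wrong phase.

    Q, N = 4, 6

    def phasor(k):
        return cmath.exp(2j * cmath.pi * k / Q)

    def quantize(z):
        """Snap a complex local field to the nearest of the q phasor states."""
        return max(range(Q), key=lambda k: (z * phasor(k).conjugate()).real)

    pattern = [0, 1, 3, 2, 1, 0]                 # stored phase indices
    xi = [phasor(k) for k in pattern]
    w = [[0 if i == j else xi[i] * xi[j].conjugate() / N
          for j in range(N)] for i in range(N)]  # Hebbian-like complex weights

    cue = [2] + pattern[1:]                      # one unit set to the wrong phase
    recalled = [quantize(sum(w[i][j] * phasor(cue[j]) for j in range(N)))
                for i in range(N)]
    print(recalled == pattern)                   # one sweep restores the pattern
    ```

    For q = 2 this reduces to the ordinary binary Hopfield model; larger q encodes the q-fold symmetric oscillator states the abstract describes.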

  13. Short-term memory of TiO2-based electrochemical capacitors: empirical analysis with adoption of a sliding threshold

    NASA Astrophysics Data System (ADS)

    Lim, Hyungkwang; Kim, Inho; Kim, Jin-Sang; Hwang, Cheol Seong; Jeong, Doo Seok

    2013-09-01

Chemical synapses are important components of the large-scale neural network in the hippocampus of the mammalian brain, and changes in their weight are thought to underlie learning and memory. Thus, the realization of artificial chemical synapses is of crucial importance in achieving artificial neural networks that emulate the brain’s functionalities to some extent. This kind of research is often referred to as neuromorphic engineering. In this study, we report short-term memory behaviours of electrochemical capacitors (ECs) utilizing a TiO2 mixed ionic-electronic conductor and various reactive electrode materials, e.g. Ti, Ni, and Cr. The experiments showed that the potentiation behaviours did not represent unlimited growth of synaptic weight. Instead, they exhibited limited synaptic weight growth that can be understood by means of an empirical equation similar to the Bienenstock-Cooper-Munro rule, employing a sliding threshold. The observed potentiation behaviours were analysed using the empirical equation, and the differences between the different ECs were parameterized.
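
    The sliding-threshold mechanism invoked above can be sketched with a minimal BCM-style rule: the weight change is eta * y * (y - theta) * x, while theta tracks a running average of y squared, so growth is self-limiting rather than runaway. All constants are illustrative assumptions, not fits to the ECs:

    ```python
    # Minimal BCM rule with a sliding threshold on a single input/weight.
    # While y < theta the drive weakens; once theta catches up with y**2,
    # the weight settles instead of growing without bound.

    x, w, theta = 0.5, 0.5, 0.0
    eta, theta_rate = 0.1, 0.1

    for _ in range(5000):
        y = w * x
        w += eta * y * (y - theta) * x          # plasticity step
        theta += theta_rate * (y * y - theta)   # threshold slides with activity

    print(w * x)   # the response settles near 1: growth is limited, not runaway
    ```

    The fixed point satisfies y = theta = y**2, i.e. y = 1, which is the qualitative "limited synaptic weight growth" the empirical equation captures.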

  14. Scaling laws describe memories of host-pathogen riposte in the HIV population.

    PubMed

    Barton, John P; Kardar, Mehran; Chakraborty, Arup K

    2015-02-17

    The enormous genetic diversity and mutability of HIV has prevented effective control of this virus by natural immune responses or vaccination. Evolution of the circulating HIV population has thus occurred in response to diverse, ultimately ineffective, immune selection pressures that randomly change from host to host. We show that the interplay between the diversity of human immune responses and the ways that HIV mutates to evade them results in distinct sets of sequences defined by similar collectively coupled mutations. Scaling laws that relate these sets of sequences resemble those observed in linguistics and other branches of inquiry, and dynamics reminiscent of neural networks are observed. Like neural networks that store memories of past stimulation, the circulating HIV population stores memories of host-pathogen combat won by the virus. We describe an exactly solvable model that captures the main qualitative features of the sets of sequences and a simple mechanistic model for the origin of the observed scaling laws. Our results define collective mutational pathways used by HIV to evade human immune responses, which could guide vaccine design.

  15. Nonvolatile Memory Materials for Neuromorphic Intelligent Machines.

    PubMed

    Jeong, Doo Seok; Hwang, Cheol Seong

    2018-04-18

Recent progress in deep learning extends the capability of artificial intelligence to various practical tasks, making the deep neural network (DNN) an extremely versatile hypothesis. While such DNN is virtually built on contemporary data centers of the von Neumann architecture, physical (in part) DNN of non-von Neumann architecture, also known as neuromorphic computing, can remarkably improve learning and inference efficiency. Particularly, resistance-based nonvolatile random access memory (NVRAM) highlights its handy and efficient application to the multiply-accumulate (MAC) operation in an analog manner. Here, an overview is given of the available types of resistance-based NVRAMs and their technological maturity from the material and device points of view. Examples within the strategy are subsequently addressed in comparison with their benchmarks (virtual DNN in deep learning). A spiking neural network (SNN) is another type of neural network that is more biologically plausible than the DNN. The successful incorporation of resistance-based NVRAM in SNN-based neuromorphic computing offers an efficient solution to the MAC operation and spike timing-based learning in nature. This strategy is exemplified from a material perspective. Intelligent machines are categorized according to their architecture and learning type. Also, the functionality and usefulness of NVRAM-based neuromorphic computing are addressed. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
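
    The analog MAC operation described above is just a matrix-vector product realized physically: input voltages drive rows of a crossbar whose NVRAM conductances encode the weights, and Ohm's and Kirchhoff's laws sum the products as column currents, I_j = sum_i V_i * G_ij. A digital miniature (the conductance values are illustrative):

    ```python
    # Crossbar multiply-accumulate: each output column current is one MAC
    # over the input voltages, computed "in one step" by the array physics.

    def crossbar_mac(voltages, conductances):
        cols = len(conductances[0])
        return [sum(v * row[j] for v, row in zip(voltages, conductances))
                for j in range(cols)]

    V = [0.2, 0.5, 0.1]            # input voltages (activations)
    G = [[1.0, 0.5],               # conductances (weights), in siemens
         [0.2, 0.8],
         [0.6, 0.4]]
    print(crossbar_mac(V, G))      # column currents: [0.36, 0.54]
    ```

    This is why a crossbar performs an entire layer's weighted sums in constant time, which is the efficiency argument for NVRAM-based neuromorphic hardware.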

  16. Different neural activities support auditory working memory in musicians and bilinguals.

    PubMed

    Alain, Claude; Khatamian, Yasha; He, Yu; Lee, Yunjo; Moreno, Sylvain; Leung, Ada W S; Bialystok, Ellen

    2018-05-17

    Musical training and bilingualism benefit executive functioning and working memory (WM); however, the brain networks supporting this advantage are not well specified. Here, we used functional magnetic resonance imaging and the n-back task to assess WM for spatial (sound location) and nonspatial (sound category) auditory information in monolingual musicians (musicians), nonmusician bilinguals (bilinguals), and nonmusician monolinguals (controls). Musicians outperformed bilinguals and controls on the nonspatial WM task. Overall, spatial and nonspatial WM were associated with greater activity in dorsal and ventral brain regions, respectively. Increasing WM load yielded similar recruitment of the anterior-posterior attention network in all three groups. In both tasks and at both levels of difficulty, musicians showed lower brain activity than controls in the superior prefrontal gyrus and dorsolateral prefrontal cortex (DLPFC) bilaterally, a finding that may reflect improved and more efficient use of neural resources. Bilinguals showed enhanced activity in language-related areas (i.e., left DLPFC and left supramarginal gyrus) relative to musicians and controls, which could be associated with the need to suppress interference from competing semantic activations across multiple languages. These findings indicate that the auditory WM advantage in musicians and bilinguals is mediated by different neural networks specific to each life experience. © 2018 New York Academy of Sciences.
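
    The n-back task mentioned above has a simple formal structure, sketched here with hypothetical stimuli: a trial is a "target" when the current stimulus matches the one presented n trials earlier, which is why memory load scales with n.

```python
# n-back target scoring: trial i is a target when the stimulus at i
# matches the stimulus presented n trials earlier.
def nback_targets(stimuli, n):
    return [i for i in range(n, len(stimuli)) if stimuli[i] == stimuli[i - n]]

seq = ["A", "B", "A", "C", "A", "A"]
print(nback_targets(seq, 2))  # → [2, 4]
```

    Comparing a participant's responses against this target list yields the hit and false-alarm rates typically analyzed alongside the imaging data.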

  17. From biological neural networks to thinking machines: Transitioning biological organizational principles to computer technology

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.

    1991-01-01

    The three-dimensional organization of the vestibular macula is under study by computer-assisted reconstruction and simulation methods as a model for more complex neural systems. One goal of this research is to transition knowledge of biological neural network architecture and functioning to computer technology, to contribute to the development of thinking computers. Maculas are organized as weighted neural networks for parallel distributed processing of information. The network is characterized by nonlinearity of its terminal/receptive fields. Wiring appears to develop through constrained randomness. A further property is the presence of two main circuits, highly channeled and distributed modifying, that are connected through feedforward-feedback collaterals and a biasing subcircuit. Computer simulations demonstrate that differences in the geometry of the feedback (afferent) collaterals affect the timing and the magnitude of voltage changes delivered to the spike initiation zone. Feedforward (efferent) collaterals act as voltage followers and likely inhibit neurons of the distributed modifying circuit. These results illustrate the importance of feedforward-feedback loops, of timing, and of inhibition in refining neural network output. They also suggest that it is the distributed modifying network that is most involved in adaptation, memory, and learning. Tests of macular adaptation, through hyper- and microgravitational studies, support this hypothesis, since synapses in the distributed modifying circuit, but not the channeled circuit, are altered. Transitioning knowledge of biological systems to computer technology, however, remains problematical.

  18. Microlaser-based compact optical neuro-processors (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Paek, Eung Gi; Chan, Winston K.; Zah, Chung-En; Cheung, Kwok-wai; Curtis, L.; Chang-Hasnain, Constance J.

    1992-10-01

    This paper reviews the recent progress in the development of holographic neural networks using surface-emitting laser diode arrays (SELDAs). Since the previous work on an ultrafast holographic memory readout system and a robust incoherent correlator, progress has been made in several areas: the use of an array of monolithic `neurons' to reconstruct holographic memories; two-dimensional (2-D) wavelength-division multiplexing (WDM) for image transmission through a single-mode fiber; and finally, an associative memory using time-division multiplexing (TDM). Experimental demonstrations of these are presented.

  19. Categorization and decision-making in a neurobiologically plausible spiking network using a STDP-like learning rule.

    PubMed

    Beyeler, Michael; Dutt, Nikil D; Krichmar, Jeffrey L

    2013-12-01

    Understanding how the human brain is able to efficiently perceive and understand a visual scene is still a field of ongoing research. Although many studies have focused on the design and optimization of neural networks to solve visual recognition tasks, most of them lack either neurobiologically plausible learning rules or decision-making processes. Here we present a large-scale model of a hierarchical spiking neural network (SNN) that integrates a low-level memory encoding mechanism with a higher-level decision process to perform a visual classification task in real-time. The model consists of Izhikevich neurons and conductance-based synapses for realistic approximation of neuronal dynamics, a spike-timing-dependent plasticity (STDP) synaptic learning rule with additional synaptic dynamics for memory encoding, and an accumulator model for memory retrieval and categorization. The full network, which comprised 71,026 neurons and approximately 133 million synapses, ran in real-time on a single off-the-shelf graphics processing unit (GPU). The network was constructed on a publicly available SNN simulator that supports general-purpose neuromorphic computer chips. The network achieved 92% correct classifications on MNIST in 100 rounds of random sub-sampling, which is comparable to other SNN approaches and provides a conservative and reliable performance metric. Additionally, the model correctly predicted reaction times from psychophysical experiments. Because of the scalability of the approach and its neurobiological fidelity, the current model can be extended to an efficient neuromorphic implementation that supports more generalized object recognition and decision-making architectures found in the brain. Copyright © 2013 Elsevier Ltd. All rights reserved.
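
    The Izhikevich neuron at the core of such SNNs reduces to two coupled update equations; here is a minimal single-neuron sketch using the standard regular-spiking parameter set (a=0.02, b=0.2, c=-65, d=8) and two 0.5 ms Euler half-steps per 1 ms tick, as in Izhikevich's original formulation. Parameter values and the constant input current are generic illustrations, not necessarily those used in the paper.

```python
# Single Izhikevich neuron: v is the membrane potential (mV), u a
# recovery variable; when v crosses 30 mV a spike is recorded and
# (v, u) are reset to (c, u + d).
def izhikevich(I, steps=1000, a=0.02, b=0.2, c=-65.0, d=8.0):
    v, u = c, b * c
    spikes = []
    for t in range(steps):          # one iteration = 1 ms of simulated time
        for _ in range(2):          # two half-steps for numerical stability
            v += 0.5 * (0.04 * v * v + 5 * v + 140 - u + I)
        u += a * (b * v - u)
        if v >= 30.0:               # spike: record time and reset
            spikes.append(t)
            v, u = c, u + d
    return spikes

spikes = izhikevich(I=10.0)
print(len(spikes) > 0)  # constant suprathreshold input drives tonic firing
```

    The full model layers conductance-based synapses and STDP weight updates on top of this neuron dynamic; those are omitted here for brevity.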

  20. Distinct Neural Circuits Support Transient and Sustained Processes in Prospective Memory and Working Memory

    PubMed Central

    West, Robert; Braver, Todd

    2009-01-01

    Current theories are divided as to whether prospective memory (PM) involves primarily sustained processes such as strategic monitoring, or transient processes such as the retrieval of intentions from memory when a relevant cue is encountered. The current study examined the neural correlates of PM using a functional magnetic resonance imaging design that allows for the decomposition of brain activity into sustained and transient components. Performance of the PM task was primarily associated with sustained responses in a network including anterior prefrontal cortex (lateral Brodmann area 10), and these responses were dissociable from sustained responses associated with active maintenance in working memory. Additionally, the sustained responses in anterior prefrontal cortex correlated with faster response times for prospective responses. Prospective cues also elicited selective transient activity in a region of interest along the right middle temporal gyrus. The results support the conclusion that both sustained and transient processes contribute to efficient PM and provide novel constraints on the functional role of anterior PFC in higher-order cognition. PMID:18854581
