Science.gov

Sample records for neuron networks method

  1. Epileptic Neuronal Networks: Methods of Identification and Clinical Relevance

    PubMed Central

    Stefan, Hermann; Lopes da Silva, Fernando H.

    2012-01-01

    The main objective of this paper is to examine evidence for the concept that epileptic activity should be envisaged in terms of functional connectivity and dynamics of neuronal networks. Basic concepts regarding structure and dynamics of neuronal networks are briefly described. Particular attention is given to approaches that are derived, or related, to the concept of causality, as formulated by Granger. Linear and non-linear methodologies aiming at characterizing the dynamics of neuronal networks applied to EEG/MEG and combined EEG/fMRI signals in epilepsy are critically reviewed. The relevance of functional dynamical analysis of neuronal networks with respect to clinical queries in focal cortical dysplasias, temporal lobe epilepsies, and “generalized” epilepsies is emphasized. In the light of the concepts of epileptic neuronal networks, and recent experimental findings, the dichotomic classification in focal and generalized epilepsy is re-evaluated. It is proposed that so-called “generalized epilepsies,” such as absence seizures, are actually fast spreading epilepsies, the onset of which can be tracked down to particular neuronal networks using appropriate network analysis. Finally new approaches to delineate epileptogenic networks are discussed. PMID:23532203

  2. Response functions for electrically coupled neuronal network: a method of local point matching and its applications.

    PubMed

    Yihe, Lu; Timofeeva, Yulia

    2016-06-01

    Neuronal networks connected by electrical synapses, also referred to as gap junctions, are present throughout the entire central nervous system. Many instances of gap-junctional coupling are formed between dendritic arbours of individual cells, and these dendro-dendritic gap junctions are known to play an important role in mediating various brain rhythms in both normal and pathological states. The dynamics of such neuronal networks modelled by passive or quasi-active (resonant) membranes can be described by the Green's function which provides the fundamental input-output relationships of the entire network. One of the methods for calculating this response function is the so-called 'sum-over-trips' framework which enables the construction of the Green's function for an arbitrary network as a convergent infinite series solution. Here we propose an alternative and computationally efficient approach for constructing the Green's functions on dendro-dendritic gap junction-coupled neuronal networks which avoids any infinite terms in the solutions. Instead, the Green's function is constructed from the solution of a system of linear algebraic equations. We apply this new method to a number of systems including a simple single cell model and two-cell neuronal networks. We also demonstrate that the application of this novel approach allows one to reduce a model with complex dendritic formations to an equivalent model with a much simpler morphological structure. PMID:26994016
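
    As a toy illustration of the linear-algebraic idea (not the authors' construction, which handles full dendritic trees and quasi-active membranes), the steady-state response of a few passive compartments coupled by a gap junction can be obtained from a single linear solve; all conductance values below are arbitrary assumptions.

    ```python
    import numpy as np

    # Two 3-compartment passive cells; the last compartment of cell A is coupled to the
    # first compartment of cell B by a dendro-dendritic gap junction. The steady-state
    # "Green's function" is then the inverse of one conductance matrix.
    g_leak, g_axial, g_gap = 1.0, 2.0, 0.5          # assumed conductances (arbitrary units)

    n = 6
    G = np.zeros((n, n))
    couplings = [(0, 1, g_axial), (1, 2, g_axial),  # cell A internal coupling
                 (3, 4, g_axial), (4, 5, g_axial),  # cell B internal coupling
                 (2, 3, g_gap)]                     # gap junction between the two cells
    for i, j, g in couplings:
        G[i, i] += g; G[j, j] += g
        G[i, j] -= g; G[j, i] -= g
    G += g_leak * np.eye(n)                         # leak conductance keeps G invertible

    green = np.linalg.inv(G)                        # columns: response to unit current per compartment
    i_inj = np.zeros(n); i_inj[0] = 1.0             # unit current into compartment 0 of cell A
    print(np.round(green @ i_inj, 3))               # voltages spread into the coupled cell
    ```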

  3. Micropatterning neuronal networks.

    PubMed

    Hardelauf, Heike; Waide, Sarah; Sisnaiske, Julia; Jacob, Peter; Hausherr, Vanessa; Schöbel, Nicole; Janasek, Dirk; van Thriel, Christoph; West, Jonathan

    2014-07-01

    Spatially organised neuronal networks have wide reaching applications, including fundamental research, toxicology testing, pharmaceutical screening and the realisation of neuronal implant interfaces. Despite the large number of methods catalogued in the literature there remains the need to identify a method that delivers high pattern compliance, long-term stability and is widely accessible to neuroscientists. In this comparative study, aminated (polylysine/polyornithine and aminosilanes) and cytophobic (poly(ethylene glycol) (PEG) and methylated) material contrasts were evaluated. Backfilling plasma stencilled PEGylated substrates with polylysine does not produce good material contrasts, whereas polylysine patterned on methylated substrates becomes mobilised by agents in the cell culture media which results in rapid pattern decay. Aminosilanes, polylysine substitutes, are prone to hydrolysis and the chemistries prove challenging to master. Instead, the stable coupling between polylysine and PLL-g-PEG can be exploited: Microcontact printing polylysine onto a PLL-g-PEG coated glass substrate provides a simple means to produce microstructured networks of primary neurons that have superior pattern compliance during long term (>1 month) culture. PMID:24855658

  4. Numerical methods for solving moment equations in kinetic theory of neuronal network dynamics

    NASA Astrophysics Data System (ADS)

    Rangan, Aaditya V.; Cai, David; Tao, Louis

    2007-02-01

    Recently developed kinetic theory and related closures for neuronal network dynamics have been demonstrated to be a powerful theoretical framework for investigating coarse-grained dynamical properties of neuronal networks. The moment equations arising from the kinetic theory are a system of (1 + 1)-dimensional nonlinear partial differential equations (PDE) on a bounded domain with nonlinear boundary conditions. The PDEs themselves are self-consistently specified by parameters which are functions of the boundary values of the solution. The moment equations can be stiff in space and time. Numerical methods are presented here for efficiently and accurately solving these moment equations. The essential ingredients in our numerical methods include: (i) the system is discretized in time with an implicit Euler method within a spectral deferred correction framework, therefore, the PDEs of the kinetic theory are reduced to a sequence, in time, of boundary value problems (BVPs) with nonlinear boundary conditions; (ii) a set of auxiliary parameters is introduced to recast the original BVP with nonlinear boundary conditions as BVPs with linear boundary conditions - with additional algebraic constraints on the auxiliary parameters; (iii) a careful combination of two Newton's iterates for the nonlinear BVP with linear boundary condition, interlaced with a Newton's iterate for solving the associated algebraic constraints is constructed to achieve quadratic convergence for obtaining the solutions with self-consistent parameters. It is shown that a simple fixed-point iteration can only achieve a linear convergence for the self-consistent parameters. The practicability and efficiency of our numerical methods for solving the moment equations of the kinetic theory are illustrated with numerical examples. It is further demonstrated that the moment equations derived from the kinetic theory of neuronal network dynamics can very well capture the coarse-grained dynamical properties of
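
    The convergence claim above, quadratic for the Newton-based treatment of the self-consistent parameters versus linear for a naive fixed-point iteration, can be illustrated on a scalar stand-in problem; the map g below is only a placeholder, not the kinetic-theory parameter map.

    ```python
    import numpy as np

    # Solve the self-consistency condition x = g(x), here with g(x) = cos(x),
    # by fixed-point iteration and by Newton's method on f(x) = x - g(x).
    g = np.cos
    x_star = 0.7390851332151607            # fixed point of cos, for measuring errors

    x = 1.0
    print("fixed-point errors:", end=" ")
    for _ in range(6):
        x = g(x)                           # linear convergence
        print(f"{abs(x - x_star):.1e}", end=" ")

    x = 1.0
    print("\nNewton errors:     ", end=" ")
    for _ in range(6):
        x -= (x - g(x)) / (1.0 + np.sin(x))   # Newton step, f'(x) = 1 + sin(x): quadratic convergence
        print(f"{abs(x - x_star):.1e}", end=" ")
    print()
    ```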

  5. Spontaneous Calcium Changes in Micro Neuronal Networks

    NASA Astrophysics Data System (ADS)

    Saito, Aki; Moriguchi, Hiroyuki; Iwabuchi, Shin; Goto, Miho; Takayama, Yuzo; Kotani, Kiyoshi; Jimbo, Yasuhiko

    We have developed a practical experimental method to mass-produce and maintain variants of minimal neuronal networks (“micro neuronal networks”) consisting of one to several neurons in culture, using a spray-patterning technique. In this work, the micro-cultures were maintained for one month or more by adding conditioned medium, and spontaneous activity in the micro neuronal networks was recorded optically to examine the interactions between them. To determine these interactions, fluorescence changes in several small networks were measured simultaneously using the calcium indicator dye fluo-4 AM, and time-series analysis was carried out using surrogate data. The spray-patterning method produced a large number of cell-adhesive micro regions. Neurons extended neurites along the edges of these regions and formed micro neuronal networks. In some regions, neurites protruded beyond the region boundary, so that neighbouring micro networks became synaptically connected. In these networks, network activity induced by a single neuron was observed. On the other hand, synchronous oscillations were also observed between micro neuronal networks that were not morphologically connected. Our micro-patterning methods and results raise the possibility that synchronous activity occurs between morphologically non-connected neuronal networks, suggesting that humoral factors are also an important component of network-wide dynamics.
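
    The abstract does not specify which surrogate construction was used; a common choice for testing zero-lag coupling between two fluorescence traces is phase-randomized surrogates, sketched below on synthetic data rather than the recordings described above.

    ```python
    import numpy as np

    # Compare the observed correlation of two noisy traces sharing a slow component
    # against a null distribution built from phase-randomized surrogates of one trace.
    rng = np.random.default_rng(0)
    n = 2048
    common = np.convolve(rng.standard_normal(n), np.ones(20) / 20, mode="same")
    x = common + 0.5 * rng.standard_normal(n)
    y = common + 0.5 * rng.standard_normal(n)

    def phase_randomize(sig, rng):
        spec = np.fft.rfft(sig)
        phases = rng.uniform(0, 2 * np.pi, spec.size)
        phases[0] = 0.0                            # keep the mean component untouched
        return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=sig.size)

    obs = np.corrcoef(x, y)[0, 1]
    null = [np.corrcoef(phase_randomize(x, rng), y)[0, 1] for _ in range(200)]
    p_value = np.mean(np.abs(null) >= abs(obs))
    print(f"observed r = {obs:.2f}, surrogate p ≈ {p_value:.3f}")
    ```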

  6. Control of Neuronal Network in Caenorhabditis elegans

    PubMed Central

    Badhwar, Rahul; Bagler, Ganesh

    2015-01-01

    Caenorhabditis elegans, a soil-dwelling nematode, is evolutionarily rudimentary and contains only ∼300 neurons, which are connected to each other via chemical synapses and gap junctions. This structural connectivity can be perceived as the nodes and edges of a graph. Controlling complex networked systems (such as the nervous system) has long been an area of great interest. Various methods have been developed to identify specific brain regions which, when controlled by external input, can lead to control over the state of the system. In the case of the neuronal connectivity network, however, the properties of the neurons identified as driver nodes are of particular importance, because the nervous system can produce a variety of states (behaviours of the animal). Hence, to gain insight into the type of control achieved in the nervous system, we applied the notion of structural control from graph theory to the C. elegans neuronal network. We identified ‘driver neurons’ which can provide full control over the network. We studied the phenotypic properties of these neurons, referred to as the ‘phenoframe’, as well as the ‘genoframe’, which represents their genetic correlates. We find that the driver neurons are primarily motor neurons located in the ventral nerve cord and contribute to the biological reproduction of the animal. The identification and characterization of driver neurons add a new dimension to the controllability of the C. elegans neuronal network. This study suggests the importance of driver neurons and their utility in controlling the behaviour of the organism. PMID:26413834
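
    The abstract does not state the algorithm used to identify driver neurons; in the structural-controllability literature they are commonly obtained as the unmatched nodes of a maximum matching on the network's bipartite representation. The sketch below applies that construction to a random directed graph standing in for the connectome.

    ```python
    import networkx as nx
    from networkx.algorithms import bipartite

    def driver_nodes(G):
        """Driver nodes = nodes left unmatched (on the 'in' side) by a maximum matching."""
        B = nx.Graph()
        out_copies = [("out", u) for u in G]
        in_copies = [("in", u) for u in G]
        B.add_nodes_from(out_copies, bipartite=0)
        B.add_nodes_from(in_copies, bipartite=1)
        B.add_edges_from((("out", u), ("in", v)) for u, v in G.edges())
        matching = bipartite.hopcroft_karp_matching(B, top_nodes=out_copies)
        matched = {u for kind, u in matching if kind == "in"}      # matched 'in' copies
        drivers = set(G) - matched
        return drivers or {next(iter(G))}                          # at least one driver is always needed

    # Toy directed network standing in for a connectome.
    G = nx.gnp_random_graph(30, 0.08, seed=1, directed=True)
    print(sorted(driver_nodes(G)))
    ```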

  7. A comparison of computational methods for detecting bursts in neuronal spike trains and their application to human stem cell-derived neuronal networks

    PubMed Central

    Cotterill, Ellese; Charlesworth, Paul; Thomas, Christopher W.; Paulsen, Ole; Eglen, Stephen J.

    2016-01-01

    Accurate identification of bursting activity is an essential element in the characterization of neuronal network activity. Despite this, no one technique for identifying bursts in spike trains has been widely adopted. Instead, many methods have been developed for the analysis of bursting activity, often on an ad hoc basis. Here we provide an unbiased assessment of the effectiveness of eight of these methods at detecting bursts in a range of spike trains. We suggest a list of features that an ideal burst detection technique should possess and use synthetic data to assess each method in regard to these properties. We further employ each of the methods to reanalyze microelectrode array (MEA) recordings from mouse retinal ganglion cells and examine their coherence with bursts detected by a human observer. We show that several common burst detection techniques perform poorly at analyzing spike trains with a variety of properties. We identify four promising burst detection techniques, which are then applied to MEA recordings of networks of human induced pluripotent stem cell-derived neurons and used to describe the ontogeny of bursting activity in these networks over several months of development. We conclude that no current method can provide “perfect” burst detection results across a range of spike trains; however, two burst detection techniques, the MaxInterval and logISI methods, outperform the others. We provide recommendations for the robust analysis of bursting activity in experimental recordings using current techniques. PMID:27098024

  8. A comparison of computational methods for detecting bursts in neuronal spike trains and their application to human stem cell-derived neuronal networks.

    PubMed

    Cotterill, Ellese; Charlesworth, Paul; Thomas, Christopher W; Paulsen, Ole; Eglen, Stephen J

    2016-08-01

    Accurate identification of bursting activity is an essential element in the characterization of neuronal network activity. Despite this, no one technique for identifying bursts in spike trains has been widely adopted. Instead, many methods have been developed for the analysis of bursting activity, often on an ad hoc basis. Here we provide an unbiased assessment of the effectiveness of eight of these methods at detecting bursts in a range of spike trains. We suggest a list of features that an ideal burst detection technique should possess and use synthetic data to assess each method in regard to these properties. We further employ each of the methods to reanalyze microelectrode array (MEA) recordings from mouse retinal ganglion cells and examine their coherence with bursts detected by a human observer. We show that several common burst detection techniques perform poorly at analyzing spike trains with a variety of properties. We identify four promising burst detection techniques, which are then applied to MEA recordings of networks of human induced pluripotent stem cell-derived neurons and used to describe the ontogeny of bursting activity in these networks over several months of development. We conclude that no current method can provide "perfect" burst detection results across a range of spike trains; however, two burst detection techniques, the MaxInterval and logISI methods, outperform the others. We provide recommendations for the robust analysis of bursting activity in experimental recordings using current techniques. PMID:27098024
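
    A minimal ISI-threshold detector in the spirit of the MaxInterval method (simplified to a single inter-spike-interval threshold and a minimum spike count, not the published multi-parameter version) can be written as follows.

    ```python
    import numpy as np

    def detect_bursts(spike_times, max_isi=0.1, min_spikes=3):
        """Return (start, end) times of bursts: runs of spikes with ISIs <= max_isi."""
        spike_times = np.asarray(spike_times, dtype=float)
        if spike_times.size == 0:
            return []
        bursts, current = [], [0]
        for i in range(1, spike_times.size):
            if spike_times[i] - spike_times[i - 1] <= max_isi:
                current.append(i)                          # spike continues the current run
            else:
                if len(current) >= min_spikes:
                    bursts.append((float(spike_times[current[0]]), float(spike_times[current[-1]])))
                current = [i]                              # start a new candidate run
        if len(current) >= min_spikes:
            bursts.append((float(spike_times[current[0]]), float(spike_times[current[-1]])))
        return bursts

    spikes = [0.01, 0.05, 0.08, 0.12, 0.9, 1.4, 1.45, 1.5, 1.52, 3.0]
    print(detect_bursts(spikes))                           # -> [(0.01, 0.12), (1.4, 1.52)]
    ```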

  9. Experiments on clustered neuronal networks

    NASA Astrophysics Data System (ADS)

    Teller, S.; Soriano, J.

    2013-01-01

    Neuronal cultures show a rich repertoire of spontaneous activity. However, the mechanisms that relate a particular network architecture to a specific dynamic behavior are still not well understood. In order to investigate the dependence of neuronal network dynamics on architecture, we study spontaneous activity in networks formed by interconnected aggregates of neurons (clustered neuronal networks). In the experiments we monitor the spontaneous activity using calcium fluorescence imaging. The network's firing is characterized by bursts of activity, in which the clusters fire sequentially within a short time window and then remain silent until the next bursting episode. We also investigate perturbations of the network's connectivity, focusing mainly on physical damage. In some cases we observe important changes in the collective activity of the network, while in other cases some dynamic motifs are preserved, hinting at the existence of dynamic robustness.

  10. Simulating synchronization in neuronal networks

    NASA Astrophysics Data System (ADS)

    Fink, Christian G.

    2016-06-01

    We discuss several techniques used in simulating neuronal networks by exploring how a network's connectivity structure affects its propensity for synchronous spiking. Network connectivity is generated using the Watts-Strogatz small-world algorithm, and two key measures of network structure are described. These measures quantify structural characteristics that influence collective neuronal spiking, which is simulated using the leaky integrate-and-fire model. Simulations show that adding a small number of random connections to an otherwise lattice-like connectivity structure leads to a dramatic increase in neuronal synchronization.
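
    The network-generation step can be sketched in a few lines: Watts-Strogatz graphs for several rewiring probabilities, together with the clustering coefficient and characteristic path length (presumably the two structural measures referred to above); the leaky integrate-and-fire simulation itself is omitted here.

    ```python
    import networkx as nx

    # Small-world regime: a few random shortcuts shorten paths while clustering stays high.
    n, k = 200, 8                                   # nodes, neighbours per node (illustrative sizes)
    for p in (0.0, 0.01, 0.1, 1.0):
        G = nx.connected_watts_strogatz_graph(n, k, p, seed=1)
        C = nx.average_clustering(G)
        L = nx.average_shortest_path_length(G)
        print(f"p={p:<5} clustering={C:.3f} mean path length={L:.2f}")
    ```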

  11. Nanometric resolution magnetic resonance imaging methods for mapping functional activity in neuronal networks

    PubMed Central

    Boretti, Albert; Castelletto, Stefania

    2016-01-01

    This contribution highlights and compares some recent achievements in the use of k-space and real-space imaging (scanning probe and wide-field microscopy techniques) when applied to a luminescent color center in diamond, known as the nitrogen vacancy (NV) center. These techniques, combined with the optically detected magnetic resonance of the NV center, provide a unique platform to achieve nanometric magnetic resonance imaging (MRI) resolution of nearby nuclear spins (known as nanoMRI) and nanometric NV real-space localization. • Atomic-size optically detectable spin probe. • High magnetic field sensitivity and nanometric resolution. • Non-invasive mapping of functional activity in neuronal networks. PMID:27144128

  12. Nanometric resolution magnetic resonance imaging methods for mapping functional activity in neuronal networks.

    PubMed

    Boretti, Albert; Castelletto, Stefania

    2016-01-01

    This contribution highlights and compares some recent achievements in the use of k-space and real-space imaging (scanning probe and wide-field microscopy techniques) when applied to a luminescent color center in diamond, known as the nitrogen vacancy (NV) center. These techniques, combined with the optically detected magnetic resonance of the NV center, provide a unique platform to achieve nanometric magnetic resonance imaging (MRI) resolution of nearby nuclear spins (known as nanoMRI) and nanometric NV real-space localization. • Atomic-size optically detectable spin probe. • High magnetic field sensitivity and nanometric resolution. • Non-invasive mapping of functional activity in neuronal networks. PMID:27144128

  13. Network synchronization in hippocampal neurons.

    PubMed

    Penn, Yaron; Segal, Menahem; Moses, Elisha

    2016-03-22

    Oscillatory activity is widespread in dynamic neuronal networks. The main paradigm for the origin of periodicity consists of specialized pacemaking elements that synchronize and drive the rest of the network; however, other models exist. Here, we studied the spontaneous emergence of synchronized periodic bursting in a network of cultured dissociated neurons from rat hippocampus and cortex. Surprisingly, about 60% of all active neurons were self-sustained oscillators when disconnected, each with its own natural frequency. The individual neuron's tendency to oscillate and the corresponding oscillation frequency are controlled by its excitability. The single neuron intrinsic oscillations were blocked by riluzole, and are thus dependent on persistent sodium leak currents. Upon a gradual retrieval of connectivity, the synchrony evolves: Loose synchrony appears already at weak connectivity, with the oscillators converging to one common oscillation frequency, yet shifted in phase across the population. Further strengthening of the connectivity causes a reduction in the mean phase shifts until zero-lag is achieved, manifested by synchronous periodic network bursts. Interestingly, the frequency of network bursting matches the average of the intrinsic frequencies. Overall, the network behaves like other universal systems, where order emerges spontaneously by entrainment of independent rhythmic units. Although simplified with respect to circuitry in the brain, our results attribute a basic functional role for intrinsic single neuron excitability mechanisms in driving the network's activity and dynamics, contributing to our understanding of developing neural circuits. PMID:26961000

  14. Automatic neuron segmentation and neural network analysis method for phase contrast microscopy images

    PubMed Central

    Pang, Jincheng; Özkucur, Nurdan; Ren, Michael; Kaplan, David L.; Levin, Michael; Miller, Eric L.

    2015-01-01

    Phase Contrast Microscopy (PCM) is an important tool for the long term study of living cells. Unlike fluorescence methods which suffer from photobleaching of fluorophore or dye molecules, PCM image contrast is generated by the natural variations in optical index of refraction. Unfortunately, the same physical principles which allow for these studies give rise to complex artifacts in the raw PCM imagery. Of particular interest in this paper are neuron images where these image imperfections manifest in very different ways for the two structures of specific interest: cell bodies (somas) and dendrites. To address these challenges, we introduce a novel parametric image model using the level set framework and an associated variational approach which simultaneously restores and segments this class of images. Using this technique as the basis for an automated image analysis pipeline, results for both the synthetic and real images validate and demonstrate the advantages of our approach. PMID:26601004

  15. Stages of neuronal network formation

    NASA Astrophysics Data System (ADS)

    Woiterski, Lydia; Claudepierre, Thomas; Luxenhofer, Robert; Jordan, Rainer; Käs, Josef A.

    2013-02-01

    Graph theoretical approaches have become a powerful tool for investigating the architecture and dynamics of complex networks. The topology of network graphs has revealed small-world properties for very different real systems, among them neuronal networks. In this study, we observed the early development of mouse retinal ganglion cell (RGC) networks in vitro using time-lapse video microscopy. By means of a time-resolved graph theoretical analysis of the connectivity, shortest path length and edge length, we were able to identify the different stages of network formation. Starting from single cells, in the first stage neurons connected to each other, ending up in a network of maximum complexity. Subsequently, we observed a simplification of the network, manifested in changes of relevant network parameters such as a minimization of the path length. Moreover, we found that RGC networks self-organized as small-world networks at both stages; however, the optimization occurred only in the second stage.

  16. Parallel Network Simulations with NEURON

    PubMed Central

    Migliore, M.; Cannia, C.; Lytton, W. W.; Markram, Henry; Hines, M. L.

    2009-01-01

    The NEURON simulation environment has been extended to support parallel network simulations. Each processor integrates the equations for its subnet over an interval equal to the minimum (interprocessor) presynaptic spike generation to postsynaptic spike delivery connection delay. The performance of three published network models with very different spike patterns exhibits superlinear speedup on Beowulf clusters and demonstrates that spike communication overhead is often less than the benefit of an increased fraction of the entire problem fitting into high speed cache. On the EPFL IBM Blue Gene, almost linear speedup was obtained up to 100 processors. Increasing one model from 500 to 40,000 realistic cells exhibited almost linear speedup on 2000 processors, with an integration time of 9.8 seconds and communication time of 1.3 seconds. The potential for speed-ups of several orders of magnitude makes practical the running of large network simulations that could otherwise not be explored. PMID:16732488
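
    A minimal skeleton of the standard NEURON ParallelContext idiom described above, with one artificial cell per rank connected in a ring, might look as follows; it is not one of the published models, and all parameters are placeholders.

    ```python
    # Typical invocation (details depend on the installation):
    #   mpiexec -n 4 nrniv -python -mpi ring.py
    from neuron import h

    pc = h.ParallelContext()
    rank, nhost = int(pc.id()), int(pc.nhost())

    gid = rank                                 # one integrate-and-fire cell per rank
    cell = h.IntFire1()
    cell.tau, cell.refrac = 10, 5
    pc.set_gid2node(gid, rank)                 # register the global id on this rank
    nc_out = h.NetCon(cell, None)
    pc.cell(gid, nc_out)                       # declare the cell's spike source

    src = (gid - 1) % nhost                    # ring connectivity across ranks
    nc_in = pc.gid_connect(src, cell)
    nc_in.weight[0], nc_in.delay = 1.1, 2.0    # delay bounds the spike-exchange interval

    if rank == 0:                              # kick the ring once
        stim = h.NetStim(); stim.number, stim.start = 1, 1
        nc_stim = h.NetCon(stim, cell); nc_stim.weight[0], nc_stim.delay = 1.1, 0.1

    pc.set_maxstep(10)                         # max integration interval between exchanges
    h.stdinit()
    pc.psolve(100)                             # run to 100 ms with parallel spike exchange
    pc.barrier(); pc.done(); h.quit()
    ```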

  17. Shaping Neuronal Network Activity by Presynaptic Mechanisms

    PubMed Central

    Ashery, Uri

    2015-01-01

    Neuronal microcircuits generate oscillatory activity, which has been linked to basic functions such as sleep, learning and sensorimotor gating. Although synaptic release processes are well known for their ability to shape the interaction between neurons in microcircuits, most computational models do not simulate the synaptic transmission process directly and hence cannot explain how changes in synaptic parameters alter neuronal network activity. In this paper, we present a novel neuronal network model that incorporates presynaptic release mechanisms, such as vesicle pool dynamics and calcium-dependent release probability, to model the spontaneous activity of neuronal networks. The model, which is based on modified leaky integrate-and-fire neurons, generates spontaneous network activity patterns, which are similar to experimental data and robust under changes in the model's primary gain parameters such as excitatory postsynaptic potential and connectivity ratio. Furthermore, it reliably recreates experimental findings and provides mechanistic explanations for data obtained from microelectrode array recordings, such as network burst termination and the effects of pharmacological and genetic manipulations. The model demonstrates how elevated asynchronous release, but not spontaneous release, synchronizes neuronal network activity and reveals that asynchronous release enhances utilization of the recycling vesicle pool to induce the network effect. The model further predicts a positive correlation between vesicle priming at the single-neuron level and burst frequency at the network level; this prediction is supported by experimental findings. Thus, the model is utilized to reveal how synaptic release processes at the neuronal level govern activity patterns and synchronization at the network level. PMID:26372048
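
    A stripped-down sketch of the modelling ingredients named above: a leaky integrate-and-fire unit driven through a synapse with a finite, recovering vesicle pool. The release fraction is held constant here (the paper's calcium-dependent release probability and pool structure are omitted), and all parameter values are assumptions.

    ```python
    import numpy as np

    dt, T = 0.1, 1000.0                  # ms
    tau_m, v_th, v_reset = 20.0, 1.0, 0.0
    tau_rec, U, w = 300.0, 0.4, 3.0      # pool recovery time, release fraction, synaptic weight
    rate_hz = 20.0                       # presynaptic Poisson rate

    rng = np.random.default_rng(0)
    v, R, post_spikes = 0.0, 1.0, []
    for step in range(int(T / dt)):
        I = 0.0
        if rng.random() < rate_hz * dt / 1000.0:     # presynaptic spike arrives
            released = U * R                         # fraction of the vesicle pool released
            R -= released
            I = w * released                         # depressed EPSP amplitude
        R += dt * (1.0 - R) / tau_rec                # slow pool recovery
        v += dt * (-v / tau_m) + I
        if v >= v_th:
            post_spikes.append(step * dt)
            v = v_reset
    print(f"{len(post_spikes)} postsynaptic spikes in {T:.0f} ms")
    ```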

  18. Neuronal Networks on Nanocellulose Scaffolds.

    PubMed

    Jonsson, Malin; Brackmann, Christian; Puchades, Maja; Brattås, Karoline; Ewing, Andrew; Gatenholm, Paul; Enejder, Annika

    2015-11-01

    Proliferation, integration, and neurite extension of PC12 cells, a widely used culture model for cholinergic neurons, were studied in nanocellulose scaffolds biosynthesized by Gluconacetobacter xylinus to allow a three-dimensional (3D) extension of neurites better mimicking neuronal networks in tissue. The interaction with control scaffolds was compared with cationized nanocellulose (trimethyl ammonium betahydroxy propyl [TMAHP] cellulose) to investigate the impact of surface charges on the cell interaction mechanisms. Furthermore, coatings with extracellular matrix proteins (collagen, fibronectin, and laminin) were investigated to determine the importance of integrin-mediated cell attachment. Cell proliferation was evaluated by a cellular proliferation assay, while cell integration and neurite propagation were studied by simultaneous label-free Coherent anti-Stokes Raman Scattering and second harmonic generation microscopy, providing 3D images of PC12 cells and arrangement of nanocellulose fibrils, respectively. Cell attachment and proliferation were enhanced by TMAHP modification, but not by protein coating. Protein coating instead promoted active interaction between the cells and the scaffold, hence lateral cell migration and integration. Irrespective of surface modification, deepest cell integration measured was one to two cell layers, whereas neurites have a capacity to integrate deeper than the cell bodies in the scaffold due to their fine dimensions and amoeba-like migration pattern. Neurites with lengths of >50 μm were observed, successfully connecting individual cells and cell clusters. In conclusion, TMAHP-modified nanocellulose scaffolds promote initial cellular scaffold adhesion, which combined with additional cell-scaffold treatments enables further formation of 3D neuronal networks. PMID:26398224

  19. Solving Constraint Satisfaction Problems with Networks of Spiking Neurons

    PubMed Central

    Jonke, Zeno; Habenschuss, Stefan; Maass, Wolfgang

    2016-01-01

    Networks of neurons in the brain apply—unlike processors in our current generation of computer hardware—an event-based processing strategy, where short pulses (spikes) are emitted sparsely by neurons to signal the occurrence of an event at a particular point in time. Such spike-based computations promise to be substantially more power-efficient than traditional clocked processing schemes. However, it turns out to be surprisingly difficult to design networks of spiking neurons that can solve difficult computational problems on the level of single spikes, rather than rates of spikes. We present here a new method for designing networks of spiking neurons via an energy function. Furthermore, we show how the energy function of a network of stochastically firing neurons can be shaped in a transparent manner by composing the networks from simple stereotypical network motifs. We show that this design approach enables networks of spiking neurons to produce approximate solutions to difficult (NP-hard) constraint satisfaction problems from the domains of planning/optimization and verification/logical inference. The resulting networks employ noise as a computational resource. Nevertheless, the timing of spikes plays an essential role in their computations. Furthermore, networks of spiking neurons carry out a more efficient stochastic search for good solutions to the Traveling Salesman Problem than stochastic artificial neural networks (Boltzmann machines) and Gibbs sampling. PMID:27065785

  20. Solving Constraint Satisfaction Problems with Networks of Spiking Neurons.

    PubMed

    Jonke, Zeno; Habenschuss, Stefan; Maass, Wolfgang

    2016-01-01

    Networks of neurons in the brain, unlike processors in our current generation of computer hardware, apply an event-based processing strategy, where short pulses (spikes) are emitted sparsely by neurons to signal the occurrence of an event at a particular point in time. Such spike-based computations promise to be substantially more power-efficient than traditional clocked processing schemes. However, it turns out to be surprisingly difficult to design networks of spiking neurons that can solve difficult computational problems on the level of single spikes, rather than rates of spikes. We present here a new method for designing networks of spiking neurons via an energy function. Furthermore, we show how the energy function of a network of stochastically firing neurons can be shaped in a transparent manner by composing the networks from simple stereotypical network motifs. We show that this design approach enables networks of spiking neurons to produce approximate solutions to difficult (NP-hard) constraint satisfaction problems from the domains of planning/optimization and verification/logical inference. The resulting networks employ noise as a computational resource. Nevertheless, the timing of spikes plays an essential role in their computations. Furthermore, networks of spiking neurons carry out a more efficient stochastic search for good solutions to the Traveling Salesman Problem than stochastic artificial neural networks (Boltzmann machines) and Gibbs sampling. PMID:27065785
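
    A non-spiking caricature of the energy-function idea: minimise the number of violated constraints of a toy 3-colouring problem by noisy, annealed search. The authors' networks implement this kind of search with stochastically firing spiking neurons; the Metropolis-style updates below are only a stand-in for that mechanism, and the problem instance is invented.

    ```python
    import math
    import random

    edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 0), (1, 4)]   # toy graph to 3-colour
    n_nodes, n_colours = 5, 3

    def energy(state):
        return sum(state[u] == state[v] for u, v in edges)             # violated constraints

    random.seed(2)
    state = [random.randrange(n_colours) for _ in range(n_nodes)]
    temperature = 2.0
    for step in range(5000):
        i = random.randrange(n_nodes)
        proposal = list(state)
        proposal[i] = random.randrange(n_colours)
        dE = energy(proposal) - energy(state)
        if dE <= 0 or random.random() < math.exp(-dE / temperature):
            state = proposal                                           # noise as a computational resource
        temperature = max(0.05, temperature * 0.999)                   # anneal the noise level
    print("violations:", energy(state), "colouring:", state)
    ```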

  21. Robust Multiobjective Controllability of Complex Neuronal Networks.

    PubMed

    Tang, Yang; Gao, Huijun; Du, Wei; Lu, Jianquan; Vasilakos, Athanasios V; Kurths, Jurgen

    2016-01-01

    This paper addresses robust multiobjective identification of driver nodes in the neuronal network of a cat's brain, in which uncertainties in determination of driver nodes and control gains are considered. A framework for robust multiobjective controllability is proposed by introducing interval uncertainties and optimization algorithms. By appropriate definitions of robust multiobjective controllability, a robust nondominated sorting adaptive differential evolution (NSJaDE) is presented by means of the nondominated sorting mechanism and the adaptive differential evolution (JaDE). The simulation experimental results illustrate the satisfactory performance of NSJaDE for robust multiobjective controllability, in comparison with six statistical methods and two multiobjective evolutionary algorithms (MOEAs): nondominated sorting genetic algorithms II (NSGA-II) and nondominated sorting composite differential evolution. It is revealed that the existence of uncertainties in choosing driver nodes and designing control gains heavily affects the controllability of neuronal networks. We also unveil that driver nodes play a more drastic role than control gains in robust controllability. The developed NSJaDE and obtained results will shed light on the understanding of robustness in controlling realistic complex networks such as transportation networks, power grid networks, biological networks, etc. PMID:26441452

  22. Macroscopic Description for Networks of Spiking Neurons

    NASA Astrophysics Data System (ADS)

    Montbrió, Ernest; Pazó, Diego; Roxin, Alex

    2015-04-01

    A major goal of neuroscience, statistical physics, and nonlinear dynamics is to understand how brain function arises from the collective dynamics of networks of spiking neurons. This challenge has been chiefly addressed through large-scale numerical simulations. Alternatively, researchers have formulated mean-field theories to gain insight into macroscopic states of large neuronal networks in terms of the collective firing activity of the neurons, or the firing rate. However, these theories have not succeeded in establishing an exact correspondence between the firing rate of the network and the underlying microscopic state of the spiking neurons. This has largely constrained the range of applicability of such macroscopic descriptions, particularly when trying to describe neuronal synchronization. Here, we provide the derivation of a set of exact macroscopic equations for a network of spiking neurons. Our results reveal that the spike generation mechanism of individual neurons introduces an effective coupling between two biophysically relevant macroscopic quantities, the firing rate and the mean membrane potential, which together govern the evolution of the neuronal network. The resulting equations exactly describe all possible macroscopic dynamical states of the network, including states of synchronous spiking activity. Finally, we show that the firing-rate description is related, via a conformal map, to a low-dimensional description in terms of the Kuramoto order parameter, called Ott-Antonsen theory. We anticipate that our results will be an important tool in investigating how large networks of spiking neurons self-organize in time to process and encode information in the brain.
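
    For reference, the macroscopic equations derived in this work are usually quoted in the following form (for a network of quadratic integrate-and-fire neurons with Lorentzian-distributed excitability of centre η̄ and half-width Δ, recurrent coupling J and common input I(t); these details are taken from the published paper rather than the abstract above):

    ```latex
    \tau \dot{r} = \frac{\Delta}{\pi \tau} + 2 r v , \qquad
    \tau \dot{v} = v^{2} + \bar{\eta} + J \tau r + I(t) - \left( \pi \tau r \right)^{2}
    ```

    Here r(t) is the population firing rate and v(t) the mean membrane potential; their joint evolution reproduces both the asynchronous and the synchronized regimes discussed above.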

  23. Adaptive Neurons For Artificial Neural Networks

    NASA Technical Reports Server (NTRS)

    Tawel, Raoul

    1990-01-01

    Training time decreases dramatically. In an improved mathematical model of a neural-network processor, the temperature of the neurons (in addition to the connection strengths, also called weights, of the synapses) is varied during the supervised-learning phase of operation according to a mathematical formalism rather than a heuristic rule. There is evidence that biological neural networks also process information at the neuronal level.

  24. Vehicle dynamic analysis using neuronal network algorithms

    NASA Astrophysics Data System (ADS)

    Oloeriu, Florin; Mocian, Oana

    2014-06-01

    Theoretical developments in several engineering areas, the emergence of better and more precise investigation tools, and their implementation on board everyday vehicles are the main factors influencing the theoretical and experimental study of vehicle dynamic behavior. The implementation of these new technologies in vehicle construction has led to increasingly complex systems. Some of the most important, such as electronic control of the engine, transmission, suspension, steering, braking and traction, have had a positive impact on the vehicle's dynamic behavior. On-board CPUs allow data acquisition and storage, leading to more accurate experimental and theoretical studies of vehicle dynamics that use the information offered directly by the elements of the electronic control systems already built into the vehicle. The technical literature on vehicle dynamics is focused almost entirely on parametric analysis. This kind of approach adopts two simplifying assumptions: that the functional parameters obey distribution laws known from classical statistics, and that the mathematical models are known in advance, with coefficients that are not time-dependent. Neither assumption is confirmed in real situations: the functional parameters do not follow any known statistical distribution, and the mathematical models are not known in advance, contain families of parameters, and are mostly time-dependent. The purpose of this paper is to present a more accurate analysis methodology for studying vehicle dynamic behavior. A method for establishing non-parametric mathematical models of vehicle dynamic behavior relies on neuronal networks; such models have coefficients that are time-dependent. Neuronal networks are mostly used in various types of system control, thus

  25. Inferring Single Neuron Properties in Conductance Based Balanced Networks

    PubMed Central

    Pool, Román Rossi; Mato, Germán

    2011-01-01

    Balanced states in large networks are a common hypothesis for explaining the variability of neural activity in cortical systems. In this regime the statistics of the inputs are characterized by static and dynamic fluctuations. The dynamic fluctuations have a Gaussian distribution. Such statistics allow the use of reverse correlation methods, by recording synaptic inputs and the spike trains of ongoing spontaneous activity without any additional input. By using this method, properties of the single neuron dynamics that are masked by the balanced state can be quantified. To show the feasibility of this approach we apply it to large networks of conductance-based neurons. The networks are classified as Type I or Type II according to the bifurcations which neurons of the different populations undergo near the firing onset. We also analyze mixed networks, in which each population has a mixture of different neuronal types. We determine under which conditions the intrinsic noise generated by the network can be used to apply reverse correlation methods. We find that under realistic conditions we can ascertain with low error the types of neurons present in the network. We also find that data from neurons with similar firing rates can be combined to perform covariance analysis. We compare the results of these methods (that do not require any external input) to the standard procedure (that requires the injection of Gaussian noise into a single neuron). We find a good agreement between the two procedures. PMID:22016730
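
    The reverse-correlation step can be sketched on synthetic data: a spike-triggered average of a Gaussian input computed from ongoing activity alone. The toy threshold unit below stands in for the conductance-based neurons of the paper, and all constants are illustrative.

    ```python
    import numpy as np

    dt, T, tau = 0.001, 200.0, 0.02             # s; membrane-like filter time constant
    n = int(T / dt)
    rng = np.random.default_rng(0)
    I = rng.normal(0.0, 1.0, n)                 # stand-in for the fluctuating synaptic input

    v, spikes = 0.0, []
    for t in range(1, n):                       # leaky integration of the input
        v += dt * (-v + I[t]) / tau
        if v > 0.4:                             # threshold crossing = spike, then reset
            spikes.append(t)
            v = 0.0

    window = int(0.05 / dt)                     # average the 50 ms of input preceding each spike
    sta = np.mean([I[t - window:t] for t in spikes if t >= window], axis=0)
    print("spikes:", len(spikes), "| STA peak (a.u.):", round(float(sta.max()), 3))
    ```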

  26. Slow waves in mutually inhibitory neuronal networks

    NASA Astrophysics Data System (ADS)

    Jalics, Jozsi

    2004-05-01

    A variety of experimental and modeling studies have been performed to investigate wave propagation in networks of thalamic neurons and their relationship to spindle sleep rhythms. It is believed that spindle oscillations result from the reciprocal interaction between thalamocortical (TC) and thalamic reticular (RE) neurons. We consider a network of TC and RE cells reduced to a one-layer network model and represented by a system of singularly perturbed integral-differential equations. Geometric singular perturbation methods are used to prove the existence of a locally unique slow wave pulse that propagates along the network. By seeking a slow pulse solution, we reformulate the problem to finding a heteroclinic orbit in a 3D system of ODEs with two additional constraints on the location of the orbit at two distinct points in time. In proving the persistence of the singular heteroclinic orbit, difficulties arising from the solution passing near points where normal hyperbolicity is lost on a 2D critical manifold are overcome by employing results by Wechselberger [Singularly perturbed folds and canards in R3, Thesis, TU-Wien, 1998].

  27. Stochastic resonance in mammalian neuronal networks

    SciTech Connect

    Gluckman, B.J.; So, P.; Netoff, T.I.; Spano, M.L.; Schiff, S.J.

    1998-09-01

    We present stochastic resonance observed in the dynamics of neuronal networks from mammalian brain. Both sinusoidal signals and random noise were superimposed onto an applied electric field. As the amplitude of the noise component was increased, an optimization (increase then decrease) in the signal-to-noise ratio of the network response to the sinusoidal signal was observed. The relationship between the measures used to characterize the dynamics is discussed. Finally, a computational model of these neuronal networks that includes the neuronal interactions with the electric field is presented to illustrate the physics behind the essential features of the experiment. © 1998 American Institute of Physics.
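
    The essential effect can be reproduced in a few lines: a threshold detector driven by a subthreshold sinusoid plus noise, whose output power at the signal frequency typically rises and then falls as the noise amplitude increases. All values below are illustrative, not the experimental parameters.

    ```python
    import numpy as np

    dt, T, f = 1e-3, 100.0, 5.0                       # s, s, Hz
    t = np.arange(0.0, T, dt)
    signal = 0.5 * np.sin(2 * np.pi * f * t)          # subthreshold: threshold is 1.0
    rng = np.random.default_rng(0)

    for sigma in (0.1, 0.3, 0.6, 1.0, 2.0):
        x = signal + sigma * rng.standard_normal(t.size)
        events = (x > 1.0).astype(float)              # threshold crossings as point events
        spec = np.abs(np.fft.rfft(events - events.mean())) ** 2
        freqs = np.fft.rfftfreq(t.size, dt)
        k = np.argmin(np.abs(freqs - f))              # spectral bin at the signal frequency
        snr = spec[k] / (np.median(spec) + 1e-12)     # crude SNR: line power over broadband floor
        print(f"noise sigma={sigma:<4} SNR≈{snr:.1f}")
    ```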

  28. Cultured neuronal networks as environmental biosensors.

    PubMed

    O'Shaughnessy, Thomas J; Gray, Samuel A; Pancrazio, Joseph J

    2004-01-01

    Contamination of water by toxins, either intentionally or unintentionally, is a growing concern for both military and civilian agencies and thus there is a need for systems capable of monitoring a wide range of natural and industrial toxicants. The EILATox-Oregon Workshop held in September 2002 provided an opportunity to test the capabilities of a prototype neuronal network-based biosensor with unknown contaminants in water samples. The biosensor is a portable device capable of recording the action potential activity from a network of mammalian neurons grown on glass microelectrode arrays. Changes in the action potential firing rate across the network are monitored to determine exposure to toxicants. A series of three neuronal networks derived from mice was used to test seven unknown samples. Two of these unknowns later were revealed to be blanks, to which the neuronal networks did not respond. Of the five remaining unknowns, a significant change in network activity was detected for four of the compounds at concentrations below a lethal level for humans: mercuric chloride, sodium arsenite, phosdrin and chlordimeform. These compounds (two heavy metals, an organophosphate and an insecticide) demonstrate the breadth of detection possible with neuronal networks. The results generated at the workshop show the promise of the neuronal network biosensor as an environmental detector but there is still considerable effort needed to produce a device suitable for routine environmental threat monitoring. PMID:15478174

  29. Oscillatorylike behavior in feedforward neuronal networks.

    PubMed

    Payeur, Alexandre; Maler, Leonard; Longtin, André

    2015-07-01

    We demonstrate how rhythmic activity can arise in neural networks from feedforward rather than recurrent circuitry and, in so doing, we provide a mechanism capable of explaining the temporal decorrelation of γ-band oscillations. We compare the spiking activity of a delayed recurrent network of inhibitory neurons with that of a feedforward network with the same neural properties and axonal delays. Paradoxically, these very different connectivities can yield very similar spike-train statistics in response to correlated input. This happens when neurons are noisy and axonal delays are short. A Taylor expansion of the feedback network's susceptibility (or frequency-dependent gain function) can then be stopped at first order to a good approximation, thus matching the feedforward net's susceptibility. The feedback network is known to display oscillations; these oscillations imply that the spiking activity of the population is felt by all neurons within the network, leading to direct spike correlations in a given neuron. On the other hand, in the output layer of the feedforward net, the interaction between the external drive and the delayed feedforward projection of this drive by the input layer causes indirect spike correlations: spikes fired by a given output layer neuron are correlated only through the activity of the input layer neurons. High noise and short delays partially bridge the gap between these two types of correlation, yielding similar spike-train statistics for both networks. This similarity is even stronger when the delay is distributed, as confirmed by linear response theory. PMID:26274199

  30. Oscillatorylike behavior in feedforward neuronal networks

    NASA Astrophysics Data System (ADS)

    Payeur, Alexandre; Maler, Leonard; Longtin, André

    2015-07-01

    We demonstrate how rhythmic activity can arise in neural networks from feedforward rather than recurrent circuitry and, in so doing, we provide a mechanism capable of explaining the temporal decorrelation of γ-band oscillations. We compare the spiking activity of a delayed recurrent network of inhibitory neurons with that of a feedforward network with the same neural properties and axonal delays. Paradoxically, these very different connectivities can yield very similar spike-train statistics in response to correlated input. This happens when neurons are noisy and axonal delays are short. A Taylor expansion of the feedback network's susceptibility—or frequency-dependent gain function—can then be stopped at first order to a good approximation, thus matching the feedforward net's susceptibility. The feedback network is known to display oscillations; these oscillations imply that the spiking activity of the population is felt by all neurons within the network, leading to direct spike correlations in a given neuron. On the other hand, in the output layer of the feedforward net, the interaction between the external drive and the delayed feedforward projection of this drive by the input layer causes indirect spike correlations: spikes fired by a given output layer neuron are correlated only through the activity of the input layer neurons. High noise and short delays partially bridge the gap between these two types of correlation, yielding similar spike-train statistics for both networks. This similarity is even stronger when the delay is distributed, as confirmed by linear response theory.

  31. Maximum hyperchaos in chaotic nonmonotonic neuronal networks

    NASA Astrophysics Data System (ADS)

    Shuai, J. W.; Chen, Z. X.; Liu, R. T.; Wu, B. X.

    1997-07-01

    Hyperchaos in chaotic nonmonotonic neuronal networks is discussed with computer simulations. Maximum chaos with all Lyapunov exponents positive is found not only in the present dissipative model with weak coupling connections between neurons, but also with some strong-coupling connections. Although the model presented is a noninvertible map, the information dimension of simple chaos still yields a good approximation to the Lyapunov dimension.

  32. Somatostatin-expressing neurons in cortical networks.

    PubMed

    Urban-Ciecko, Joanna; Barth, Alison L

    2016-07-01

    Somatostatin-expressing GABAergic neurons constitute a major class of inhibitory neurons in the mammalian cortex and are characterized by dense wiring into the local network and high basal firing activity that persists in the absence of synaptic input. This firing provides both GABA type A receptor (GABAAR)- and GABABR-mediated inhibition that operates at fast and slow timescales. The activity of somatostatin-expressing neurons is regulated by brain state, during learning and in rewarded behaviour. Here, we review recent advances in our understanding of how this class of cells can control network activity, with specific reference to how this is constrained by their anatomical and electrophysiological properties. PMID:27225074

  33. Associative memory in phasing neuron networks

    SciTech Connect

    Nair, Niketh S; Bochove, Erik J.; Braiman, Yehuda

    2014-01-01

    We studied pattern formation in a network of coupled Hindmarsh-Rose model neurons and introduced a new model for associative memory retrieval using networks of Kuramoto oscillators. Hindmarsh-Rose Neural Networks can exhibit a rich set of collective dynamics that can be controlled by their connectivity. Specifically, we showed an instance of Hebb's rule where spiking was correlated with network topology. Based on this, we presented a simple model of associative memory in coupled phase oscillators.
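
    A minimal version of the Kuramoto-based associative memory can be sketched as follows: couplings are built from binary patterns by a Hebbian rule, and a noisy cue relaxes onto the stored pattern. Sizes and constants are arbitrary choices, not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 64, 3
    patterns = rng.choice([-1.0, 1.0], size=(P, N))    # stored binary patterns
    K = patterns.T @ patterns / N                      # Hebbian coupling matrix

    # Cue pattern 0: phase 0 for +1 entries, pi for -1 entries, plus noise.
    theta = np.pi / 2 * (1 - patterns[0]) + 0.3 * rng.standard_normal(N)

    dt = 0.05
    for _ in range(400):                               # Kuramoto relaxation
        theta += dt * (K * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)

    overlap = abs(np.mean(patterns[0] * np.exp(1j * theta)))   # rotation-invariant retrieval measure
    print(f"overlap with the cued pattern: {overlap:.2f}")
    ```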

  34. Reducing Neuronal Networks to Discrete Dynamics

    PubMed Central

    Terman, David; Ahn, Sungwoo; Wang, Xueying; Just, Winfried

    2008-01-01

    We consider a general class of purely inhibitory and excitatory-inhibitory neuronal networks, with a general class of network architectures, and characterize the complex firing patterns that emerge. Our strategy for studying these networks is to first reduce them to a discrete model. In the discrete model, each neuron is represented as a finite number of states and there are rules for how a neuron transitions from one state to another. In this paper, we rigorously demonstrate that the continuous neuronal model can be reduced to the discrete model if the intrinsic and synaptic properties of the cells are chosen appropriately. In a companion paper [1], we analyze the discrete model. PMID:18443649

  35. Sloppiness in Spontaneously Active Neuronal Networks

    PubMed Central

    Panas, Dagmara; Amin, Hayder; Maccione, Alessandro; Muthmann, Oliver; van Rossum, Mark; Berdondini, Luca

    2015-01-01

    Various plasticity mechanisms, including experience-dependent, spontaneous, as well as homeostatic ones, continuously remodel neural circuits. Yet, despite fluctuations in the properties of single neurons and synapses, the behavior and function of neuronal assemblies are generally found to be very stable over time. This raises the important question of how plasticity is coordinated across the network. To address this, we investigated the stability of network activity in cultured rat hippocampal neurons recorded with high-density multielectrode arrays over several days. We used parametric models to characterize multineuron activity patterns and analyzed their sensitivity to changes. We found that the models exhibited sloppiness, a property where the model behavior is insensitive to changes in many parameter combinations, but very sensitive to a few. The activity of neurons with sloppy parameters showed faster and larger fluctuations than the activity of a small subset of neurons associated with sensitive parameters. Furthermore, parameter sensitivity was highly correlated with firing rates. Finally, we tested our observations from cell cultures on an in vivo recording from monkey visual cortex and we confirm that spontaneous cortical activity also shows hallmarks of sloppy behavior and firing rate dependence. Our findings suggest that a small subnetwork of highly active and stable neurons supports group stability, and that this endows neuronal networks with the flexibility to continuously remodel without compromising stability and function. PMID:26041916

  36. Structural Properties of the Caenorhabditis elegans Neuronal Network

    PubMed Central

    Varshney, Lav R.; Chen, Beth L.; Paniagua, Eric; Hall, David H.; Chklovskii, Dmitri B.

    2011-01-01

    Despite recent interest in reconstructing neuronal networks, complete wiring diagrams on the level of individual synapses remain scarce and the insights into function they can provide remain unclear. Even for Caenorhabditis elegans, whose neuronal network is relatively small and stereotypical from animal to animal, published wiring diagrams are neither accurate nor complete and self-consistent. Using materials from White et al. and new electron micrographs we assemble whole, self-consistent gap junction and chemical synapse networks of hermaphrodite C. elegans. We propose a method to visualize the wiring diagram, which reflects network signal flow. We calculate statistical and topological properties of the network, such as degree distributions, synaptic multiplicities, and small-world properties, that help in understanding network signal propagation. We identify neurons that may play central roles in information processing, and network motifs that could serve as functional modules of the network. We explore propagation of neuronal activity in response to sensory or artificial stimulation using linear systems theory and find several activity patterns that could serve as substrates of previously described behaviors. Finally, we analyze the interaction between the gap junction and the chemical synapse networks. Since several statistical properties of the C. elegans network, such as multiplicity and motif distributions are similar to those found in mammalian neocortex, they likely point to general principles of neuronal networks. The wiring diagram reported here can help in understanding the mechanistic basis of behavior by generating predictions about future experiments involving genetic perturbations, laser ablations, or monitoring propagation of neuronal activity in response to stimulation. PMID:21304930

  37. Synchrony and Control of Neuronal Networks.

    NASA Astrophysics Data System (ADS)

    Schiff, Steven

    2001-03-01

    Cooperative behavior in the brain stems from the nature and strength of the interactions between neurons within a networked ensemble. Normal network activity takes place in a state of partial synchrony between neurons, and some pathological behaviors, such as epilepsy and tremor, appear to share a common feature of increased interaction strength. We have focused on the parallel paths of both detecting and characterizing the nonlinear synchronization present within neuronal networks, and employing feedback control methodology using electrical fields to modulate that neuronal activity. From a theoretical perspective, we see evidence for nonlinear generalized synchrony in networks of neurons that linear techniques are incapable of detecting (PRE 54: 6708, 1996), and we have described a decoherence transition between asymmetric nonlinear systems that is experimentally observable (PRL 84: 1689, 2000). In addition, we have seen evidence for unstable dimension variability in real neuronal systems that indicates certain physical limits of modelability when observing such systems (PRL 85, 2490, 2000). From an experimental perspective, we have achieved success in modulating epileptic seizures in neuronal networks using electrical fields. Extracellular neuronal activity is continuously recorded during field application through differential extracellular recording techniques, and the applied electric field strength is continuously updated using a computer controlled proportional feedback algorithm. This approach appears capable of sustained amelioration of seizure events when used with negative feedback. In negative feedback mode, such findings may offer a novel technology for seizure control. In positive feedback mode, adaptively applied electric fields may offer a more physiological means for neural modulation for prosthetic purposes than previously possible (J. Neuroscience, 2001).

  38. Stability of Neuronal Networks with Homeostatic Regulation

    PubMed Central

    Harnack, Daniel; Pelko, Miha; Chaillet, Antoine; Chitour, Yacine; van Rossum, Mark C.W.

    2015-01-01

    Neurons are equipped with homeostatic mechanisms that counteract long-term perturbations of their average activity and thereby keep neurons in a healthy and information-rich operating regime. While homeostasis is believed to be crucial for neural function, a systematic analysis of homeostatic control has largely been lacking. The analysis presented here addresses the necessary conditions for stable homeostatic control. We consider networks of neurons with homeostasis and show that homeostatic control that is stable for single neurons can destabilize activity in otherwise stable recurrent networks, leading to strong non-abating oscillations in the activity. This instability can be prevented by slowing down the homeostatic control. The stronger the network recurrence, the slower the homeostasis has to be. Next, we consider how non-linearities in the neural activation function affect these constraints. Finally, we consider the case in which homeostatic feedback is mediated via a cascade of multiple intermediate stages. Counter-intuitively, the addition of extra stages in the homeostatic control loop further destabilizes activity in single neurons and networks. Our theoretical framework for homeostasis thus reveals previously unconsidered constraints on homeostasis in biological networks, and identifies conditions that require the slow time-constants of homeostatic regulation observed experimentally. PMID:26154297
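
    The stability constraint can be illustrated with a toy model: a single recurrent rate unit whose gain is adjusted homeostatically through one intermediate (low-pass) stage. With the assumed parameters below, fast homeostasis produces non-abating oscillations while slow homeostasis converges; none of the values are taken from the paper.

    ```python
    import numpy as np

    def simulate(tau_h, w=0.9, drive=0.5, target=1.0, tau=10.0, dt=0.1, T=20000.0):
        r, e, g = 1.0, 0.0, 0.7
        trace = []
        for _ in range(int(T / dt)):
            inp = min(max(g * (w * r + drive), 0.0), 20.0)   # saturating transfer function
            r += dt * (-r + inp) / tau                       # firing rate of the recurrent unit
            e += dt * ((target - r) - e) / tau_h             # intermediate (filtered) error stage
            g += dt * e / tau_h                              # homeostatic gain update
            trace.append(r)
        return np.array(trace)

    for tau_h in (20.0, 2000.0):                             # fast vs slow homeostasis
        tail = simulate(tau_h)
        tail = tail[tail.size // 2:]
        print(f"tau_h={tau_h:>6.0f}: rate excursion over the second half = {tail.max() - tail.min():.3f}")
    ```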

  39. Attractor dynamics in local neuronal networks

    PubMed Central

    Thivierge, Jean-Philippe; Comas, Rosa; Longtin, André

    2014-01-01

    Patterns of synaptic connectivity in various regions of the brain are characterized by the presence of synaptic motifs, defined as unidirectional and bidirectional synaptic contacts that follow a particular configuration and link together small groups of neurons. Recent computational work proposes that a relay network (two populations communicating via a third, relay population of neurons) can generate precise patterns of neural synchronization. Here, we employ two distinct models of neuronal dynamics and show that simulated neural circuits designed in this way are caught in a global attractor of activity that prevents neurons from modulating their response on the basis of incoming stimuli. To circumvent the emergence of a fixed global attractor, we propose a mechanism of selective gain inhibition that promotes flexible responses to external stimuli. We suggest that local neuronal circuits may employ this mechanism to generate precise patterns of neural synchronization whose transient nature delimits the occurrence of a brief stimulus. PMID:24688457

  20. Neural network with dynamically adaptable neurons

    NASA Technical Reports Server (NTRS)

    Tawel, Raoul (Inventor)

    1994-01-01

    This invention is an adaptive neuron for use in neural network processors. The adaptive neuron participates in the supervised learning phase of operation on a co-equal basis with the synapse matrix elements by adaptively changing its gain in a similar manner to the change of weights in the synapse IO elements. In this manner, training time is decreased by as much as three orders of magnitude.

  1. Self-excited relaxation oscillations in networks of impulse neurons

    NASA Astrophysics Data System (ADS)

    Glyzin, S. D.; Kolesov, A. Yu; Rozov, N. Kh

    2015-06-01

    This paper addresses the problem of mathematical modelling of neuron activity. New classes of singularly perturbed differential-difference equations with Volterra-type delay are proposed and used to describe how single neurons and also neural networks function with various kinds of connections (electrical or chemical). Special asymptotic methods are developed which make it possible to analyse questions of the existence and stability of relaxation periodic motions in such systems. Bibliography: 56 titles.

  2. Towards Reproducible Descriptions of Neuronal Network Models

    PubMed Central

    Nordlie, Eilen; Gewaltig, Marc-Oliver; Plesser, Hans Ekkehard

    2009-01-01

    Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come. We further hope that work on standardized ways of describing—and thinking about—complex neuronal networks will lead the scientific community to a clearer understanding of high-level concepts in network dynamics, and will thus lead to deeper insights into the function of the brain. PMID:19662159

  3. Inhibition Controls Asynchronous States of Neuronal Networks

    PubMed Central

    Treviño, Mario

    2016-01-01

    Computations in cortical circuits require action potentials from excitatory and inhibitory neurons. In this mini-review, I first provide a quick overview of findings that indicate that GABAergic neurons play a fundamental role in coordinating spikes and generating synchronized network activity. Next, I argue that these observations helped popularize the notion that network oscillations require a high degree of spike correlations among interneurons which, in turn, produce synchronous inhibition of the local microcircuit. The aim of this text is to discuss some recent experimental and computational findings that support a complementary view: one in which interneurons participate actively in producing asynchronous states in cortical networks. This requires a proper mixture of shared excitation and inhibition leading to asynchronous activity between neighboring cells. Such contribution from interneurons would be extremely important because it would tend to reduce the spike correlation between neighboring pyramidal cells, a drop in redundancy that could enhance the information-processing capacity of neural networks. PMID:27274721

  4. Neuronal network analyses: premises, promises and uncertainties

    PubMed Central

    Parker, David

    2010-01-01

    Neuronal networks assemble the cellular components needed for sensory, motor and cognitive functions. Any rational intervention in the nervous system will thus require an understanding of network function. Obtaining this understanding is widely considered to be one of the major tasks facing neuroscience today. Network analyses have been performed for some years in relatively simple systems. In addition to the direct insights these systems have provided, they also illustrate some of the difficulties of understanding network function. Nevertheless, in more complex systems (including human), claims are made that the cellular bases of behaviour are, or will shortly be, understood. While the discussion is necessarily limited, this issue will examine these claims and highlight some traditional and novel aspects of network analyses and their difficulties. This introduction discusses the criteria that need to be satisfied for network understanding, and how they relate to traditional and novel approaches being applied to addressing network function. PMID:20603354

  5. Complexities and uncertainties of neuronal network function

    PubMed Central

    Parker, David

    2005-01-01

    The nervous system generates behaviours through the activity in groups of neurons assembled into networks. Understanding these networks is thus essential to our understanding of nervous system function. Understanding a network requires information on its component cells, their interactions and their functional properties. Few networks come close to providing complete information on these aspects. However, even if complete information were available it would still only provide limited insight into network function. This is because the functional and structural properties of a network are not fixed but are plastic and can change over time. The number of interacting network components, their (variable) functional properties, and various plasticity mechanisms endows networks with considerable flexibility, but these features inevitably complicate network analyses. This review will initially discuss the general approaches and problems of network analyses. It will then examine the success of these analyses in a model spinal cord locomotor network in the lamprey, to determine to what extent in this relatively simple vertebrate system it is possible to claim detailed understanding of network function and plasticity. PMID:16553310

  6. Label-Free Characterization of Emerging Human Neuronal Networks

    NASA Astrophysics Data System (ADS)

    Mir, Mustafa; Kim, Taewoo; Majumder, Anirban; Xiang, Mike; Wang, Ru; Liu, S. Chris; Gillette, Martha U.; Stice, Steven; Popescu, Gabriel

    2014-03-01

    The emergent self-organization of a neuronal network in a developing nervous system is the result of a remarkably orchestrated process involving a multitude of chemical, mechanical and electrical signals. Little is known about the dynamic behavior of a developing network (especially in a human model), primarily due to a lack of practical and non-invasive methods to measure and quantify the process. Here we demonstrate that by using a novel optical interferometric technique, we can non-invasively measure several fundamental properties of neural networks from the sub-cellular to the cell population level. We applied this method to quantify network formation in human stem cell-derived neurons and show, for the first time, correlations between trends in the growth, transport, and spatial organization of such a system. Quantifying the fundamental behavior of such cell lines without compromising their viability may provide an important new tool in future longitudinal studies.

  7. Integrated microfluidic platforms for investigating neuronal networks

    NASA Astrophysics Data System (ADS)

    Kim, Hyung Joon

    (multielectrode array) or nanowire electrode array to study electrophysiology in neuronal networks. Also, "diode-like" microgrooves that control the number of neuronal processes are embedded in this platform. Chapter 6 concludes with possible future directions of this work. Interfacing micro/nanotechnology with primary neuron culture would open many doors in fundamental neuroscience research and in biomedical innovation.

  8. GENERAL: Complete and phase synchronization in a heterogeneous small-world neuronal network

    NASA Astrophysics Data System (ADS)

    Han, Fang; Lu, Qi-Shao; Wiercigroch, Marian; Ji, Quan-Bao

    2009-02-01

    Synchronous firing of neurons is thought to be important for information communication in neuronal networks. This paper investigates the complete and phase synchronization in a heterogeneous small-world chaotic Hindmarsh-Rose neuronal network. The effects of various network parameters on synchronization behaviour are discussed with some biological explanations. Complete synchronization of small-world neuronal networks is studied theoretically by the master stability function method. It is shown that the coupling strength necessary for complete or phase synchronization decreases as the neuron number, the node degree and the connection density are increased. The effect of heterogeneity of neuronal networks is also considered, and it is found that network heterogeneity has an adverse effect on synchrony.
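
    A rough numerical companion to this setting is sketched below: Hindmarsh-Rose neurons with standard parameter values, electrically (diffusively) coupled on a ring lattice with random shortcuts, and a simple voltage-dispersion measure of synchrony. The graph construction, coupling strength and heterogeneity are illustrative assumptions, not the parameters of the paper.

      # Electrically coupled Hindmarsh-Rose neurons on a small-world-like graph
      # (ring lattice plus random shortcuts). Illustrative parameters only.
      import numpy as np

      rng = np.random.default_rng(2)
      N, k, p_shortcut, g_el = 60, 2, 0.05, 0.5   # size, neighbours per side, shortcut prob., coupling

      # adjacency: ring lattice plus random shortcuts
      A = np.zeros((N, N))
      for i in range(N):
          for d in range(1, k + 1):
              A[i, (i + d) % N] = A[(i + d) % N, i] = 1
      shortcuts = rng.random((N, N)) < p_shortcut
      A = np.maximum(A, np.triu(shortcuts, 1) + np.triu(shortcuts, 1).T)
      np.fill_diagonal(A, 0)
      deg = A.sum(1)

      # standard Hindmarsh-Rose parameters; heterogeneous injected currents
      a, b, c, d_, s, xr, r = 1.0, 3.0, 1.0, 5.0, 4.0, -1.6, 0.006
      I_ext = 3.0 + 0.2 * rng.standard_normal(N)

      x, y, z = rng.uniform(-1, 1, N), np.zeros(N), np.zeros(N)
      dt, steps, err = 0.01, 100000, 0.0
      for t in range(steps):
          coupling = g_el * (A @ x - deg * x)          # diffusive (electrical) coupling
          dx = y - a * x**3 + b * x**2 - z + I_ext + coupling
          dy = c - d_ * x**2 - y
          dz = r * (s * (x - xr) - z)
          x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
          if t >= steps // 2:
              err += np.mean((x - x.mean())**2)

      print("time-averaged synchronization error:", err / (steps - steps // 2))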

  9. Micropatterning Facilitates the Long-Term Growth and Analysis of iPSC-Derived Individual Human Neurons and Neuronal Networks.

    PubMed

    Burbulla, Lena F; Beaumont, Kristin G; Mrksich, Milan; Krainc, Dimitri

    2016-08-01

    The discovery of induced pluripotent stem cells (iPSCs) and their application to patient-specific disease models offers new opportunities for studying the pathophysiology of neurological disorders. However, current methods for culturing iPSC-derived neuronal cells result in clustering of neurons, which precludes the analysis of individual neurons and defined neuronal networks. To address this challenge, cultures of human neurons on micropatterned surfaces are developed that promote neuronal survival over extended periods of time. This approach facilitates studies of neuronal development, cellular trafficking, and related mechanisms that require assessment of individual neurons and specific network connections. Importantly, micropatterns support the long-term stability of cultured neurons, which enables time-dependent analysis of cellular processes in living neurons. The approach described in this paper allows mechanistic studies of human neurons, both in terms of normal neuronal development and function, as well as time-dependent pathological processes, and provides a platform for testing of new therapeutics in neuropsychiatric disorders. PMID:27108930

  10. Collective Dynamics for Heterogeneous Networks of Theta Neurons

    NASA Astrophysics Data System (ADS)

    Luke, Tanushree

    Collective behavior in neural networks has often been used as an indicator of communication between different brain areas. These collective synchronization and desynchronization patterns are also considered an important feature in understanding normal and abnormal brain function. To understand the emergence of these collective patterns, I create an analytic model that identifies all such macroscopic steady-states attainable by a network of Type-I neurons. This network, whose basic unit is the model "theta" neuron, contains a mixture of excitable and spiking neurons coupled via a smooth pulse-like synapse. Applying the Ott-Antonsen reduction method in the thermodynamic limit, I obtain a low-dimensional evolution equation that describes the asymptotic dynamics of the macroscopic mean field of the network. This model can be used as a basis for understanding more complicated neuronal networks when additional dynamical features are included. From this reduced dynamical equation for the mean field, I show that the network exhibits three collective attracting steady-states. The first two are equilibrium states that both reflect partial synchronization in the network, whereas the third is a limit cycle in which the degree of network synchronization oscillates in time. In addition to a comprehensive identification of all possible attracting macro-states, this analytic model permits a complete bifurcation analysis of the collective behavior of the network with respect to three key network features: the degree of excitability of the neurons, the heterogeneity of the population, and the overall coupling strength. The network typically tends towards the two macroscopic equilibrium states when the neuron's intrinsic dynamics and the network interactions reinforce each other. In contrast, the limit cycle state, bifurcations, and multistability tend to occur when there is competition between these network features. I also outline here an extension of the above model where the
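
    For reference, the canonical Type-I ("theta") neuron and the macroscopic mean field whose asymptotic dynamics the Ott-Antonsen ansatz reduces are commonly written as follows; the exact pulse-like synaptic drive used in the dissertation is not reproduced here, and I_syn stands in for it.

      % Canonical Type-I ("theta") neuron with excitability parameter \eta_j
      % and synaptic input I_{\mathrm{syn}}(t):
      \frac{d\theta_j}{dt} = \bigl(1 - \cos\theta_j\bigr)
                           + \bigl(1 + \cos\theta_j\bigr)\bigl[\eta_j + I_{\mathrm{syn}}(t)\bigr],
      \qquad j = 1, \dots, N .

      % Macroscopic mean field (Kuramoto order parameter):
      z(t) = \frac{1}{N} \sum_{j=1}^{N} e^{\,i\theta_j(t)},
      \qquad |z| \approx 1 \ \text{(synchrony)}, \quad |z| \approx 0 \ \text{(asynchrony)}.

    For a Lorentzian distribution of the excitability parameters, the Ott-Antonsen ansatz closes the evolution of z(t) into a single complex ordinary differential equation in the thermodynamic limit, which is the kind of low-dimensional mean-field equation referred to above.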

  11. Spike Code Flow in Cultured Neuronal Networks.

    PubMed

    Tamura, Shinichi; Nishitani, Yoshi; Hosokawa, Chie; Miyoshi, Tomomitsu; Sawai, Hajime; Kamimura, Takuya; Yagi, Yasushi; Mizuno-Matsumoto, Yuko; Chen, Yen-Wei

    2016-01-01

    We observed spike trains produced by one-shot electrical stimulation with 8 × 8 multielectrodes in cultured neuronal networks. Each electrode accepted spikes from several neurons. We extracted the short codes from spike trains and obtained a code spectrum with a nominal time accuracy of 1%. We then constructed code flow maps as movies of the electrode array to observe the code flow of "1101" and "1011," which are typical pseudorandom sequences such as those we often encountered in the literature and in our experiments. They seemed to flow from one electrode to the neighboring one and maintained their shape to some extent. To quantify the flow, we calculated the "maximum cross-correlations" among neighboring electrodes, to find the direction of maximum flow of the codes with lengths less than 8. Normalized maximum cross-correlations were almost constant irrespective of code. Furthermore, if the spike trains were shuffled in interval order or across electrodes, the cross-correlations became significantly smaller. Thus, the analysis suggested that local codes of approximately constant shape propagated and conveyed information across the network. Hence, the codes can serve as visible and trackable marks of propagating spike waves as well as a means of evaluating information flow in the neuronal network. PMID:27217825
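
    The sketch below illustrates the kind of lagged cross-correlation that can estimate the direction of code flow between two neighbouring electrodes. The binning, the code-occurrence series, the normalization and the toy data are simplified assumptions, not the authors' exact pipeline.

      # Lag of maximum cross-correlation between occurrence series of a short
      # code (e.g. "1101") at two neighbouring electrodes. Toy data.
      import numpy as np

      rng = np.random.default_rng(3)

      def code_occurrences(spike_bins, code="1101"):
          """Binary series marking where the code appears in a binned spike train."""
          s = "".join(str(int(b)) for b in spike_bins)
          occ = np.zeros(len(spike_bins))
          start = s.find(code)
          while start != -1:
              occ[start] = 1
              start = s.find(code, start + 1)
          return occ

      # toy data: electrode B sees roughly the activity of electrode A, delayed by 5 bins
      T, delay = 5000, 5
      a_bins = rng.random(T) < 0.2
      b_bins = np.roll(a_bins, delay) & (rng.random(T) < 0.9)

      occ_a, occ_b = code_occurrences(a_bins), code_occurrences(b_bins)

      max_lag = 20
      lags = np.arange(-max_lag, max_lag + 1)
      xc = [np.corrcoef(occ_a[max_lag:-max_lag],
                        np.roll(occ_b, -lag)[max_lag:-max_lag])[0, 1] for lag in lags]
      best = lags[int(np.argmax(xc))]
      print("lag of maximum cross-correlation (bins):", best)   # positive: A leads B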

  13. Recording axonal conduction to evaluate the integration of pluripotent cell-derived neurons into a neuronal network.

    PubMed

    Shimba, Kenta; Sakai, Koji; Takayama, Yuzo; Kotani, Kiyoshi; Jimbo, Yasuhiko

    2015-10-01

    Stem cell transplantation is a promising therapy to treat neurodegenerative disorders, and a number of in vitro models have been developed for studying interactions between grafted neurons and the host neuronal network to promote drug discovery. However, methods capable of evaluating the process by which stem cells integrate into the host neuronal network are lacking. In this study, we applied an axonal conduction-based analysis to a co-culture study of primary and differentiated neurons. Mouse cortical neurons and neuronal cells differentiated from P19 embryonal carcinoma cells, a model for early neural differentiation of pluripotent stem cells, were co-cultured in a microfabricated device. The somata of these cells were separated by the co-culture device, but their axons were able to elongate through microtunnels and then form synaptic contacts. Propagating action potentials were recorded from these axons by microelectrodes embedded at the bottom of the microtunnels and sorted into clusters representing individual axons. While the number of axons of cortical neurons increased until 14 days in vitro and then decreased, those of P19 neurons increased throughout the culture period. Network burst analysis showed that P19 neurons participated in approximately 80% of the bursting activity after 14 days in vitro. Interestingly, the axonal conduction delay of P19 neurons was significantly greater than that of cortical neurons, suggesting that there are some physiological differences in their axons. These results suggest that our method is feasible to evaluate the process by which stem cell-derived neurons integrate into a host neuronal network. PMID:26303583

  14. [Inhibitory interactions in neuronal networks including cells of the auditory cortex and the medial geniculate body].

    PubMed

    Sil'kis, I G

    1994-01-01

    The cross-correlation method was used to reveal effective inhibitory interactions in neural networks containing simultaneously recorded neurons from different loci of the auditory cortex (A1) and the medial geniculate body (MGB). It was shown that (i) inhibitory connections were "divergent", i.e., one neuron in A1 (MGB) depressed the activity of neurons in different loci of A1 and MGB simultaneously; (ii) inputs to inhibitory neurons were "convergent", i.e., one neuron in A1 (MGB) was excited by neurons from different loci of A1 and MGB simultaneously. There were also inhibitory neurons which selectively depressed the activity of only one neighbouring neuron. The results suggest that the same inhibitory neuron may be involved in both afferent and feedback inhibition. We propose that these principles of organization of inhibitory connections in thalamo-cortical networks underlie the observed exceptions to the tonotopic mapping principle of organization of the receptive fields of A1 and MGB. PMID:7879428
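
    A generic spike-train cross-correlogram of the kind used in such analyses is sketched below; an effective inhibitory connection typically appears as a short-latency trough after zero lag. The toy spike times, bin width and imposed inhibitory effect are illustrative assumptions.

      # Cross-correlogram: counts of target spikes at each lag relative to
      # reference spikes. A trough just after zero lag suggests inhibition.
      import numpy as np

      def cross_correlogram(ref, tgt, max_lag=0.05, bin_width=0.001):
          edges = np.arange(-max_lag, max_lag + bin_width, bin_width)
          counts = np.zeros(len(edges) - 1)
          for t in ref:
              d = tgt - t
              d = d[(d >= -max_lag) & (d < max_lag)]
              counts += np.histogram(d, bins=edges)[0]
          return edges[:-1] + bin_width / 2, counts

      rng = np.random.default_rng(4)
      ref = np.sort(rng.uniform(0, 100, 2000))   # reference spike times (s)
      tgt = np.sort(rng.uniform(0, 100, 2000))   # target spike times (s)
      # impose a toy inhibitory effect: remove target spikes 1-5 ms after each reference spike
      mask = np.ones(len(tgt), bool)
      for t in ref:
          mask &= ~((tgt > t + 0.001) & (tgt < t + 0.005))
      lags, counts = cross_correlogram(ref, tgt[mask])
      print("counts near +2 ms vs. mean count per bin:",
            counts[np.abs(lags - 0.002).argmin()], counts.mean())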

  15. Transition to Chaos in Random Neuronal Networks

    NASA Astrophysics Data System (ADS)

    Kadmon, Jonathan; Sompolinsky, Haim

    2015-10-01

    Firing patterns in the central nervous system often exhibit strong temporal irregularity and considerable heterogeneity in time-averaged response properties. Previous studies suggested that these properties are the outcome of the intrinsic chaotic dynamics of the neural circuits. Indeed, simplified rate-based neuronal networks with synaptic connections drawn from Gaussian distribution and sigmoidal nonlinearity are known to exhibit chaotic dynamics when the synaptic gain (i.e., connection variance) is sufficiently large. In the limit of an infinitely large network, there is a sharp transition from a fixed point to chaos, as the synaptic gain reaches a critical value. Near the onset, chaotic fluctuations are slow, analogous to the ubiquitous, slow irregular fluctuations observed in the firing rates of many cortical circuits. However, the existence of a transition from a fixed point to chaos in neuronal circuit models with more realistic architectures and firing dynamics has not been established. In this work, we investigate rate-based dynamics of neuronal circuits composed of several subpopulations with randomly diluted connections. Nonzero connections are either positive for excitatory neurons or negative for inhibitory ones, while single neuron output is strictly positive with output rates rising as a power law above threshold, in line with known constraints in many biological systems. Using dynamic mean field theory, we find the phase diagram depicting the regimes of stable fixed-point, unstable-dynamic, and chaotic-rate fluctuations. We focus on the latter and characterize the properties of systems near this transition. We show that dilute excitatory-inhibitory architectures exhibit the same onset to chaos as the single population with Gaussian connectivity. In these architectures, the large mean excitatory and inhibitory inputs dynamically balance each other, amplifying the effect of the residual fluctuations. Importantly, the existence of a transition to chaos
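
    The classical starting point for such analyses, a rate network with Gaussian random couplings whose fixed point loses stability at unit synaptic gain, can be simulated in a few lines. The sketch below merely contrasts the sub- and supercritical regimes with illustrative parameters; it is not the dynamic mean-field calculation itself.

      # Random rate network dx_i/dt = -x_i + sum_j J_ij tanh(x_j), with
      # J_ij ~ N(0, g^2/N). In the large-N limit the fixed point x = 0 loses
      # stability at g = 1 and chaotic fluctuations appear. Small-N illustration.
      import numpy as np

      def run(g, n=300, T=200.0, dt=0.05, seed=5):
          rng = np.random.default_rng(seed)
          J = rng.normal(0, g / np.sqrt(n), (n, n))
          x = 0.1 * rng.standard_normal(n)
          steps = int(T / dt)
          amp = []
          for t in range(steps):
              x += dt * (-x + J @ np.tanh(x))
              if t > steps // 2:
                  amp.append(np.std(x))
          return np.mean(amp)

      for g in (0.5, 1.5):
          print(f"g = {g}: time-averaged activity spread = {run(g):.3f}")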

  16. Resynchronization in neuronal network divided by femtosecond laser processing.

    PubMed

    Hosokawa, Chie; Kudoh, Suguru N; Kiyohara, Ai; Taguchi, Takahisa

    2008-05-01

    We demonstrated scission of a living neuronal network on multielectrode arrays (MEAs) using a focused femtosecond laser and evaluated the resynchronization of spontaneous electrical activity within the network. By irradiating hippocampal neurons cultured on a multielectrode array dish with the focused femtosecond laser, neurites were cut at the focal point. After the irradiation, synchronization of neuronal activity within the network decreased drastically across the divided area, indicating diminished functional connections between neurons. Cross-correlation analysis revealed that spontaneous activity between the divided areas gradually resynchronized within 10 days. These findings indicate that hippocampal neurons have the potential to regenerate functional connections and to reconstruct a network by self-assembly. PMID:18418255

  17. Coping with variability in small neuronal networks.

    PubMed

    Calabrese, Ronald L; Norris, Brian J; Wenning, Angela; Wright, Terrence M

    2011-12-01

    Experimental and corresponding modeling studies indicate that there is a 2- to 5-fold variation of intrinsic and synaptic parameters across animals while functional output is maintained. Here, we review experiments, using the heartbeat central pattern generator (CPG) in medicinal leeches, which explore the consequences of animal-to-animal variation in synaptic strength for coordinated motor output. We focus on a set of segmental heart motor neurons that all receive inhibitory synaptic input from the same four premotor interneurons. These four premotor inputs fire in a phase progression and the motor neurons also fire in a phase progression because of differences in synaptic strength profiles of the four inputs among segments. Our work tested the hypothesis that functional output is maintained in the face of animal-to-animal variation in the absolute strength of connections because relative strengths of the four inputs onto particular motor neurons is maintained across animals. Our experiments showed that relative strength is not strictly maintained across animals even as functional output is maintained, and animal-to-animal variations in strength of particular inputs do not correlate strongly with output phase. Further experiments measured the precise temporal pattern of the premotor inputs, the segmental synaptic strength profiles of their connections onto motor neurons, and the temporal pattern (phase progression) of those motor neurons all in the same animal for a series of 12 animals. The analysis of input and output in this sample of 12 individuals suggests that the number (four) of inputs to each motor neuron and the variability of the temporal pattern of input from the CPG across individuals weaken the influence of the strength of individual inputs. Moreover, the temporal pattern of the output varies as much across individuals as that of the input. Essentially, each animal arrives at a unique solution for how the network produces functional output. PMID

  18. Energy coding in neural network with inhibitory neurons.

    PubMed

    Wang, Ziyin; Wang, Rubin; Fang, Ruiyan

    2015-04-01

    This paper aimed at assessing and comparing the effects of inhibitory neurons on the neural energy distribution and the network activity, relative to networks without inhibitory neurons, in order to understand the nature of neural energy distribution and neural energy coding. Under stimulation, synchronous oscillation differs significantly between neural networks with and without inhibitory neurons, and this difference can be quantitatively evaluated by the characteristic energy distribution. In addition, the difference in synchronous oscillation of the neural activity can be quantitatively described by the change in the energy distribution when the network parameters are gradually adjusted. Compared with the traditional method of correlation coefficient analysis, quantitative indicators based on the characteristics of the nervous energy distribution are more effective in reflecting the dynamic features of neural network activities. Meanwhile, this coding method, which takes a global perspective on neural activity, avoids the current defects of neural encoding and decoding theory and the enormous difficulties they entail. Our studies have shown that neural energy coding is a new coding theory with high efficiency and great potential. PMID:25806094

  19. Hierarchical networks, power laws, and neuronal avalanches

    NASA Astrophysics Data System (ADS)

    Friedman, Eric J.; Landsberg, Adam S.

    2013-03-01

    We show that in networks with a hierarchical architecture, critical dynamical behaviors can emerge even when the underlying dynamical processes are not critical. This finding provides explicit insight into current studies of the brain's neuronal network showing power-law avalanches in neural recordings, and provides a theoretical justification of recent numerical findings. Our analysis shows how the hierarchical organization of a network can itself lead to power-law distributions of avalanche sizes and durations, scaling laws between anomalous exponents, and universal functions—even in the absence of self-organized criticality or critical points. This hierarchy-induced phenomenon is independent of, though can potentially operate in conjunction with, standard dynamical mechanisms for generating power laws.

  20. Irregular behavior in an excitatory-inhibitory neuronal network

    NASA Astrophysics Data System (ADS)

    Park, Choongseok; Terman, David

    2010-06-01

    Excitatory-inhibitory networks arise in many regions throughout the central nervous system and display complex spatiotemporal firing patterns. These neuronal activity patterns (of individual neurons and/or the whole network) are closely related to the functional status of the system and differ between normal and pathological states. For example, neurons within the basal ganglia, a group of subcortical nuclei that are responsible for the generation of movement, display a variety of dynamic behaviors such as correlated oscillatory activity and irregular, uncorrelated spiking. Neither the origins of these firing patterns nor the mechanisms that underlie the patterns are well understood. We consider a biophysical model of an excitatory-inhibitory network in the basal ganglia and explore how specific biophysical properties of the network contribute to the generation of irregular spiking. We use geometric dynamical systems and singular perturbation methods to systematically reduce the model to a simpler set of equations, which is suitable for analysis. The results specify the dependence on the strengths of synaptic connections and the intrinsic firing properties of the cells in the irregular regime when applied to the subthalamopallidal network of the basal ganglia.

  1. Inferring Network Dynamics and Neuron Properties from Population Recordings

    PubMed Central

    Linaro, Daniele; Storace, Marco; Mattia, Maurizio

    2011-01-01

    Understanding the computational capabilities of the nervous system means to “identify” its emergent multiscale dynamics. For this purpose, we propose a novel model-driven identification procedure and apply it to sparsely connected populations of excitatory integrate-and-fire neurons with spike frequency adaptation (SFA). Our method does not characterize the system from its microscopic elements in a bottom-up fashion, and does not resort to any linearization. We investigate networks as a whole, inferring their properties from the response dynamics of the instantaneous discharge rate to brief and aspecific supra-threshold stimulations. While several available methods assume generic expressions for the system as a black box, we adopt a mean-field theory for the evolution of the network transparently parameterized by identified elements (such as dynamic timescales), which are in turn non-trivially related to single-neuron properties. In particular, from the elicited transient responses, the input–output gain function of the neurons in the network is extracted and direct links to the microscopic level are made available: indeed, we show how to extract the decay time constant of the SFA, the absolute refractory period and the average synaptic efficacy. In addition and contrary to previous attempts, our method captures the system dynamics across bifurcations separating qualitatively different dynamical regimes. The robustness and the generality of the methodology is tested on controlled simulations, reporting a good agreement between theoretically expected and identified values. The assumptions behind the underlying theoretical framework make the method readily applicable to biological preparations like cultured neuron networks and in vitro brain slices. PMID:22016731

  2. Critical behavior in networks of real neurons

    NASA Astrophysics Data System (ADS)

    Tkacik, Gasper

    2014-03-01

    The patterns of joint activity in a population of retinal ganglion cells encode the complete information about the visual world, and thus place limits on what could be learned about the environment by the brain. We analyze the recorded simultaneous activity of more than a hundred such neurons from an interacting population responding to naturalistic stimuli, at the single spike level, by constructing accurate maximum entropy models for the distribution of network activity states. This construction, essentially an "inverse spin glass", reveals strong frustration in the pairwise couplings between the neurons that results in a rugged energy landscape with many local extrema; strong collective interactions in subgroups of neurons despite weak individual pairwise correlations; and a joint distribution of activity that has an extremely wide dynamic range characterized by a Zipf-like power law, strong deviations from "typicality," and a number of signatures of critical behavior. We hypothesize that this tuning to a critical operating point might be a dynamic property of the system and suggest experiments to test this hypothesis.
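
    A minimal version of the pairwise maximum entropy ("inverse Ising") construction is sketched below: fields and couplings are adjusted by gradient ascent so that the model means and pairwise correlations match the data. It relies on exact enumeration and toy data, so it is feasible only for small populations and is not the large-scale inference used for the retinal recordings.

      # Pairwise maximum-entropy (Ising) fit by exact gradient ascent: adjust
      # fields h and couplings J so that model moments <s_i>, <s_i s_j> match
      # the data. Exact enumeration over 2^n states, so small n only.
      import itertools
      import numpy as np

      rng = np.random.default_rng(6)
      n, T = 8, 5000
      data = (rng.random((T, n)) < 0.3).astype(float) * 2 - 1    # toy spins in {-1, +1}

      states = np.array(list(itertools.product([-1, 1], repeat=n)), float)
      emp_m = data.mean(0)
      emp_C = data.T @ data / T

      h, J, lr = np.zeros(n), np.zeros((n, n)), 0.1
      for _ in range(2000):
          E = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
          p = np.exp(E - E.max())
          p /= p.sum()
          mod_m = p @ states
          mod_C = states.T @ (states * p[:, None])
          h += lr * (emp_m - mod_m)            # match means
          dJ = lr * (emp_C - mod_C)
          np.fill_diagonal(dJ, 0)
          J += dJ                              # match pairwise correlations

      print("largest remaining moment mismatch:",
            max(np.abs(emp_m - mod_m).max(), np.abs(emp_C - mod_C).max()))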

  3. Serotonin modulation of cortical neurons and networks

    PubMed Central

    Celada, Pau; Puig, M. Victoria; Artigas, Francesc

    2013-01-01

    The serotonergic pathways originating in the dorsal and median raphe nuclei (DR and MnR, respectively) are critically involved in cortical function. Serotonin (5-HT), acting on postsynaptic and presynaptic receptors, is involved in cognition, mood, impulse control and motor functions by (1) modulating the activity of different neuronal types, and (2) varying the release of other neurotransmitters, such as glutamate, GABA, acetylcholine and dopamine. Also, 5-HT seems to play an important role in cortical development. Of all cortical regions, the frontal lobe is the area most enriched in serotonergic axons and 5-HT receptors. 5-HT and selective receptor agonists modulate the excitability of cortical neurons and their discharge rate through the activation of several receptor subtypes, of which the 5-HT1A, 5-HT1B, 5-HT2A, and 5-HT3 subtypes play a major role. Little is known, however, on the role of other excitatory receptors moderately expressed in cortical areas, such as 5-HT2C, 5-HT4, 5-HT6, and 5-HT7. In vitro and in vivo studies suggest that 5-HT1A and 5-HT2A receptors are key players and exert opposite effects on the activity of pyramidal neurons in the medial prefrontal cortex (mPFC). The activation of 5-HT1A receptors in mPFC hyperpolarizes pyramidal neurons whereas that of 5-HT2A receptors results in neuronal depolarization, reduction of the afterhyperpolarization and increase of excitatory postsynaptic currents (EPSCs) and of discharge rate. 5-HT can also stimulate excitatory (5-HT2A and 5-HT3) and inhibitory (5-HT1A) receptors in GABA interneurons to modulate synaptic GABA inputs onto pyramidal neurons. Likewise, the pharmacological manipulation of various 5-HT receptors alters oscillatory activity in PFC, suggesting that 5-HT is also involved in the control of cortical network activity. A better understanding of the actions of 5-HT in PFC may help to develop treatments for mood and cognitive disorders associated with an abnormal function of the frontal lobe

  4. Integrated workflows for spiking neuronal network simulations

    PubMed Central

    Antolík, Ján; Davison, Andrew P.

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. PMID

  5. Identification of neuronal network properties from the spectral analysis of calcium imaging signals in neuronal cultures

    PubMed Central

    Tibau, Elisenda; Valencia, Miguel; Soriano, Jordi

    2013-01-01

    Neuronal networks in vitro are prominent systems to study the development of connections in living neuronal networks and the interplay between connectivity, activity and function. These cultured networks show a rich spontaneous activity that evolves concurrently with the connectivity of the underlying network. In this work we monitor the development of neuronal cultures, and record their activity using calcium fluorescence imaging. We use spectral analysis to characterize global dynamical and structural traits of the neuronal cultures. We first observe that the power spectrum can be used as a signature of the state of the network, for instance when inhibition is active or silent, as well as a measure of the network's connectivity strength. Second, the power spectrum identifies prominent developmental changes in the network, such as the GABAA switch. Third, the analysis of the spatial distribution of the spectral density, in experiments with a controlled disintegration of the network through CNQX, an AMPA-glutamate receptor antagonist in excitatory neurons, reveals the existence of communities of strongly connected, highly active neurons that display synchronous oscillations. Our work illustrates the value of spectral analysis for the study of in vitro networks, and its potential use as a network-state indicator, for instance to compare healthy and diseased neuronal networks. PMID:24385953
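
    The basic spectral computation is straightforward; the sketch below estimates the power spectral density of a single fluorescence trace with Welch's method. The synthetic trace, sampling rate and window length are placeholders.

      # Power spectral density of a calcium-fluorescence trace with Welch's
      # method, the kind of spectrum used above as a network-state signature.
      import numpy as np
      from scipy.signal import welch

      fs = 20.0                                  # imaging rate (frames per second)
      t = np.arange(0, 600, 1 / fs)              # ten minutes of recording
      rng = np.random.default_rng(7)
      # toy trace: slow periodic bursts riding on noise
      trace = np.sin(2 * np.pi * 0.1 * t) ** 8 + 0.3 * rng.standard_normal(t.size)

      f, pxx = welch(trace, fs=fs, nperseg=1024)
      peak = f[np.argmax(pxx[1:]) + 1]           # ignore the DC bin
      print(f"dominant frequency: {peak:.3f} Hz")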

  6. Communication through resonance in spiking neuronal networks.

    PubMed

    Hahn, Gerald; Bujan, Alejandro F; Frégnac, Yves; Aertsen, Ad; Kumar, Arvind

    2014-08-01

    The cortex processes stimuli through a distributed network of specialized brain areas. This processing requires mechanisms that can route neuronal activity across weakly connected cortical regions. Routing models proposed thus far are either limited to propagation of spiking activity across strongly connected networks or require distinct mechanisms that create local oscillations and establish their coherence between distant cortical areas. Here, we propose a novel mechanism which explains how synchronous spiking activity propagates across weakly connected brain areas supported by oscillations. In our model, oscillatory activity unleashes network resonance that amplifies feeble synchronous signals and promotes their propagation along weak connections ("communication through resonance"). The emergence of coherent oscillations is a natural consequence of synchronous activity propagation and therefore the assumption of different mechanisms that create oscillations and provide coherence is not necessary. Moreover, the phase-locking of oscillations is a side effect of communication rather than its requirement. Finally, we show how the state of ongoing activity could affect the communication through resonance and propose that modulations of the ongoing activity state could influence information processing in distributed cortical networks. PMID:25165853

  7. Effect of Transcranial Magnetic Stimulation on Neuronal Networks

    NASA Astrophysics Data System (ADS)

    Unsal, Ahmet; Hadimani, Ravi; Jiles, David

    2013-03-01

    The human brain contains around 100 billion nerve cells controlling our day-to-day activities. Consequently, brain disorders often result in impairments such as paralysis, loss of coordination and seizures. It has been said that 1 in 5 Americans suffer from some diagnosable mental disorder. There is an urgent need to understand these disorders, prevent them and, if possible, develop permanent cures for them. As a result, a significant amount of research activity is being directed towards brain research. Transcranial Magnetic Stimulation (TMS) is a promising tool for diagnosing and treating brain disorders. It is a non-invasive treatment method that produces a current flow in the brain which excites the neurons. Even though TMS has been verified to have advantageous effects on various brain-related disorders, there have not been enough studies on the impact of TMS on cells. In this study, we are investigating the electrophysiological effects of TMS on one-dimensional neuronal cultures grown in a circular pathway. Electrical currents are induced in the neuronal networks depending on the directionality of the applied field. This aids in understanding how neuronal networks react under TMS treatment.

  8. Synchronization in neuronal oscillator networks with input heterogeneity and arbitrary network structure

    NASA Astrophysics Data System (ADS)

    Davison, Elizabeth; Dey, Biswadip; Leonard, Naomi

    Mathematical studies of synchronization in networks of neuronal oscillators offer insight into neuronal ensemble behavior in the brain. Systematic means to understand how network structure and external input affect synchronization in network models have the potential to improve methods for treating synchronization-related neurological disorders such as epilepsy and Parkinson's disease. To elucidate the complex relationships between network structure, external input, and synchronization, we investigate synchronous firing patterns in arbitrary networks of neuronal oscillators coupled through gap junctions with heterogeneous external inputs. We first apply a passivity-based Lyapunov analysis to undirected networks of homogeneous FitzHugh-Nagumo (FN) oscillators with homogeneous inputs and derive a sufficient condition on coupling strength that guarantees complete synchronization. In biologically relevant regimes, we employ Gronwall's inequality to obtain a bound tighter than those previously reported. We extend both analyses to a homogeneous FN network with heterogeneous inputs and show how cluster synchronization emerges under conditions on the symmetry of the coupling matrix and external inputs. Our results can be generalized to any network of semi-passive oscillators.
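
    A minimal numerical companion to this analysis is sketched below: FitzHugh-Nagumo oscillators with homogeneous inputs, coupled diffusively (gap-junction-like) through a graph Laplacian, where sufficiently strong coupling yields complete synchronization. The random graph, parameter values and synchrony measure are illustrative choices, not the networks or bounds analysed in the abstract.

      # FitzHugh-Nagumo oscillators with diffusive (gap-junction) coupling on an
      # undirected random graph. The dispersion of the voltages around their
      # instantaneous mean serves as a simple synchrony measure.
      import numpy as np

      def simulate(gamma, n=20, T=400.0, dt=0.01, seed=8):
          rng = np.random.default_rng(seed)
          A = (rng.random((n, n)) < 0.3).astype(float)
          A = np.triu(A, 1); A = A + A.T                   # undirected graph
          L = np.diag(A.sum(1)) - A                        # graph Laplacian
          v, w = rng.uniform(-1, 1, n), np.zeros(n)
          I, eps, a, b = 0.5, 0.08, 0.7, 0.8               # homogeneous FN parameters
          dev, steps = 0.0, int(T / dt)
          for t in range(steps):
              dv = v - v**3 / 3 - w + I - gamma * (L @ v)  # coupling through voltages
              dw = eps * (v + a - b * w)
              v, w = v + dt * dv, w + dt * dw
              if t >= steps // 2:
                  dev += np.mean((v - v.mean())**2)
          return dev / (steps - steps // 2)

      for gamma in (0.0, 0.5):
          print(f"coupling {gamma}: mean voltage dispersion = {simulate(gamma):.4f}")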

  9. Method Accelerates Training Of Some Neural Networks

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.

    1992-01-01

    Three-layer networks train faster provided two conditions are satisfied: the numbers of neurons in the layers are such that the majority of the work is done in the synaptic connections between the input and hidden layers, and the number of neurons in the input layer is at least as great as the number of training pairs of input and output vectors. The approach is based on a modified version of the back-propagation method.
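
    The configuration described, with most synapses between the input and hidden layers and an input layer at least as wide as the training set, is illustrated by the plain back-propagation sketch below; the modified algorithm itself is not reproduced.

      # Three-layer back-propagation setup of the kind described: the input
      # layer is at least as wide as the number of training pairs and most
      # weights lie between the input and hidden layers. Plain gradient descent.
      import numpy as np

      rng = np.random.default_rng(9)
      n_pairs, n_in, n_hidden, n_out = 16, 16, 8, 2     # n_in >= n_pairs
      X = rng.standard_normal((n_pairs, n_in))
      Y = rng.standard_normal((n_pairs, n_out))

      W1 = 0.1 * rng.standard_normal((n_in, n_hidden))  # bulk of the weights
      W2 = 0.1 * rng.standard_normal((n_hidden, n_out))
      lr = 0.05

      for epoch in range(2000):
          H = np.tanh(X @ W1)                # hidden layer
          out = H @ W2                       # linear output layer
          err = out - Y
          grad_W2 = H.T @ err / n_pairs
          grad_H = err @ W2.T * (1 - H**2)   # back-propagate through tanh
          grad_W1 = X.T @ grad_H / n_pairs
          W2 -= lr * grad_W2
          W1 -= lr * grad_W1

      print("final mean squared error:",
            float(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2)))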

  10. Multiplex Networks of Cortical and Hippocampal Neurons Revealed at Different Timescales

    PubMed Central

    Timme, Nicholas; Ito, Shinya; Myroshnychenko, Maxym; Yeh, Fang-Chin; Hiolski, Emma; Hottowy, Pawel; Beggs, John M.

    2014-01-01

    Recent studies have emphasized the importance of multiplex networks – interdependent networks with shared nodes and different types of connections – in systems primarily outside of neuroscience. Though the multiplex properties of networks are frequently not considered, most networks are actually multiplex networks and the multiplex specific features of networks can greatly affect network behavior (e.g. fault tolerance). Thus, the study of networks of neurons could potentially be greatly enhanced using a multiplex perspective. Given the wide range of temporally dependent rhythms and phenomena present in neural systems, we chose to examine multiplex networks of individual neurons with time scale dependent connections. To study these networks, we used transfer entropy – an information theoretic quantity that can be used to measure linear and nonlinear interactions – to systematically measure the connectivity between individual neurons at different time scales in cortical and hippocampal slice cultures. We recorded the spiking activity of almost 12,000 neurons across 60 tissue samples using a 512-electrode array with 60 micrometer inter-electrode spacing and 50 microsecond temporal resolution. To the best of our knowledge, this preparation and recording method represents a superior combination of number of recorded neurons and temporal and spatial recording resolutions to any currently available in vivo system. We found that highly connected neurons (“hubs”) were localized to certain time scales, which, we hypothesize, increases the fault tolerance of the network. Conversely, a large proportion of non-hub neurons were not localized to certain time scales. In addition, we found that long and short time scale connectivity was uncorrelated. Finally, we found that long time scale networks were significantly less modular and more disassortative than short time scale networks in both tissue types. As far as we are aware, this analysis represents the first
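
    A minimal plug-in estimator of transfer entropy between two binarized spike trains, with history length one and a single lag, is sketched below. The binning, the absence of bias correction and the toy data are simplifications relative to the multi-timescale analysis described above.

      # Plug-in transfer entropy TE(X -> Y) for binary spike trains with
      # history length 1: TE = sum p(y_t, y_{t-1}, x_{t-1})
      #   * log2[ p(y_t | y_{t-1}, x_{t-1}) / p(y_t | y_{t-1}) ].
      import numpy as np

      def transfer_entropy(x, y):
          yt, yp, xp = y[1:], y[:-1], x[:-1]
          te = 0.0
          for a in (0, 1):
              for b in (0, 1):
                  for c in (0, 1):
                      p_abc = np.mean((yt == a) & (yp == b) & (xp == c))
                      p_bc = np.mean((yp == b) & (xp == c))
                      p_ab = np.mean((yt == a) & (yp == b))
                      p_b = np.mean(yp == b)
                      if min(p_abc, p_bc, p_ab, p_b) > 0:
                          te += p_abc * np.log2((p_abc / p_bc) / (p_ab / p_b))
          return te

      rng = np.random.default_rng(10)
      T = 20000
      x = (rng.random(T) < 0.2).astype(int)
      y = np.zeros(T, int)
      y[1:] = ((rng.random(T - 1) < 0.05) |
               ((x[:-1] == 1) & (rng.random(T - 1) < 0.5))).astype(int)

      print("TE(X -> Y):", transfer_entropy(x, y))   # driven direction
      print("TE(Y -> X):", transfer_entropy(y, x))   # should be near zero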

  11. The role of dimensionality in neuronal network dynamics.

    PubMed

    Ulloa Severino, Francesco Paolo; Ban, Jelena; Song, Qin; Tang, Mingliang; Bianconi, Ginestra; Cheng, Guosheng; Torre, Vincent

    2016-01-01

    Recent results from network theory show that complexity affects several dynamical properties of networks that favor synchronization. Here we show that synchronization in 2D and 3D neuronal networks is significantly different. Using dissociated hippocampal neurons we compared properties of cultures grown on flat 2D substrates with those formed on 3D graphene foam scaffolds. Both 2D and 3D cultures had comparable glia-to-neuron ratios and percentages of GABAergic inhibitory neurons. 3D cultures, because of their dimensionality, have many connections among distant neurons, leading to small-world networks and their characteristic dynamics. After one week, calcium imaging revealed moderately synchronous activity in 2D networks, but the degree of synchrony of 3D networks was higher and had two regimes: a highly synchronized (HS) and a moderately synchronized (MS) regime. The HS regime was never observed in 2D networks. During the MS regime, neuronal assemblies in synchrony changed with time as observed in mammalian brains. After two weeks, the degree of synchrony in 3D networks decreased, as observed in vivo. These results show that dimensionality determines properties of neuronal networks and that several features of brain dynamics are a consequence of its 3D topology. PMID:27404281

  13. Synchronization and rhythm dynamics of a neuronal network consisting of mixed bursting neurons with hybrid synapses

    NASA Astrophysics Data System (ADS)

    Shi, Xia; Xi, Wenqi

    2016-05-01

    In this paper, burst synchronization and rhythm dynamics of a small-world neuronal network consisting of mixed bursting types of neurons coupled via inhibitory-excitatory chemical synapses are explored. Two quantities, the synchronization parameter and the average width factor, are used to characterize the synchronization degree and rhythm dynamics of the neuronal network. Numerical results show that the percentage of inhibitory synapses in the network is the major factor: synchronization exhibits a bell-shaped dependence on it, and the average width factor of the network decreases as it increases. We also find that not only the coupling strength but also the probability of adding random edges to the small-world network can promote the degree of synchronization. The ratio of long-bursting neurons has little effect on the burst synchronization and rhythm dynamics of the network.

  14. Signal propagation through feedforward neuronal networks with different operational modes

    NASA Astrophysics Data System (ADS)

    Li, Jie; Liu, Feng; Xu, Ding; Wang, Wei

    2009-02-01

    How neuronal activity is propagated across multiple layers of neurons is a fundamental issue in neuroscience. Using numerical simulations, we explored how the operational mode of neurons (coincidence detector or temporal integrator) could affect the propagation of rate signals through a 10-layer feedforward network with sparse connectivity. Our study was based on two kinds of neuron models. The Hodgkin-Huxley (HH) neuron can function as a coincidence detector, while the leaky integrate-and-fire (LIF) neuron can act as a temporal integrator. When white noise is afferent to the input layer, rate signals can be stably propagated through both networks, while neurons in deeper layers fire synchronously in the absence of background noise; but the underlying mechanism for the development of synchrony is different. When an aperiodic signal is presented, the network of HH neurons can represent the temporal structure of the signal in firing rate. Meanwhile, synchrony is well developed and is resistant to background noise. In contrast, rate signals are somewhat distorted during the propagation through the network of LIF neurons, and only weak synchrony occurs in deeper layers. That is, coincidence detectors have a performance advantage over temporal integrators in propagating rate signals. Therefore, given weak synaptic conductance and sparse connectivity between layers in both networks, synchrony does greatly subserve the propagation of rate signals with fidelity, and coincidence detection could be of considerable functional significance in cortical processing.

  15. Stimulus-dependent synchronization in delayed-coupled neuronal networks

    PubMed Central

    Esfahani, Zahra G.; Gollo, Leonardo L.; Valizadeh, Alireza

    2016-01-01

    Time delay is a general feature of all interactions. Although the effects of delayed interaction are often neglected when the intrinsic dynamics is much slower than the coupling delay, they can be crucial otherwise. We show that delayed coupled neuronal networks support transitions between synchronous and asynchronous states when the level of input to the network changes. The level of input determines the oscillation period of neurons and hence whether time-delayed connections are synchronizing or desynchronizing. We find that synchronizing connections lead to synchronous dynamics, whereas desynchronizing connections lead to out-of-phase oscillations in network motifs and to frustrated states with asynchronous dynamics in large networks. Since the impact of a neuronal network to downstream neurons increases when spikes are synchronous, networks with delayed connections can serve as gatekeeper layers mediating the firing transfer to other regions. This mechanism can regulate the opening and closing of communicating channels between cortical layers on demand. PMID:27001428

  17. Real-time tracking of neuronal network structure using data assimilation

    NASA Astrophysics Data System (ADS)

    Hamilton, Franz; Berry, Tyrus; Peixoto, Nathalia; Sauer, Timothy

    2013-11-01

    A nonlinear data assimilation technique is applied to determine and track effective connections between ensembles of cultured spinal cord neurons measured with multielectrode arrays. The method is statistical, depending only on confidence intervals, and requiring no form of arbitrary thresholding. In addition, the method updates connection strengths sequentially, enabling real-time tracking of nonstationary networks. The ensemble Kalman filter is used with a generic spiking neuron model to estimate connection strengths as well as other system parameters to deal with model mismatch. The method is validated on noisy synthetic data from Hodgkin-Huxley model neurons before being used to find network connections in the neural culture recordings.
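
    The analysis (update) step of a perturbed-observation ensemble Kalman filter, with the state vector augmented by unknown connection strengths, is the basic ingredient of such tracking schemes and is sketched below; the dimensions, observation operator and noise levels are illustrative choices, not those of the study.

      # Generic ensemble Kalman filter analysis step with parameters (e.g.
      # connection strengths) appended to the state vector. Perturbed-observation
      # variant; dimensions and noise levels are illustrative.
      import numpy as np

      def enkf_update(ensemble, y_obs, H, obs_var, rng):
          """ensemble: (n_state, n_members); y_obs: (n_obs,); H: (n_obs, n_state)."""
          n_obs, n_members = len(y_obs), ensemble.shape[1]
          anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)
          P = anomalies @ anomalies.T / (n_members - 1)        # sample covariance
          R = obs_var * np.eye(n_obs)
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)         # Kalman gain
          # perturbed observations keep the analysis ensemble spread consistent
          Y = y_obs[:, None] + np.sqrt(obs_var) * rng.standard_normal((n_obs, n_members))
          return ensemble + K @ (Y - H @ ensemble)

      rng = np.random.default_rng(11)
      n_state, n_members = 6, 100            # e.g. 4 membrane voltages + 2 coupling strengths
      ensemble = rng.standard_normal((n_state, n_members))
      H = np.zeros((2, n_state)); H[0, 0] = H[1, 1] = 1.0      # observe two voltages only
      y_obs = np.array([0.8, -0.3])
      updated = enkf_update(ensemble, y_obs, H, obs_var=0.05, rng=rng)
      print("posterior mean of the 'coupling' entries:", updated[4:].mean(axis=1))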

  18. On The Use of Dynamic Bayesian Networks in Reconstructing Functional Neuronal Networks from Spike Train Ensembles

    PubMed Central

    Eldawlatly, Seif; Zhou, Yang; Jin, Rong; Oweiss, Karim G.

    2009-01-01

    Coordination among cortical neurons is believed to be a key element in mediating many high-level cortical processes such as perception, attention, learning and memory formation. Inferring the topology of the neural circuitry underlying this coordination is important to characterize the highly non-linear, time-varying interactions between cortical neurons in the presence of complex stimuli. In this work, we investigate the applicability of Dynamic Bayesian Networks (DBNs) in inferring the effective connectivity between spiking cortical neurons from their observed spike trains. We demonstrate that DBNs can infer the underlying non-linear and time-varying causal interactions between these neurons and can discriminate between mono- and polysynaptic links between them under certain constraints governing their putative connectivity. We analyzed conditionally Poisson spike train data mimicking the spiking activity of cortical networks of small and moderately large sizes. The performance was assessed and compared to that of other methods under systematic variations of the network structure to mimic a wide range of responses typically observed in the cortex. Results demonstrate the utility of DBNs in inferring the effective connectivity in cortical networks. PMID:19852619

  19. Temporally tuned neuronal differentiation supports the functional remodeling of a neuronal network in Drosophila.

    PubMed

    Veverytsa, Lyubov; Allan, Douglas W

    2012-03-27

    During insect metamorphosis, neuronal networks undergo extensive remodeling by restructuring their connectivity and recruiting newborn neurons from postembryonic lineages. The neuronal network that directs the essential behavior, ecdysis, generates a distinct behavioral sequence at each developmental transition. Larval ecdysis replaces the cuticle between larval stages, and pupal ecdysis externalizes and expands the head and appendages to their adult position. However, the network changes that support these differences are unknown. Crustacean cardioactive peptide (CCAP) neurons and the peptide hormones they secrete are critical for ecdysis; their targeted ablation alters larval ecdysis progression and results in a failure of pupal ecdysis. In this study, we demonstrate that the CCAP neuron network is remodeled immediately before pupal ecdysis by the emergence of 12 late CCAP neurons. All 12 are CCAP efferents that exit the central nervous system. Importantly, these late CCAP neurons were found to be entirely sufficient for wild-type pupal ecdysis, even after targeted ablation of all other 42 CCAP neurons. Our evidence indicates that late CCAP neurons are derived from early, likely embryonic, lineages. However, they do not differentiate to express their peptide hormone battery, nor do they project an axon via lateral nerve trunks until pupariation, both of which are believed to be critical for the function of CCAP efferent neurons in ecdysis. Further analysis implicated ecdysone signaling via ecdysone receptors A/B1 and the nuclear receptor ftz-f1 as the differentiation trigger. These results demonstrate the utility of temporally tuned neuronal differentiation as a hard-wired developmental mechanism to remodel a neuronal network to generate a scheduled change in behavior. PMID:22393011

  20. Network of hypothalamic neurons that control appetite

    PubMed Central

    Sohn, Jong-Woo

    2015-01-01

    The central nervous system (CNS) controls food intake and energy expenditure through tight coordination among multiple neuronal populations. Specifically, two distinct neuronal populations exist in the arcuate nucleus of the hypothalamus (ARH): the anorexigenic (appetite-suppressing) pro-opiomelanocortin (POMC) neurons and the orexigenic (appetite-increasing) neuropeptide Y (NPY)/agouti-related peptide (AgRP) neurons. The coordinated regulation of the neuronal circuit involving these neurons is essential for properly maintaining energy balance, and any disturbance therein may result in hyperphagia/obesity or hypophagia/starvation. Thus, adequate knowledge of POMC and NPY/AgRP neuron physiology is necessary to understand the pathophysiology of obesity and related metabolic diseases. This review discusses the history of and recent updates on the POMC and NPY/AgRP neuronal circuits, as well as the general anorexigenic and orexigenic circuits in the CNS. [BMB Reports 2015; 48(4): 229-233] PMID:25560696

  1. Small is beautiful: models of small neuronal networks.

    PubMed

    Lamb, Damon G; Calabrese, Ronald L

    2012-08-01

    Modeling has contributed a great deal to our understanding of how individual neurons and neuronal networks function. In this review, we focus on models of the small neuronal networks of invertebrates, especially rhythmically active CPG networks. Models have elucidated many aspects of these networks, from identifying key interacting membrane properties to pointing out gaps in our understanding, for example missing neurons. Even the complex CPGs of vertebrates, such as those that underlie respiration, have been reduced to small network models to great effect. Modeling of these networks spans from simplified models, which are amenable to mathematical analyses, to very complicated biophysical models. Some researchers have now adopted a population approach, where they generate and analyze many related models that differ in a few to several judiciously chosen free parameters; often these parameters show variability across animals and thus justify the approach. Models of small neuronal networks will continue to expand and refine our understanding of how neuronal networks in all animals program motor output, process sensory information and learn. PMID:22364687

  2. Small is beautiful: models of small neuronal networks

    PubMed Central

    Lamb, Damon G; Calabrese, Ronald L

    2013-01-01

    Modeling has contributed a great deal to our understanding of how individual neurons and neuronal networks function. In this review, we focus on models of the small neuronal networks of invertebrates, especially rhythmically active CPG networks. Models have elucidated many aspects of these networks, from identifying key interacting membrane properties to pointing out gaps in our understanding, for example missing neurons. Even the complex CPGs of vertebrates, such as those that underlie respiration, have been reduced to small network models to great effect. Modeling of these networks spans from simplified models, which are amenable to mathematical analyses, to very complicated biophysical models. Some researchers have now adopted a population approach, where they generate and analyze many related models that differ in a few to several judiciously chosen free parameters; often these parameters show variability across animals and thus justify the approach. Models of small neuronal networks will continue to expand and refine our understanding of how neuronal networks in all animals program motor output, process sensory information and learn. PMID:22364687

  3. Highly connected neurons spike less frequently in balanced networks.

    PubMed

    Pyle, Ryan; Rosenbaum, Robert

    2016-04-01

    Biological neuronal networks exhibit highly variable spiking activity. Balanced networks offer a parsimonious model of this variability in which strong excitatory synaptic inputs are canceled by strong inhibitory inputs on average, and irregular spiking activity is driven by fluctuating synaptic currents. Most previous studies of balanced networks assume a homogeneous or distance-dependent connectivity structure, but connectivity in biological cortical networks is more intricate. We use a heterogeneous mean-field theory of balanced networks to show that heterogeneous in-degrees can break balance. Moreover, heterogeneous architectures that achieve balance promote lower firing rates in neurons with larger in-degrees, consistent with some recent experimental observations. PMID:27176240

  4. Highly connected neurons spike less frequently in balanced networks

    NASA Astrophysics Data System (ADS)

    Pyle, Ryan; Rosenbaum, Robert

    2016-04-01

    Biological neuronal networks exhibit highly variable spiking activity. Balanced networks offer a parsimonious model of this variability in which strong excitatory synaptic inputs are canceled by strong inhibitory inputs on average, and irregular spiking activity is driven by fluctuating synaptic currents. Most previous studies of balanced networks assume a homogeneous or distance-dependent connectivity structure, but connectivity in biological cortical networks is more intricate. We use a heterogeneous mean-field theory of balanced networks to show that heterogeneous in-degrees can break balance. Moreover, heterogeneous architectures that achieve balance promote lower firing rates in neurons with larger in-degrees, consistent with some recent experimental observations.

  5. Colloid-guided assembly of oriented 3D neuronal networks

    PubMed Central

    Pautot, Sophie; Wyart, Claire; Isacoff, Ehud Y

    2009-01-01

    A central challenge in neuroscience is to understand the formation and function of three-dimensional (3D) neuronal networks. In vitro studies have been mainly limited to measurements of small numbers of neurons connected in two dimensions. Here we demonstrate the use of colloids as moveable supports for neuronal growth, maturation, transfection and manipulation, where the colloids serve as guides for the assembly of controlled 3D, millimeter-sized neuronal networks. Process growth can be guided into layered connectivity with a density similar to what is found in vivo. The colloidal superstructures are optically transparent, enabling remote stimulation and recording of neuronal activity using layer-specific expression of light-activated channels and indicator dyes. The modular approach toward in vitro circuit construction provides a stepping stone for applications ranging from basic neuroscience to neuron-based screening of targeted drugs. PMID:18641658

  6. Scalable Semisupervised Functional Neurocartography Reveals Canonical Neurons in Behavioral Networks.

    PubMed

    Frady, E Paxon; Kapoor, Ashish; Horvitz, Eric; Kristan, William B

    2016-08-01

    Large-scale data collection efforts to map the brain are underway at multiple spatial and temporal scales, but all face fundamental problems posed by high-dimensional data and intersubject variability. Even seemingly simple problems, such as identifying a neuron/brain region across animals/subjects, become exponentially more difficult in high dimensions, such as recognizing dozens of neurons/brain regions simultaneously. We present a framework and tools for functional neurocartography-the large-scale mapping of neural activity during behavioral states. Using a voltage-sensitive dye (VSD), we imaged the multifunctional responses of hundreds of leech neurons during several behaviors to identify and functionally map homologous neurons. We extracted simple features from each of these behaviors and combined them with anatomical features to create a rich medium-dimensional feature space. This enabled us to use machine learning techniques and visualizations to characterize and account for intersubject variability, piece together a canonical atlas of neural activity, and identify two behavioral networks. We identified 39 neurons (18 pairs, 3 unpaired) as part of a canonical swim network and 17 neurons (8 pairs, 1 unpaired) involved in a partially overlapping preparatory network. All neurons in the preparatory network rapidly depolarized at the onsets of each behavior, suggesting that it is part of a dedicated rapid-response network. This network is likely mediated by the S cell, and we referenced VSD recordings to an activity atlas to identify multiple cells of interest simultaneously in real time for further experiments. We targeted and electrophysiologically verified several neurons in the swim network and further showed that the S cell is presynaptic to multiple neurons in the preparatory network. This study illustrates the basic framework to map neural activity in high dimensions with large-scale recordings and how to extract the rich information necessary to perform

  7. Emerging dynamics in neuronal networks of diffusively coupled hard oscillators.

    PubMed

    Ponta, L; Lanza, V; Bonnin, M; Corinto, F

    2011-06-01

    Oscillatory networks are a special class of neural networks where each neuron exhibits time-periodic behavior. They represent bio-inspired architectures that can be exploited to model biological processes such as the binding problem and selective attention. In this paper we investigate the dynamics of networks whose neurons are hard oscillators, namely, they exhibit the coexistence of different stable attractors. We consider a constant external stimulus applied to each neuron, which influences the neuron's own natural frequency. We show that, due to the interaction between different kinds of attractors, as well as between attractors and repellors, new and interesting dynamics arise, in the form of synchronous oscillations of various amplitudes. We also show that neurons subject to different stimuli are able to synchronize if their couplings are strong enough. PMID:21411276

  8. Intermittent synchronization in a network of bursting neurons

    NASA Astrophysics Data System (ADS)

    Park, Choongseok; Rubchinsky, Leonid L.

    2011-09-01

    Synchronized oscillations in networks of inhibitory and excitatory coupled bursting neurons are common in a variety of neural systems from central pattern generators to human brain circuits. One example of the latter is the subcortical network of the basal ganglia, formed by excitatory and inhibitory bursters of the subthalamic nucleus and globus pallidus, involved in motor control and affected in Parkinson's disease. Recent experiments have demonstrated the intermittent nature of the phase-locking of neural activity in this network. Here, we explore one potential mechanism to explain the intermittent phase-locking in a network. We simplify the network to obtain a model of two inhibitory coupled elements and explore its dynamics. We used geometric analysis and singular perturbation methods for dynamical systems to reduce the full model to a simpler set of equations. Mathematical analysis was completed using three slow variables with two different time scales. Intermittently synchronous oscillations are generated by overlapped spiking, which crucially depends on the geometry of the slow phase plane and the interplay between slow variables, as well as on the strength of synapses. Two slow variables are responsible for the generation of activity patterns with overlapped spiking, and the other slower variable enhances the robustness of an irregular and intermittent activity pattern. While the analyzed network and the explored mechanism of intermittent synchrony appear to be quite generic, the results of this analysis can be used to trace particular values of biophysical parameters (synaptic strength and parameters of calcium dynamics), which are known to be impacted in Parkinson's disease.

  9. Qualitative-Modeling-Based Silicon Neurons and Their Networks

    PubMed Central

    Kohno, Takashi; Sekikawa, Munehisa; Li, Jing; Nanami, Takuya; Aihara, Kazuyuki

    2016-01-01

    The ionic conductance models of neuronal cells can finely reproduce a wide variety of complex neuronal activities. However, the complexity of these models has prompted the development of qualitative neuron models. They are described by differential equations with a reduced number of variables and their low-dimensional polynomials, which retain the core mathematical structures. Such simple models form the foundation of a bottom-up approach in computational and theoretical neuroscience. We proposed a qualitative-modeling-based approach for designing silicon neuron circuits, in which the mathematical structures in the polynomial-based qualitative models are reproduced by differential equations with silicon-native expressions. This approach can realize low-power-consuming circuits that can be configured to realize various classes of neuronal cells. In this article, our qualitative-modeling-based silicon neuron circuits for analog and digital implementations are quickly reviewed. One of our CMOS analog silicon neuron circuits can realize a variety of neuronal activities with a power consumption less than 72 nW. The square-wave bursting mode of this circuit is explained. Another circuit can realize Class I and II neuronal activities with about 3 nW. Our digital silicon neuron circuit can also realize these classes. An auto-associative memory realized on an all-to-all connected network of these silicon neurons is also reviewed, in which the neuron class plays important roles in its performance. PMID:27378842

  10. Emergence of Slow-Switching Assemblies in Structured Neuronal Networks

    PubMed Central

    Schaub, Michael T.; Billeh, Yazan N.; Anastassiou, Costas A.; Koch, Christof; Barahona, Mauricio

    2015-01-01

    Unraveling the interplay between connectivity and spatio-temporal dynamics in neuronal networks is a key step to advance our understanding of neuronal information processing. Here we investigate how particular features of network connectivity underpin the propensity of neural networks to generate slow-switching assembly (SSA) dynamics, i.e., sustained epochs of increased firing within assemblies of neurons which transition slowly between different assemblies throughout the network. We show that the emergence of SSA activity is linked to spectral properties of the asymmetric synaptic weight matrix. In particular, the leading eigenvalues that dictate the slow dynamics exhibit a gap with respect to the bulk of the spectrum, and the associated Schur vectors exhibit a measure of block-localization on groups of neurons, thus resulting in coherent dynamical activity on those groups. Through simple rate models, we gain analytical understanding of the origin and importance of the spectral gap, and use these insights to develop new network topologies with alternative connectivity paradigms which also display SSA activity. Specifically, SSA dynamics involving excitatory and inhibitory neurons can be achieved by modifying the connectivity patterns between both types of neurons. We also show that SSA activity can occur at multiple timescales reflecting a hierarchy in the connectivity, and demonstrate the emergence of SSA in small-world like networks. Our work provides a step towards understanding how network structure (uncovered through advancements in neuroanatomy and connectomics) can impact on spatio-temporal neural activity and constrain the resulting dynamics. PMID:26176664

  11. Genotoxicants Target Distinct Molecular Networks in Neonatal Neurons

    PubMed Central

    Kisby, Glen E.; Olivas, Antoinette; Standley, Melissa; Lu, Xinfang; Pattee, Patrick; O’Malley, Jean; Li, Xiaorong; Muniz, Juan; Nagalla, Srinavasa R.

    2006-01-01

    Background: Exposure of the brain to environmental agents during critical periods of neuronal development is considered a key factor underlying many neurologic disorders. Objectives: In this study we examined the influence of genotoxicants on cerebellar function during early development by measuring global gene expression changes. Methods: We measured global gene expression in immature cerebellar neurons (i.e., granule cells) after treatment with two distinct alkylating agents, methylazoxymethanol (MAM) and nitrogen mustard (HN2). Granule cell cultures were treated for 24 hr with MAM (10–1,000 μM) or HN2 (0.1–20 μM) and examined for cell viability, DNA damage, and markers of apoptosis. Results: Neuronal viability was significantly reduced (p < 0.01) at concentrations > 500 μM for MAM and > 1.0 μM for HN2; this correlated with an increase in both DNA damage and markers of apoptosis. Neuronal cultures treated with sublethal concentrations of MAM (100 μM) or HN2 (1.0 μM) were then examined for gene expression using large-scale mouse cDNA microarrays (27,648). Gene expression results revealed that a) global gene expression was predominantly up-regulated by both genotoxicants; b) the number of down-regulated genes was approximately 3-fold greater for HN2 than for MAM; and c) distinct classes of molecules were influenced by MAM (i.e., neuronal differentiation, the stress and immune response, and signal transduction) and HN2 (i.e., protein synthesis and apoptosis). Conclusions: These studies demonstrate that individual genotoxicants induce distinct gene expression signatures. Further study of these molecular networks may explain the variable response of the developing brain to different types of environmental genotoxicants. PMID:17107856

  12. Visualizing Neuronal Network Connectivity with Connectivity Pattern Tables

    PubMed Central

    Nordlie, Eilen; Plesser, Hans Ekkehard

    2009-01-01

    Complex ideas are best conveyed through well-designed illustrations. Up to now, computational neuroscientists have mostly relied on box-and-arrow diagrams of even complex neuronal networks, often using ad hoc notations with conflicting use of symbols from paper to paper. This significantly impedes the communication of ideas in neuronal network modeling. We present here Connectivity Pattern Tables (CPTs) as a clutter-free visualization of connectivity in large neuronal networks containing two-dimensional populations of neurons. CPTs can be generated automatically from the same script code used to create the actual network in the NEST simulator. Through aggregation, CPTs can be viewed at different levels, providing either full detail or summary information. We also provide the open source ConnPlotter tool as a means to create connectivity pattern tables. PMID:20140265

  13. An FPGA-Based Silicon Neuronal Network with Selectable Excitability Silicon Neurons.

    PubMed

    Li, Jing; Katori, Yuichi; Kohno, Takashi

    2012-01-01

    This paper presents a digital silicon neuronal network that simulates the nervous system of living organisms and can execute intelligent tasks such as associative memory. Two essential elements, the mathematical-structure-based digital spiking silicon neuron (DSSN) and the transmitter-release-based silicon synapse, allow us to tune the excitability of silicon neurons and are computationally efficient for hardware implementation. We adopt a mixed pipelined and parallel structure together with shift operations to design a sufficiently large and complex network without excessive hardware resource cost. The network, with 256 fully connected neurons, is built on a Digilent Atlys board equipped with a Xilinx Spartan-6 LX45 FPGA. In addition, a memory control block and a USB control block are designed to handle data communication between the network and the host PC. This paper also describes the mechanism of associative memory performed in the silicon neuronal network. The network is capable of retrieving stored patterns if the inputs contain enough information about them. The retrieval probability increases as the similarity between the input and the stored pattern increases. Synchronization of neurons is observed when successful retrieval of a stored pattern occurs. PMID:23269911

  14. Mapping Generative Models onto a Network of Digital Spiking Neurons.

    PubMed

    Pedroni, Bruno U; Das, Srinjoy; Arthur, John V; Merolla, Paul A; Jackson, Bryan L; Modha, Dharmendra S; Kreutz-Delgado, Kenneth; Cauwenberghs, Gert

    2016-08-01

    Stochastic neural networks such as Restricted Boltzmann Machines (RBMs) have been successfully used in applications ranging from speech recognition to image classification, and are particularly interesting because of their potential for generative tasks. Inference and learning in these algorithms use a Markov Chain Monte Carlo procedure called Gibbs sampling, where a logistic function forms the kernel of this sampler. On the other side of the spectrum, neuromorphic systems have shown great promise for low-power and parallelized cognitive computing, but lack well-suited applications and automation procedures. In this work, we propose a systematic method for bridging the RBM algorithm and digital neuromorphic systems, with a generative pattern completion task as proof of concept. For this, we first propose a method of producing the Gibbs sampler using bio-inspired digital noisy integrate-and-fire neurons. Next, we describe the process of mapping generative RBMs trained offline onto the IBM TrueNorth neurosynaptic processor-a low-power digital neuromorphic VLSI substrate. Mapping these algorithms onto neuromorphic hardware presents unique challenges in network connectivity and weight and bias quantization, which, in turn, require architectural and design strategies for the physical realization. Generative performance is analyzed to validate the neuromorphic requirements and to best select the neuron parameters for the model. Lastly, we describe a design automation procedure which achieves optimal resource usage, accounting for the novel hardware adaptations. This work represents the first implementation of generative RBM inference on a neuromorphic VLSI substrate. PMID:27214915
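
    The key computational ingredient described above is a neuron whose spike probability reproduces the logistic Gibbs kernel of an RBM. A minimal software sketch of that idea follows, assuming a threshold unit with additive logistic noise and a toy, randomly initialized RBM; nothing here reflects the TrueNorth mapping, the weight/bias quantization, or the design-automation steps of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_threshold_sample(pre_activation, rng):
    """Spike iff pre-activation plus logistic noise exceeds zero.

    Because the logistic CDF is the sigmoid, P(spike) = sigmoid(pre_activation),
    so a layer of such noisy threshold units realizes the Gibbs kernel of an RBM.
    """
    noise = rng.logistic(loc=0.0, scale=1.0, size=np.shape(pre_activation))
    return (pre_activation + noise > 0).astype(float)

# Tiny RBM with random toy parameters: W (visible x hidden), biases b_v, b_h.
nv, nh = 6, 4
W = 0.5 * rng.standard_normal((nv, nh))
b_v = np.zeros(nv)
b_h = np.zeros(nh)

v = rng.integers(0, 2, nv).astype(float)    # initialize visible units
for step in range(200):                      # block Gibbs sampling
    h = noisy_threshold_sample(v @ W + b_h, rng)
    v = noisy_threshold_sample(h @ W.T + b_v, rng)

print("sample of visible units after sampling:", v)
```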

  15. Cluster synchronization in networks of neurons with chemical synapses

    SciTech Connect

    Juang, Jonq; Liang, Yu-Hao

    2014-03-15

    In this work, we study the cluster synchronization of chemically coupled and generally formulated networks that are allowed to be nonidentical. A sufficient condition for the existence of stable synchronous clusters is derived. Specifically, we only need to check the stability of the origins of m decoupled linear systems, where m is the number of subpopulations. Examples of nonidentical networks are provided, such as Hindmarsh-Rose (HR) neurons with various choices of parameters in different subpopulations, or HR neurons in one subpopulation and FitzHugh-Nagumo neurons in the other. An explicit threshold for the coupling strength that guarantees stable cluster synchronization can be obtained.

  16. Ordering spatiotemporal chaos in complex thermosensitive neuron networks

    NASA Astrophysics Data System (ADS)

    Gong, Yubing; Xu, Bo; Xu, Qiang; Yang, Chuanlu; Ren, Tingqi; Hou, Zhonghuai; Xin, Houwen

    2006-04-01

    We have studied the effect of random long-range connections in chaotic thermosensitive neuron networks, with each neuron capable of exhibiting diverse bursting behaviors, and found stochastic synchronization and optimal spatiotemporal patterns. For a given coupling strength, the chaotic burst-firings of the neurons become more and more synchronized as the number of random connections (or randomness) is increased, and the most pronounced spatiotemporal pattern appears at an optimal randomness. As the coupling strength is increased, the optimal randomness shifts towards a smaller value. This result shows that random long-range connections can tame the chaos in these neural networks and help the neurons reach synchronization more effectively. Since the model studied can be used to account for hypothalamic neurons of dogfish, catfish, etc., this result may reflect the significant role of random connections in transferring biological information.

  17. Developing neuronal networks: Self-organized criticality predicts the future

    NASA Astrophysics Data System (ADS)

    Pu, Jiangbo; Gong, Hui; Li, Xiangning; Luo, Qingming

    2013-01-01

    Self-organized criticality emerging in neural activity is one of the key concepts used to describe the formation and function of developing neuronal networks. The relationship between critical dynamics and neural development is both theoretically and experimentally appealing. However, whereas it is well known that cortical networks exhibit a rich repertoire of activity patterns at different stages of in vitro maturation, the dynamics of activity patterns across the entire course of neural development remain unclear. Here we show that a series of metastable network states emerged in the developing and "aging" process of hippocampal networks cultured from dissociated rat neurons. The unidirectional sequence of state transitions could only be observed in networks showing power-law scaling of distributed neuronal avalanches. Our data suggest that self-organized criticality may guide spontaneous activity into a sequential succession of homeostatically regulated transient patterns during development, which may help to predict the trajectory of neural development from early ages.

  18. Structure-Dynamics Relationships in Bursting Neuronal Networks Revealed Using a Prediction Framework

    PubMed Central

    Mäki-Marttunen, Tuomo; Aćimović, Jugoslava; Ruohonen, Keijo; Linne, Marja-Leena

    2013-01-01

    The question of how the structure of a neuronal network affects its functionality has gained a lot of attention in neuroscience. However, the vast majority of the studies on structure-dynamics relationships consider few types of network structures and assess limited numbers of structural measures. In this in silico study, we employ a wide diversity of network topologies and search among many possibilities the aspects of structure that have the greatest effect on the network excitability. The network activity is simulated using two point-neuron models, where the neurons are activated by noisy fluctuation of the membrane potential and their connections are described by chemical synapse models, and statistics on the number and quality of the emergent network bursts are collected for each network type. We apply a prediction framework to the obtained data in order to find out the most relevant aspects of network structure. In this framework, predictors that use different sets of graph-theoretic measures are trained to estimate the activity properties, such as burst count or burst length, of the networks. The performances of these predictors are compared with each other. We show that the best performance in prediction of activity properties for networks with sharp in-degree distribution is obtained when the prediction is based on clustering coefficient. By contrast, for networks with broad in-degree distribution, the maximum eigenvalue of the connectivity graph gives the most accurate prediction. The results shown for small networks hold with few exceptions when different neuron models, different choices of neuron population and different average degrees are applied. We confirm our conclusions using larger networks as well. Our findings reveal the relevance of different aspects of network structure from the viewpoint of network excitability, and our integrative method could serve as a general framework for structure-dynamics studies in biosciences. PMID:23935998
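
    The two structural measures highlighted above, the clustering coefficient and the leading eigenvalue of the connectivity graph, are straightforward to compute; the sketch below pairs them with a simple regressor as a stand-in for the prediction framework. It assumes networkx and scikit-learn are available and uses synthetic random digraphs with a fabricated "activity property" purely to make the pipeline runnable; it is not the paper's simulation or predictor.

```python
import numpy as np
import networkx as nx
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)

def graph_features(adjacency):
    """Two of the structural measures discussed above for a directed graph."""
    g = nx.from_numpy_array(adjacency, create_using=nx.DiGraph)
    clustering = nx.average_clustering(g)                     # mean clustering coefficient
    lead_eig = np.max(np.abs(np.linalg.eigvals(adjacency)))   # leading eigenvalue modulus
    return clustering, lead_eig

# Synthetic stand-in data: random directed networks and a placeholder "burst count"
# that here is just a noisy function of the leading eigenvalue.
X, y = [], []
for _ in range(50):
    n, p = 50, rng.uniform(0.05, 0.2)
    adj = (rng.random((n, n)) < p).astype(float)
    np.fill_diagonal(adj, 0.0)
    feats = graph_features(adj)
    X.append(feats)
    y.append(2.0 * feats[1] + rng.normal(0, 0.5))   # fabricated activity property

model = LinearRegression().fit(np.array(X), np.array(y))
print("R^2 on training data:", model.score(np.array(X), np.array(y)))
```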

  19. Structure-dynamics relationships in bursting neuronal networks revealed using a prediction framework.

    PubMed

    Mäki-Marttunen, Tuomo; Aćimović, Jugoslava; Ruohonen, Keijo; Linne, Marja-Leena

    2013-01-01

    The question of how the structure of a neuronal network affects its functionality has gained a lot of attention in neuroscience. However, the vast majority of the studies on structure-dynamics relationships consider few types of network structures and assess limited numbers of structural measures. In this in silico study, we employ a wide diversity of network topologies and search among many possibilities the aspects of structure that have the greatest effect on the network excitability. The network activity is simulated using two point-neuron models, where the neurons are activated by noisy fluctuation of the membrane potential and their connections are described by chemical synapse models, and statistics on the number and quality of the emergent network bursts are collected for each network type. We apply a prediction framework to the obtained data in order to find out the most relevant aspects of network structure. In this framework, predictors that use different sets of graph-theoretic measures are trained to estimate the activity properties, such as burst count or burst length, of the networks. The performances of these predictors are compared with each other. We show that the best performance in prediction of activity properties for networks with sharp in-degree distribution is obtained when the prediction is based on clustering coefficient. By contrast, for networks with broad in-degree distribution, the maximum eigenvalue of the connectivity graph gives the most accurate prediction. The results shown for small ([Formula: see text]) networks hold with few exceptions when different neuron models, different choices of neuron population and different average degrees are applied. We confirm our conclusions using larger ([Formula: see text]) networks as well. Our findings reveal the relevance of different aspects of network structure from the viewpoint of network excitability, and our integrative method could serve as a general framework for structure

  20. Transient electrical coupling regulates formation of neuronal networks.

    PubMed

    Szabo, Theresa M; Zoran, Mark J

    2007-01-19

    Electrical synapses are abundant before and during developmental windows of intense chemical synapse formation, and might therefore contribute to the establishment of neuronal networks. Transient electrical coupling develops and is then eliminated between regenerating Helisoma motoneurons 110 and 19 during a period of 48-72 h in vivo and in vitro following nerve injury. An inverse relationship exists between electrical coupling and chemical synaptic transmission at these synapses, such that the decline in electrical coupling is coincident with the emergence of cholinergic synaptic transmission. In this study, we have generated two- and three-cell neuronal networks to test whether predicted synaptogenic capabilities were affected by previous synaptic interactions. Electrophysiological analyses demonstrated that synapses formed in three-cell neuronal networks were not those predicted based on synaptogenic outcomes in two-cell networks. Thus, new electrical and chemical synapse formation within a neuronal network is dependent on existing connectivity of that network. In addition, new contacts formed with established networks have little impact on these existing connections. These results suggest that network-dependent mechanisms, particularly those mediated by gap junctional coupling, regulate synapse formation within simple neural networks. PMID:17156754

  1. Spike-timing error backpropagation in theta neuron networks.

    PubMed

    McKennoch, Sam; Voegtlin, Thomas; Bushnell, Linda

    2009-01-01

    The main contribution of this letter is the derivation of a steepest gradient descent learning rule for a multilayer network of theta neurons, a one-dimensional nonlinear neuron model. Central to our model is the assumption that the intrinsic neuron dynamics are sufficient to achieve consistent time coding, with no need to involve the precise shape of postsynaptic currents; this assumption departs from other related models such as SpikeProp and Tempotron learning. Our results clearly show that it is possible to perform complex computations by applying supervised learning techniques to the spike times and time response properties of nonlinear integrate-and-fire neurons. Networks trained with our multilayer training rule are shown to have generalization abilities for spike latency pattern classification similar to those of Tempotron learning. The rule is also able to train networks to perform complex regression tasks that neither SpikeProp nor Tempotron learning appears to be capable of. PMID:19431278
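
    For readers unfamiliar with the theta neuron used above, a minimal forward simulation is sketched below (Euler integration, with a spike registered when the phase crosses π). It only reproduces the neuron model's spike times, not the gradient-descent learning rule derived in the letter; the integration step and drive value are illustrative choices.

```python
import numpy as np

def theta_neuron_spike_times(I, t_max=100.0, dt=0.01, theta0=-np.pi + 0.01):
    """Euler-integrate the theta neuron
    dtheta/dt = (1 - cos(theta)) + (1 + cos(theta)) * I(t).

    A spike is registered whenever theta crosses pi (the phase then wraps to -pi).
    I can be a constant or a function of time.
    """
    drive = I if callable(I) else (lambda t: I)
    theta = theta0
    spikes = []
    for step in range(int(t_max / dt)):
        t = step * dt
        dtheta = (1.0 - np.cos(theta)) + (1.0 + np.cos(theta)) * drive(t)
        theta += dt * dtheta
        if theta >= np.pi:            # spike and wrap the phase
            spikes.append(t)
            theta -= 2.0 * np.pi
    return np.array(spikes)

# With constant suprathreshold drive the neuron fires periodically; the dependence
# of these spike times on the inputs is what a spike-timing learning rule differentiates.
print(theta_neuron_spike_times(0.2)[:5])
```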

  2. Network and neuronal membrane properties in hybrid networks reciprocally regulate selectivity to rapid thalamocortical inputs.

    PubMed

    Pesavento, Michael J; Pinto, David J

    2012-11-01

    Rapidly changing environments require rapid processing from sensory inputs. Varying deflection velocities of a rodent's primary facial vibrissa cause varying temporal neuronal activity profiles within the ventral posteromedial thalamic nucleus. Local neuron populations in a single somatosensory layer 4 barrel transform sparsely coded input into a spike count based on the input's temporal profile. We investigate this transformation by creating a barrel-like hybrid network with whole-cell recordings of in vitro neurons from a cortical slice preparation, embedding the biological neuron in the simulated network by presenting virtual synaptic conductances via a conductance clamp. Utilizing the hybrid network, we examine how the reciprocal network properties (local excitatory and inhibitory synaptic convergence) and neuronal membrane properties (input resistance) alter the barrel population response to diverse thalamic input. In the presence of local network input, neurons are more selective to thalamic input timing; this arises from strong feedforward inhibition. Strongly inhibitory (damping) network regimes are more selective to timing and less selective to the magnitude of input but require stronger initial input. Input selectivity relies heavily on the different membrane properties of excitatory and inhibitory neurons. When inhibitory and excitatory neurons had identical membrane properties, the sensitivity of in vitro neurons to temporal vs. magnitude features of input was substantially reduced. Increasing the mean leak conductance of the inhibitory cells decreased the network's temporal sensitivity, whereas increasing excitatory leak conductance enhanced magnitude sensitivity. Local network synapses are essential in shaping thalamic input, and differing membrane properties of functional classes reciprocally modulate this effect. PMID:22896716

  3. Effects of extracellular potassium diffusion on electrically coupled neuron networks

    NASA Astrophysics Data System (ADS)

    Wu, Xing-Xing; Shuai, Jianwei

    2015-02-01

    Potassium accumulation and diffusion during neuronal epileptiform activity have been observed experimentally, and potassium lateral diffusion has been suggested to play an important role in nonsynaptic neuron networks. We adopt a hippocampal CA1 pyramidal neuron network in a zero-calcium condition to better understand the influence of extracellular potassium dynamics on the stimulus-induced activity. The potassium concentration in the interstitial space for each neuron is regulated by potassium currents, Na+-K+ pumps, glial buffering, and ion diffusion. In addition to potassium diffusion, nearby neurons are also coupled through gap junctions. Our results reveal that the latency of the first spike responding to stimulus monotonically decreases with increasing gap-junction conductance but is insensitive to potassium diffusive coupling. The duration of network oscillations shows a bell-like shape with increasing potassium diffusive coupling at weak gap-junction coupling. For modest electrical coupling, there is an optimal K+ diffusion strength, at which the flow of potassium ions among the network neurons appropriately modulates interstitial potassium concentrations in a degree that provides the most favorable environment for the generation and continuance of the action potential waves in the network.

  4. Balanced Networks of Spiking Neurons with Spatially Dependent Recurrent Connections

    NASA Astrophysics Data System (ADS)

    Rosenbaum, Robert; Doiron, Brent

    2014-04-01

    Networks of model neurons with balanced recurrent excitation and inhibition capture the irregular and asynchronous spiking activity reported in cortex. While mean-field theories of spatially homogeneous balanced networks are well understood, a mean-field analysis of spatially heterogeneous balanced networks has not been fully developed. We extend the analysis of balanced networks to include a connection probability that depends on the spatial separation between neurons. In the continuum limit, we derive that stable, balanced firing rate solutions require that the spatial spread of external inputs be broader than that of recurrent excitation, which in turn must be broader than or equal to that of recurrent inhibition. Notably, this implies that network models with broad recurrent inhibition are inconsistent with the balanced state. For finite size networks, we investigate the pattern-forming dynamics arising when balanced conditions are not satisfied. Our study highlights the new challenges that balanced networks pose for the spatiotemporal dynamics of complex systems.

  5. The estimation of neurotransmitter release probability in feedforward neuronal network based on adaptive synchronization

    NASA Astrophysics Data System (ADS)

    Xue, Ming; Wang, Jiang; Jia, Chenhui; Yu, Haitao; Deng, Bin; Wei, Xile; Che, Yanqiu

    2013-03-01

    In this paper, we propose a new approach to estimate the unknown parameters and topology of a neuronal network based on an adaptive synchronization control scheme. A virtual neuronal network is constructed as an observer to track the membrane potential of the corresponding neurons in the original network. When the two achieve synchronization, the unknown parameters and topology of the original network are obtained. The method is applied to estimate the real-time status of connections in the feedforward network, and the neurotransmitter release probability of unreliable synapses is obtained by statistical computation. Numerical simulations are also performed to demonstrate the effectiveness of the proposed adaptive controller. The obtained results may have important implications for system identification in neuroscience.
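
    The observer idea above can be illustrated with a textbook adaptive-synchronization scheme for a single unknown parameter: a virtual neuron copies the measured neuron's dynamics, adds an error-correcting term, and adapts its parameter estimate until the two synchronize. The leaky-membrane model, the smooth presynaptic drive, and all gains below are illustrative assumptions, not the paper's feedforward-network construction.

```python
import numpy as np

# True neuron: leaky membrane x' = -x/tau + theta * s(t), with theta UNKNOWN and
# s(t) a known, persistently exciting presynaptic drive.  The virtual (observer)
# neuron copies the dynamics, adds a correction proportional to the tracking error,
# and adapts theta_hat; when the two synchronize, theta_hat approaches theta
# (standard Lyapunov-based adaptive-observer argument).
dt, T = 0.001, 20.0
n_steps = int(T / dt)
tau, theta_true = 0.05, 1.7
L, gamma = 20.0, 10.0                      # observer gain and adaptation rate

def drive(t):
    return 1.0 + 0.8 * np.sin(2.0 * np.pi * 1.3 * t)   # smooth presynaptic drive

x, x_hat, theta_hat = 0.0, 0.0, 0.0
for k in range(n_steps):
    t = k * dt
    e = x - x_hat
    s = drive(t)
    x     += dt * (-x / tau + theta_true * s)            # true membrane (Euler step)
    x_hat += dt * (-x_hat / tau + theta_hat * s + L * e)  # observer with error feedback
    theta_hat += dt * gamma * s * e                       # adaptation law

print(f"theta_hat = {theta_hat:.3f} (true value {theta_true})")
```

    With a persistently exciting drive, the synchronization error and the parameter error both decay, which is the mechanism the paper exploits to read off connection parameters once the virtual network locks onto the recorded one.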

  6. Neural Dynamics as Sampling: A Model for Stochastic Computation in Recurrent Networks of Spiking Neurons

    PubMed Central

    Buesing, Lars; Bill, Johannes; Nessler, Bernhard; Maass, Wolfgang

    2011-01-01

    The organization of computations in networks of spiking neurons in the brain is still largely unknown, in particular in view of the inherently stochastic features of their firing activity and the experimentally observed trial-to-trial variability of neural systems in the brain. In principle there exists a powerful computational framework for stochastic computations, probabilistic inference by sampling, which can explain a large number of macroscopic experimental data in neuroscience and cognitive science. But it has turned out to be surprisingly difficult to create a link between these abstract models for stochastic computations and more detailed models of the dynamics of networks of spiking neurons. Here we create such a link and show that under some conditions the stochastic firing activity of networks of spiking neurons can be interpreted as probabilistic inference via Markov chain Monte Carlo (MCMC) sampling. Since common methods for MCMC sampling in distributed systems, such as Gibbs sampling, are inconsistent with the dynamics of spiking neurons, we introduce a different approach based on non-reversible Markov chains that is able to reflect inherent temporal processes of spiking neuronal activity through a suitable choice of random variables. We propose a neural network model and show by a rigorous theoretical analysis that its neural activity implements MCMC sampling of a given distribution, both for the case of discrete and continuous time. This provides a step towards closing the gap between abstract functional models of cortical computation and more detailed models of networks of spiking neurons. PMID:22096452

  7. Vibrational resonance in a heterogeneous scale free network of neurons

    NASA Astrophysics Data System (ADS)

    Uzuntarla, Muhammet; Yilmaz, Ergin; Wagemakers, Alexandre; Ozer, Mahmut

    2015-05-01

    Vibrational resonance (VR) is a phenomenon whereby the response of some dynamical systems to a weak low-frequency signal can be maximized with the assistance of an optimal intensity of another high-frequency signal. In this paper, we study VR in a heterogeneous neural system with a complex network topology. We consider a scale-free network of neurons where the heterogeneity is in the intrinsic excitability of the individual neurons. It is shown that the emergence of VR in a heterogeneous neuron population requires less energy than in a homogeneous population. We also find that the electrical coupling strength among neurons plays a key role in determining the weak-signal processing capacity of the heterogeneous population. Lastly, we investigate the influence of interneuronal link density on VR and demonstrate that the energy needed to obtain the resonance grows with the increase in average degree.
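
    A common way to demonstrate vibrational resonance numerically is to drive a single excitable unit with a two-frequency signal and measure the Fourier amplitude of its output at the low frequency while sweeping the high-frequency amplitude. The sketch below does this for a FitzHugh-Nagumo unit; the model and all parameter values are illustrative assumptions (the study itself uses a scale-free network of heterogeneous neurons), and the amplitudes may need tuning to expose a clear resonance peak.

```python
import numpy as np

def fhn_response(B, A=0.1, omega=0.05, Omega=5.0, T=1000.0, dt=0.01):
    """Response of a FitzHugh-Nagumo unit at the LOW frequency omega when driven
    by A*cos(omega t) + B*cos(Omega t).  Returns the Fourier amplitude of v at omega."""
    n = int(T / dt)
    v, w = -1.0, -0.5
    sin_sum = cos_sum = 0.0
    for k in range(n):
        t = k * dt
        I = A * np.cos(omega * t) + B * np.cos(Omega * t)
        dv = v - v ** 3 / 3.0 - w + I
        dw = 0.08 * (v + 0.7 - 0.8 * w)
        v += dt * dv
        w += dt * dw
        sin_sum += v * np.sin(omega * t) * dt
        cos_sum += v * np.cos(omega * t) * dt
    return 2.0 * np.hypot(sin_sum, cos_sum) / T

# Sweeping the high-frequency amplitude B may reveal a maximum of the response Q
# at an intermediate B -- the vibrational resonance discussed above.
for B in (0.0, 0.2, 0.4, 0.6, 0.8):
    print(B, round(fhn_response(B), 4))
```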

  8. Anticipated synchronization in neuronal network motifs

    NASA Astrophysics Data System (ADS)

    Matias, F. S.; Gollo, L. L.; Carelli, P. V.; Copelli, M.; Mirasso, C. R.

    2013-01-01

    Two identical dynamical systems coupled unidirectionally (in a so-called master-slave configuration) exhibit anticipated synchronization (AS) if the one that receives the coupling (the slave) also receives a negative delayed self-feedback. In oscillatory neuronal systems, AS is characterized by phase-locking with a negative time delay τ between the spikes of the master and of the slave (the slave fires before the master), while in the usual delayed synchronization (DS) regime τ is positive (the slave fires after the master). A 3-neuron motif in which the slave self-feedback is replaced by a feedback loop mediated by an interneuron can exhibit both AS and DS regimes. Here we show that AS is robust in the presence of noise in a motif of three Hodgkin-Huxley-type neurons. We also show that AS is stable for large values of τ in a chain of connected slaves and interneurons.

  9. Minimal attractors in digraph system models of neuronal networks

    NASA Astrophysics Data System (ADS)

    Just, Winfried; Ahn, Sungwoo; Terman, David

    2008-12-01

    We study a class of discrete dynamical systems models of neuronal networks. In these models, each neuron is represented by a finite number of states and there are rules for how a neuron transitions from one state to another. In particular, the rules determine when a neuron fires and how this affects the state of other neurons. In an earlier paper [D. Terman, S. Ahn, X. Wang, W. Just, Reducing neuronal networks to discrete dynamics, Physica D 237 (2008) 324-338], we demonstrate that a general class of excitatory-inhibitory networks can, in fact, be rigorously reduced to the discrete model. In the present paper, we analyze how the connectivity of the network influences the dynamics of the discrete model. For randomly connected networks, we find two major phase transitions. If the connection probability is above the second but below the first phase transition, then starting in a generic initial state, most but not all cells will fire at all times along the trajectory as soon as they reach the end of their refractory period. Above the first phase transition, this will be true for all cells in a typical initial state; thus most states will belong to a minimal attractor of oscillatory behavior (in a sense that is defined precisely in the paper). The exact positions of the phase transitions depend on intrinsic properties of the cells including the lengths of the cells’ refractory periods and the thresholds for firing. Existence of these phase transitions is both rigorously proved for sufficiently large networks and corroborated by numerical experiments on networks of moderate size.
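
    As a toy version of such a discrete model (the state encoding and update rule below are our own conventions, not necessarily those of the cited reduction), each neuron cycles through a firing state and a refractory period and fires again only when enough of its presynaptic neighbours are currently firing:

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_discrete_network(adj, refractory, threshold, init, steps=50):
    """Toy discrete neuronal network.

    State of neuron i: 0 = firing, 1..refractory[i] = steps since it last fired.
    A neuron that has completed its refractory period fires again when at least
    threshold[i] of its presynaptic neighbours are currently firing.
    """
    state = np.array(init)
    firing_history = []
    for _ in range(steps):
        firing = (state == 0).astype(int)
        firing_history.append(firing.copy())
        inputs = adj.T @ firing                       # presynaptic firing counts
        new_state = np.minimum(state + 1, refractory) # advance through refractory period
        ready = state >= refractory                   # cells past their refractory period
        new_state[ready & (inputs >= threshold)] = 0  # such cells fire if input suffices
        state = new_state
    return np.array(firing_history)

n = 100
p_connect = 0.1                                       # connection probability
adj = (rng.random((n, n)) < p_connect).astype(int)
np.fill_diagonal(adj, 0)
refractory = np.full(n, 2)
threshold = np.full(n, 1)
init = rng.integers(0, 3, n)                          # random initial phases
hist = simulate_discrete_network(adj, refractory, threshold, init)
print("fraction of cells firing per step:", hist.mean(axis=1)[:10])
```

    Sweeping p_connect in such a toy model is one way to get intuition for the kind of connectivity-dependent transitions in sustained firing that the paper analyzes rigorously.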

  10. Network-induced chaos in integrate-and-fire neuronal ensembles

    NASA Astrophysics Data System (ADS)

    Zhou, Douglas; Rangan, Aaditya V.; Sun, Yi; Cai, David

    2009-09-01

    It has been shown that a single standard linear integrate-and-fire (IF) neuron under a general time-dependent stimulus cannot possess chaotic dynamics despite the firing-reset discontinuity. Here we address the issue of whether conductance-based, pulse-coupled network interactions can induce chaos in an IF neuronal ensemble. Using numerical methods, we demonstrate that all-to-all, homogeneously pulse-coupled IF neuronal networks can indeed give rise to chaotic dynamics under an external periodic current drive. We also provide a precise characterization of the largest Lyapunov exponent for these high-dimensional nonsmooth dynamical systems. In addition, we present a stable and accurate numerical algorithm for evaluating the largest Lyapunov exponent, which can overcome difficulties encountered by traditional methods for these nonsmooth dynamical systems with degeneracy induced by, e.g., refractoriness of neurons.
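
    For orientation, a naive two-trajectory (Benettin-style) estimate of the largest Lyapunov exponent for a pulse-coupled leaky IF network is sketched below. This is precisely the kind of traditional estimate that can struggle with the firing-reset discontinuity the authors address, and the random coupling, periodic drive, and parameter values are illustrative assumptions rather than the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(5)

def step_if_network(v, W, I_ext, dt=0.01, v_th=1.0, v_reset=0.0, tau=1.0):
    """One Euler step of a pulse-coupled leaky IF network: spiking neurons
    instantaneously kick their targets through W and are reset."""
    v = v + dt * (-v / tau + I_ext)
    spiked = v >= v_th
    if spiked.any():
        v = v + W @ spiked.astype(float)
        v[spiked] = v_reset
    return v

def naive_largest_lyapunov(n=50, steps=20000, dt=0.01, d0=1e-8, renorm_every=100):
    """Two-trajectory estimate: evolve a reference and a perturbed copy,
    accumulate the log growth of their separation, renormalize periodically."""
    W = 0.2 / n * rng.standard_normal((n, n))
    I_base = 1.5 + 0.5 * rng.random(n)
    v1 = rng.random(n)
    v2 = v1 + d0 * rng.standard_normal(n) / np.sqrt(n)
    log_growth = 0.0
    for k in range(1, steps + 1):
        drive = I_base * (1.0 + 0.2 * np.sin(2 * np.pi * 0.05 * k * dt))  # periodic drive
        v1 = step_if_network(v1, W, drive, dt)
        v2 = step_if_network(v2, W, drive, dt)
        if k % renorm_every == 0:
            d = np.linalg.norm(v2 - v1)
            if d == 0.0:
                d = d0                       # degenerate case: trajectories merged
            log_growth += np.log(d / d0)
            v2 = v1 + (d0 / d) * (v2 - v1)   # renormalize the separation
    return log_growth / (steps * dt)

print("estimated largest Lyapunov exponent:", naive_largest_lyapunov())
```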

  11. How adaptation shapes spike rate oscillations in recurrent neuronal networks

    PubMed Central

    Augustin, Moritz; Ladenbauer, Josef; Obermayer, Klaus

    2012-01-01

    Neural mass signals from in-vivo recordings often show oscillations with frequencies ranging from <1 to 100 Hz. Fast rhythmic activity in the beta and gamma range can be generated by network-based mechanisms such as recurrent synaptic excitation-inhibition loops. Slower oscillations might instead depend on neuronal adaptation currents whose timescales range from tens of milliseconds to seconds. Here we investigate how the dynamics of such adaptation currents contribute to spike rate oscillations and resonance properties in recurrent networks of excitatory and inhibitory neurons. Based on a network of sparsely coupled spiking model neurons with two types of adaptation current and conductance-based synapses with heterogeneous strengths and delays we use a mean-field approach to analyze oscillatory network activity. For constant external input, we find that spike-triggered adaptation currents provide a mechanism to generate slow oscillations over a wide range of adaptation timescales as long as recurrent synaptic excitation is sufficiently strong. Faster rhythms occur when recurrent inhibition is slower than excitation and oscillation frequency increases with the strength of inhibition. Adaptation facilitates such network-based oscillations for fast synaptic inhibition and leads to decreased frequencies. For oscillatory external input, adaptation currents amplify a narrow band of frequencies and cause phase advances for low frequencies in addition to phase delays at higher frequencies. Our results therefore identify the different key roles of neuronal adaptation dynamics for rhythmogenesis and selective signal propagation in recurrent networks. PMID:23450654

  12. Autonomous Optimization of Targeted Stimulation of Neuronal Networks

    PubMed Central

    Kumar, Sreedhar S.; Wülfing, Jan; Okujeni, Samora; Boedecker, Joschka; Riedmiller, Martin

    2016-01-01

    Driven by clinical needs and progress in neurotechnology, targeted interaction with neuronal networks is of increasing importance. Yet, the dynamics of interaction between intrinsic ongoing activity in neuronal networks and their response to stimulation is unknown. Nonetheless, electrical stimulation of the brain is increasingly explored as a therapeutic strategy and as a means to artificially inject information into neural circuits. Strategies using regular or event-triggered fixed stimuli discount the influence of ongoing neuronal activity on the stimulation outcome and are therefore not optimal to induce specific responses reliably. Yet, without suitable mechanistic models, it is hardly possible to optimize such interactions, in particular when desired response features are network-dependent and are initially unknown. In this proof-of-principle study, we present an experimental paradigm using reinforcement-learning (RL) to optimize stimulus settings autonomously and evaluate the learned control strategy using phenomenological models. We asked how to (1) capture the interaction of ongoing network activity, electrical stimulation and evoked responses in a quantifiable ‘state’ to formulate a well-posed control problem, (2) find the optimal state for stimulation, and (3) evaluate the quality of the solution found. Electrical stimulation of generic neuronal networks grown from rat cortical tissue in vitro evoked bursts of action potentials (responses). We show that the dynamic interplay of their magnitudes and the probability to be intercepted by spontaneous events defines a trade-off scenario with a network-specific unique optimal latency maximizing stimulus efficacy. An RL controller was set to find this optimum autonomously. Across networks, stimulation efficacy increased in 90% of the sessions after learning and learned latencies strongly agreed with those predicted from open-loop experiments. Our results show that autonomous techniques can exploit
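
    The closed-loop idea can be caricatured with a much simpler reinforcement-learning scheme than the controller used in the study: an epsilon-greedy bandit that searches over candidate stimulation latencies against a toy response model whose optimum is unknown to the controller. The latency grid, the Gaussian efficacy curve, and the noise level are all made-up stand-ins for the real experiment.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy stand-in for the closed-loop experiment: the "network" responds to a stimulus
# delivered at latency L (ms after the last spontaneous burst) with a noisy efficacy
# that peaks at an optimum unknown to the controller.
latencies = np.arange(100, 1100, 100)          # candidate latencies in ms (hypothetical)
true_optimum = 400.0

def stimulate(latency):
    efficacy = np.exp(-((latency - true_optimum) / 250.0) ** 2)
    return efficacy + 0.1 * rng.standard_normal()

q = np.zeros(len(latencies))                   # running estimate of efficacy per latency
counts = np.zeros(len(latencies))
epsilon = 0.1
for trial in range(2000):
    if rng.random() < epsilon:
        a = rng.integers(len(latencies))       # explore a random latency
    else:
        a = int(np.argmax(q))                  # exploit the current best latency
    r = stimulate(latencies[a])
    counts[a] += 1
    q[a] += (r - q[a]) / counts[a]             # incremental mean update

print("learned optimal latency:", latencies[int(np.argmax(q))], "ms")
```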

  13. Autonomous Optimization of Targeted Stimulation of Neuronal Networks.

    PubMed

    Kumar, Sreedhar S; Wülfing, Jan; Okujeni, Samora; Boedecker, Joschka; Riedmiller, Martin; Egert, Ulrich

    2016-08-01

    Driven by clinical needs and progress in neurotechnology, targeted interaction with neuronal networks is of increasing importance. Yet, the dynamics of interaction between intrinsic ongoing activity in neuronal networks and their response to stimulation is unknown. Nonetheless, electrical stimulation of the brain is increasingly explored as a therapeutic strategy and as a means to artificially inject information into neural circuits. Strategies using regular or event-triggered fixed stimuli discount the influence of ongoing neuronal activity on the stimulation outcome and are therefore not optimal to induce specific responses reliably. Yet, without suitable mechanistic models, it is hardly possible to optimize such interactions, in particular when desired response features are network-dependent and are initially unknown. In this proof-of-principle study, we present an experimental paradigm using reinforcement-learning (RL) to optimize stimulus settings autonomously and evaluate the learned control strategy using phenomenological models. We asked how to (1) capture the interaction of ongoing network activity, electrical stimulation and evoked responses in a quantifiable 'state' to formulate a well-posed control problem, (2) find the optimal state for stimulation, and (3) evaluate the quality of the solution found. Electrical stimulation of generic neuronal networks grown from rat cortical tissue in vitro evoked bursts of action potentials (responses). We show that the dynamic interplay of their magnitudes and the probability to be intercepted by spontaneous events defines a trade-off scenario with a network-specific unique optimal latency maximizing stimulus efficacy. An RL controller was set to find this optimum autonomously. Across networks, stimulation efficacy increased in 90% of the sessions after learning and learned latencies strongly agreed with those predicted from open-loop experiments. Our results show that autonomous techniques can exploit quantitative

  14. Rich club neurons dominate Information Transfer in local cortical networks

    NASA Astrophysics Data System (ADS)

    Nigam, Sunny; Shimono, Masanori; Sporns, Olaf; Beggs, John

    2015-03-01

    The performance of complex networks depends on how they route their traffic. It is unknown how information is transferred in local cortical networks of hundreds of closely spaced neurons. To address this, it is necessary to record simultaneously from hundreds of neurons at a spacing that matches typical axonal connection distances, and at a temporal resolution that matches synaptic delays. We used a 512-electrode array (60 μm spacing) to record spontaneous activity at 20 kHz, simultaneously from up to 700 neurons in slice cultures of mouse somatosensory cortex for 1 hr at a time. We used transfer entropy to quantify directed information transfer (IT) between pairs of neurons. We found an approximately lognormal distribution of firing rates, as reported in vivo. Pairwise information transfer strengths also were nearly lognormally distributed, similar to synaptic strengths. 20% of the neurons accounted for 70% of the total IT coming into, and going out of, the network and were defined as rich nodes. These rich nodes were more densely and strongly connected to each other than expected by chance, forming a rich club. This highly uneven distribution of IT has implications for the efficiency and robustness of local cortical networks, and gives clues to the plastic processes that shape them. JSPS.
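
    Transfer entropy between a pair of binarized spike trains can be estimated with simple plug-in counts; a one-bin-history version is sketched below. The binning choice, the one-step history, and the synthetic example are simplifications of what such an analysis involves, not the processing pipeline used in the study.

```python
import numpy as np

def transfer_entropy(x, y):
    """Transfer entropy (in bits) from binary spike train x to y with one-bin
    history, estimated by plug-in counts.  x, y: 1-D arrays of 0/1."""
    x_past, y_past, y_next = x[:-1], y[:-1], y[1:]
    n = len(y_next)
    te = 0.0
    for yn in (0, 1):
        for yp in (0, 1):
            for xp in (0, 1):
                mask_joint = (y_next == yn) & (y_past == yp) & (x_past == xp)
                p_joint = mask_joint.sum() / n
                if p_joint == 0:
                    continue
                # P(y_next | y_past, x_past) and P(y_next | y_past)
                p_cond_full = mask_joint.sum() / max(((y_past == yp) & (x_past == xp)).sum(), 1)
                p_cond_y = ((y_next == yn) & (y_past == yp)).sum() / max((y_past == yp).sum(), 1)
                te += p_joint * np.log2(p_cond_full / p_cond_y)
    return te

# Example: y copies x with a one-bin delay plus noise, so TE(x -> y) >> TE(y -> x).
rng = np.random.default_rng(7)
x = (rng.random(20000) < 0.1).astype(int)
noise = (rng.random(20000) < 0.02).astype(int)
y = np.roll(x, 1) | noise
print("TE x->y:", round(transfer_entropy(x, y), 4),
      "TE y->x:", round(transfer_entropy(y, x), 4))
```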

  15. Non-Boltzmann Dynamics in Networks of Neurons

    NASA Astrophysics Data System (ADS)

    Crair, Michael Charles

    We present a theory for a network of neurons that communicate via action potentials. Our model balances the need for an accurate, detailed picture of the functioning of neurons with the desire for a simple and tractable description. We view the problem at the mesoscopic level, with an abstract neural state capturing what we assume to be the relevant physical properties of all the ionic and molecular interactions that make up an active cell. We include in our description of the neural state a stochastic component which mimics the intracellular and extracellular commotion in a network of neurons. Because our model is based on a realistic spiking neural network, we can make firm predictions about the behavior of real biological networks of neurons. For instance, we find that attractor dynamics, a general property exhibited by standard models of neural networks, is preserved in our model, but the symmetry that exists in standard models between the 'on' and 'off' neural states is broken in our description by the spike-driven noisy dynamics. These predictions are generally corroborated by the limited experimental evidence available, and we make suggestions for further experiments that would clarify the validity of our description. The spiking properties of neurons also lead us to a model for learning based on modifying the temporal form of neural interactions instead of the usual connection strengths. This suggests that a network of neurons can reinforce associative behavior by changing the time course of the neural interactions expressed in the synaptic potentials instead of changing the size of the synaptic interactions.

  16. FPGA implementation of motifs-based neuronal network and synchronization analysis

    NASA Astrophysics Data System (ADS)

    Deng, Bin; Zhu, Zechen; Yang, Shuangming; Wei, Xile; Wang, Jiang; Yu, Haitao

    2016-06-01

    Motifs in complex networks play a crucial role in determining brain function. In this paper, 13 kinds of motifs are implemented on a Field Programmable Gate Array (FPGA) to investigate the relationships between network properties and motif properties. We use a discretization method and a pipelined architecture to construct various motifs with the Hindmarsh-Rose (HR) neuron as the node model. We also build a small-world network based on these motifs and conduct a synchronization analysis of the motifs as well as of the constructed network. We find that the synchronization properties of a motif determine those of the motif-based small-world network, which demonstrates the effectiveness of our proposed hardware simulation platform. By imitating vital brain nuclei to generate normal discharges, the proposed FPGA-based artificial neuronal networks have the potential to replace injured nuclei and restore brain function in the treatment of Parkinson's disease and epilepsy.

  17. A simple chaotic neuron model: stochastic behavior of neural networks.

    PubMed

    Aydiner, Ekrem; Vural, Adil M; Ozcelik, Bekir; Kiymac, Kerim; Tan, Uner

    2003-05-01

    We have briefly reviewed the occurrence of post-synaptic potentials between neurons, the relationship between EEG and neuron dynamics, as well as methods of signal analysis. We propose a simple stochastic model representing electrical activity of neuronal systems. The model is constructed using the Monte Carlo simulation technique. The results yielded EEG-like signals with their phase portraits in three-dimensional space. The Lyapunov exponent was positive, indicating chaotic behavior. The correlation of the EEG-like signals was 0.92, smaller than the values reported by others. It was concluded that this neuron model may provide valuable clues about the dynamic behavior of neural systems. PMID:12745622

  18. Brain extracellular matrix retains connectivity in neuronal networks

    PubMed Central

    Bikbaev, Arthur; Frischknecht, Renato; Heine, Martin

    2015-01-01

    The formation and maintenance of connectivity are critically important for the processing and storage of information in neuronal networks. The brain extracellular matrix (ECM) appears during postnatal development and surrounds most neurons in the adult mammalian brain. Importantly, the removal of the ECM was shown to improve plasticity and post-traumatic recovery in the CNS, but little is known about the mechanisms. Here, we investigated the role of the ECM in the regulation of the network activity in dissociated hippocampal cultures grown on microelectrode arrays (MEAs). We found that enzymatic removal of the ECM in mature cultures led to transient enhancement of neuronal activity, but prevented disinhibition-induced hyperexcitability that was evident in age-matched control cultures with intact ECM. Furthermore, ECM degradation followed by disinhibition affected the network interaction so strongly that it resembled the juvenile pattern seen in naïve developing cultures. Taken together, our results demonstrate that the ECM plays an important role in the retention of existing connectivity in mature neuronal networks, a role that can be exerted through synaptic confinement of glutamate. On the other hand, removal of the ECM can play a permissive role in modification of connectivity and adaptive exploration of novel network architecture. PMID:26417723

  19. Carbon nanotubes: artificial nanomaterials to engineer single neurons and neuronal networks.

    PubMed

    Fabbro, Alessandra; Bosi, Susanna; Ballerini, Laura; Prato, Maurizio

    2012-08-15

    In the past decade, nanotechnology applications to the nervous system have often involved the study and the use of novel nanomaterials to improve the diagnosis and therapy of neurological diseases. In the field of nanomedicine, carbon nanotubes are evaluated as promising materials for diverse therapeutic and diagnostic applications. In addition, carbon nanotubes are increasingly employed in basic neuroscience approaches, and they have been used in the design of neuronal interfaces or in that of scaffolds promoting neuronal growth in vitro. Ultimately, carbon nanotubes are thought to hold the potential for the development of innovative neurological implants. In this framework, it is particularly relevant to document the impact of interfacing such materials with nerve cells. Carbon nanotubes were shown, when modified with biologically active compounds or functionalized in order to alter their charge, to affect neurite outgrowth and branching. Notably, purified carbon nanotubes used as scaffolds can promote the formation of nanotube-neuron hybrid networks, able per se to affect neuron integrative abilities, network connectivity, and synaptic plasticity. We focus this review on our work over several years investigating the ability of carbon nanotube platforms to provide a new tool for nongenetic manipulations of neuronal performance and network signaling. PMID:22896805

  20. Gap junctions in developing thalamic and neocortical neuronal networks.

    PubMed

    Niculescu, Dragos; Lohmann, Christian

    2014-12-01

    Direct cytoplasmic communication between neurons in the vertebrate brain was demonstrated long ago. These gap junctions have been characterized in many brain areas in terms of subunit composition, biophysical properties, neuronal connectivity patterns, and developmental regulation. Although interesting findings emerged, showing that different subunits are specifically regulated during development, or that excitatory and inhibitory neuronal networks exhibit various electrical connectivity patterns, gap junctions did not receive much further interest. Originally, it was believed that gap junctions represent simple passageways for electrical and biochemical coordination early in development. Today, we know that gap junction connectivity is tightly regulated, following independent developmental patterns for excitatory and inhibitory networks. Electrical connections are important for many specific functions of neurons, and are, for example, required for the development of neuronal stimulus tuning in the visual system. Here, we integrate the available data on neuronal connectivity and gap junction properties, as well as the most recent findings concerning the functional implications of electrical connections in the developing thalamus and neocortex. PMID:23843439

  1. Microglia Control Neuronal Network Excitability via BDNF Signalling

    PubMed Central

    2013-01-01

    Microglia-neuron interactions play a crucial role in several neurological disorders characterized by altered neural network excitability, such as epilepsy and neuropathic pain. While a series of potential messengers have been postulated as substrates of the communication between microglia and neurons, including cytokines, purines, prostaglandins, and nitric oxide, the specific links between messengers, microglia, neuronal networks, and diseases have remained elusive. Brain-derived neurotrophic factor (BDNF) released by microglia emerges as an exception in this riddle. Here, we review the current knowledge on the role played by microglial BDNF in controlling neuronal excitability by causing disinhibition. The efforts made by different laboratories during the last decade have collectively provided a robust mechanistic paradigm which elucidates the mechanisms involved in the synthesis and release of BDNF from microglia, the downstream TrkB-mediated signals in neurons, and the biophysical mechanism by which disinhibition occurs, via the downregulation of the K+-Cl− cotransporter KCC2, disrupting Cl− homeostasis and hence the strength of GABA-A- and glycine-receptor-mediated inhibition. The resulting altered network activity appears to explain several features of the associated pathologies. Targeting the molecular players involved in this canonical signaling pathway may lead to novel therapeutic approaches for ameliorating a wide array of neural dysfunctions. PMID:24089642

  2. Multiscale modeling of brain dynamics: from single neurons and networks to mathematical tools.

    PubMed

    Siettos, Constantinos; Starke, Jens

    2016-09-01

    The extreme complexity of the brain naturally requires mathematical modeling approaches on a large variety of scales; the spectrum ranges from single-neuron dynamics over the behavior of groups of neurons to neuronal network activity. Thus, the connection from the microscopic scale (single-neuron activity) to macroscopic behavior (the emergent behavior of the collective dynamics), and vice versa, is key to understanding the brain in its complexity. In this work, we attempt a review of a wide range of approaches, ranging from the modeling of single-neuron dynamics to machine learning. The models include biophysical as well as data-driven phenomenological models. The discussed models include Hodgkin-Huxley, FitzHugh-Nagumo, coupled oscillators (Kuramoto oscillators, Rössler oscillators, and the Hindmarsh-Rose neuron), integrate-and-fire, networks of neurons, and neural field equations. In addition to the mathematical models, important mathematical methods in multiscale modeling and reconstruction of causal connectivity are sketched. The methods include linear and nonlinear tools from statistics, data analysis, and time series analysis up to differential equations, dynamical systems, and bifurcation theory, including Granger causal connectivity analysis, phase synchronization connectivity analysis, principal component analysis (PCA), independent component analysis (ICA), manifold learning algorithms such as ISOMAP and diffusion maps, and equation-free techniques. WIREs Syst Biol Med 2016, 8:438-458. doi: 10.1002/wsbm.1348 PMID:27340949
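
    As a concrete and deliberately minimal illustration of the kind of single-neuron model surveyed here, the FitzHugh-Nagumo system can be integrated with a few lines of fixed-step code; the parameter values below are common textbook choices and are not taken from the review.

      import numpy as np

      def simulate_fitzhugh_nagumo(I_ext=0.5, a=0.7, b=0.8, tau=12.5,
                                   dt=0.01, steps=50000):
          """Forward-Euler integration of the FitzHugh-Nagumo model.
          v: fast voltage-like variable, w: slow recovery variable."""
          v, w = -1.0, -0.5
          vs = np.empty(steps)
          for i in range(steps):
              dv = v - v**3 / 3.0 - w + I_ext
              dw = (v + a - b * w) / tau
              v += dt * dv
              w += dt * dw
              vs[i] = v
          return vs

      voltage = simulate_fitzhugh_nagumo()   # tonic spiking for this drive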

  3. Autapse-induced multiple coherence resonance in single neurons and neuronal networks

    PubMed Central

    Yilmaz, Ergin; Ozer, Mahmut; Baysal, Veli; Perc, Matjaž

    2016-01-01

    We study the effects of electrical and chemical autapse on the temporal coherence or firing regularity of single stochastic Hodgkin-Huxley neurons and scale-free neuronal networks. Also, we study the effects of chemical autapse on the occurrence of spatial synchronization in scale-free neuronal networks. Irrespective of the type of autapse, we observe autaptic time delay induced multiple coherence resonance for appropriately tuned autaptic conductance levels in single neurons. More precisely, we show that in the presence of an electrical autapse, there is an optimal intensity of channel noise inducing the multiple coherence resonance, whereas in the presence of chemical autapse the occurrence of multiple coherence resonance is less sensitive to the channel noise intensity. At the network level, we find autaptic time delay induced multiple coherence resonance and synchronization transitions, occurring at approximately the same delay lengths. We show that these two phenomena can arise only at a specific range of the coupling strength, and that they can be observed independently of the average degree of the network. PMID:27480120

  4. Autapse-induced multiple coherence resonance in single neurons and neuronal networks.

    PubMed

    Yilmaz, Ergin; Ozer, Mahmut; Baysal, Veli; Perc, Matjaž

    2016-01-01

    We study the effects of electrical and chemical autapse on the temporal coherence or firing regularity of single stochastic Hodgkin-Huxley neurons and scale-free neuronal networks. Also, we study the effects of chemical autapse on the occurrence of spatial synchronization in scale-free neuronal networks. Irrespective of the type of autapse, we observe autaptic time delay induced multiple coherence resonance for appropriately tuned autaptic conductance levels in single neurons. More precisely, we show that in the presence of an electrical autapse, there is an optimal intensity of channel noise inducing the multiple coherence resonance, whereas in the presence of chemical autapse the occurrence of multiple coherence resonance is less sensitive to the channel noise intensity. At the network level, we find autaptic time delay induced multiple coherence resonance and synchronization transitions, occurring at approximately the same delay lengths. We show that these two phenomena can arise only at a specific range of the coupling strength, and that they can be observed independently of the average degree of the network. PMID:27480120

  5. Burst synchronization transitions in a neuronal network of subnetworks

    NASA Astrophysics Data System (ADS)

    Sun, Xiaojuan; Lei, Jinzhi; Perc, Matjaž; Kurths, Jürgen; Chen, Guanrong

    2011-03-01

    In this paper, the transitions of burst synchronization are explored in a neuronal network consisting of subnetworks. The studied network is composed of electrically coupled bursting Hindmarsh-Rose neurons. Numerical results show that two types of burst synchronization transitions can be induced not only by the variations of intra- and intercoupling strengths but also by changing the probability of random links between different subnetworks and the number of subnetworks. Furthermore, we find that the underlying mechanisms for these two bursting synchronization transitions are different: one is due to the change of spike numbers per burst, while the other is caused by the change of the bursting type. Considering that changes in the coupling strengths and neuronal connections are closely interlaced with brain plasticity, the presented results could have important implications for the role of brain plasticity in functional behaviors that are associated with synchronization.

  6. Enhancement of synchronization in inter-intra-connected neuronal networks

    NASA Astrophysics Data System (ADS)

    Moukam Kakmeni, F. M.; Nguemaha, V. M.

    2016-01-01

    We study the enhancement of neural synchrony in a network of electrically coupled Hindmarsh-Rose (HR) neurons. The behavior of the network under control by an external environment modeled by the FitzHugh-Nagumo (FN) system is analyzed. Biologically, such a control system could mimic the modification of normal neuronal dynamics by drugs or other chemical substances. We show that the environment can suppress chaos, enhance synchrony, and favor interesting properties such as sub-threshold membrane oscillations and oscillation death for relatively strong local coupling. Interestingly, we find that the electrical coupling between each two coupled HR-FN units is less important to synchronization than the local coupling between the HR and the FN neurons. In other words, local interactions are found to play a stronger role in synchronization than long-range (global) interactions.

  7. COMMUNICATION: Neuron network activity scales exponentially with synapse density

    NASA Astrophysics Data System (ADS)

    Brewer, G. J.; Boehler, M. D.; Pearson, R. A.; DeMaris, A. A.; Ide, A. N.; Wheeler, B. C.

    2009-02-01

    Neuronal network output in the cortex as a function of synapse density during development has not been explicitly determined. Synaptic scaling in cortical brain networks seems to alter excitatory and inhibitory synaptic inputs to produce a representative rate of synaptic output. Here, we cultured rat hippocampal neurons over a three-week period to correlate synapse density with the increase in spontaneous spiking activity. We followed network development, in terms of synapse formation and spike rate, in two serum-free media optimized for either (a) neuron survival (Neurobasal/B27) or (b) spike rate (NbActiv4). We found that while synaptophysin synapse density increased linearly with development, spike rates increased exponentially in developing neuronal networks. Synaptic receptor components NR1, GluR1 and GABA-A also increased linearly, but with more excitatory receptors than inhibitory ones. These results suggest that the brain's information-processing capability gains more from increasing the connectivity of its processing units than from increasing the number of processing units, much as Internet information flow grows much faster than the linear increase in the number of nodes and connections.

  8. NEURON: Enabling Autonomicity in Wireless Sensor Networks

    PubMed Central

    Zafeiropoulos, Anastasios; Gouvas, Panagiotis; Liakopoulos, Athanassios; Mentzas, Gregoris; Mitrou, Nikolas

    2010-01-01

    Future Wireless Sensor Networks (WSNs) will be ubiquitous, large-scale networks interconnected with the existing IP infrastructure. Autonomic functionalities have to be designed in order to reduce the complexity of their operation and management, and support the dissemination of knowledge within a WSN. In this paper a novel protocol for energy efficient deployment, clustering and routing in WSNs is proposed that focuses on the incorporation of autonomic functionalities in the existing approaches. The design of the protocol facilitates the design of innovative applications and services that are based on overlay topologies created through cooperation among the sensor nodes. PMID:22399931

  9. NEURON: enabling autonomicity in wireless sensor networks.

    PubMed

    Zafeiropoulos, Anastasios; Gouvas, Panagiotis; Liakopoulos, Athanassios; Mentzas, Gregoris; Mitrou, Nikolas

    2010-01-01

    Future Wireless Sensor Networks (WSNs) will be ubiquitous, large-scale networks interconnected with the existing IP infrastructure. Autonomic functionalities have to be designed in order to reduce the complexity of their operation and management, and support the dissemination of knowledge within a WSN. In this paper a novel protocol for energy efficient deployment, clustering and routing in WSNs is proposed that focuses on the incorporation of autonomic functionalities in the existing approaches. The design of the protocol facilitates the design of innovative applications and services that are based on overlay topologies created through cooperation among the sensor nodes. PMID:22399931

  10. Blur identification by multilayer neural network based on multivalued neurons.

    PubMed

    Aizenberg, Igor; Paliy, Dmitriy V; Zurada, Jacek M; Astola, Jaakko T

    2008-05-01

    A multilayer neural network based on multivalued neurons (MLMVN) is a neural network with a traditional feedforward architecture. At the same time, this network has a number of specific distinguishing features. Its backpropagation learning algorithm is derivative-free. The functionality of MLMVN is superior to that of traditional feedforward neural networks and of a variety of kernel-based networks. Its higher flexibility and faster adaptation to the target mapping make it possible to model complex problems using simpler networks. In this paper, the MLMVN is used to identify both the type and the parameters of the point spread function, whose precise identification is of crucial importance for image deblurring. The simulation results show the high efficiency of the proposed approach. It is confirmed that the MLMVN is a powerful tool for solving classification problems, especially multiclass ones. PMID:18467216
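
    The defining ingredient of the MLMVN is the multivalued neuron itself: the weighted sum is complex-valued, and the discrete activation maps its argument onto one of k roots of unity. The snippet below is a hedged, self-contained sketch of that activation idea; the function name and the example values are mine, not the authors' code.

      import numpy as np

      def mvn_output(weights, inputs, k=8):
          """Discrete multivalued neuron: compute the complex weighted sum,
          find which of k angular sectors its argument falls in, and return
          the corresponding k-th root of unity."""
          z = np.dot(weights, inputs)
          sector = int(np.floor(k * (np.angle(z) % (2 * np.pi)) / (2 * np.pi)))
          return np.exp(2j * np.pi * sector / k)

      # Example: three complex-valued inputs and weights, 8-valued output.
      w = np.array([0.5 + 0.2j, -0.3 + 0.7j, 0.1 - 0.4j])
      x = np.array([1.0 + 0.0j, 0.0 + 1.0j, -1.0 + 0.0j])
      print(mvn_output(w, x, k=8))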

  11. Elucidation of The Behavioral Program and Neuronal Network Encoded by Dorsal Raphe Serotonergic Neurons.

    PubMed

    Urban, Daniel J; Zhu, Hu; Marcinkiewcz, Catherine A; Michaelides, Michael; Oshibuchi, Hidehiro; Rhea, Darren; Aryal, Dipendra K; Farrell, Martilias S; Lowery-Gionta, Emily; Olsen, Reid H J; Wetsel, William C; Kash, Thomas L; Hurd, Yasmin L; Tecott, Laurence H; Roth, Bryan L

    2016-04-01

    Elucidating how the brain's serotonergic network mediates diverse behavioral actions over both relatively short (minutes-hours) and long periods of time (days-weeks) remains a major challenge for neuroscience. Our relative ignorance is largely due to the lack of technologies with robustness, reversibility, and spatio-temporal control. Recently, we have demonstrated that our chemogenetic approach (e.g., Designer Receptors Exclusively Activated by Designer Drugs, DREADDs) provides a reliable and robust tool for controlling genetically defined neural populations. Here we show how short- and long-term activation of dorsal raphe nucleus (DRN) serotonergic neurons induces robust behavioral responses. We found that both short- and long-term activation of DRN serotonergic neurons induce antidepressant-like behavioral responses. However, only short-term activation induces anxiogenic-like behaviors. In parallel, these behavioral phenotypes were associated with a metabolic map of whole-brain network activity via a recently developed non-invasive imaging technology, DREAMM (DREADD-Associated Metabolic Mapping). Our findings reveal a previously unappreciated brain network elicited by selective activation of DRN serotonin neurons and illuminate potential therapeutic and adverse effects of drugs targeting DRN neurons. PMID:26383016

  12. Effects of acute spinalization on neurons of postural networks.

    PubMed

    Zelenin, Pavel V; Lyalka, Vladimir F; Hsu, Li-Ju; Orlovsky, Grigori N; Deliagina, Tatiana G

    2016-01-01

    Postural limb reflexes (PLRs) represent a substantial component of postural corrections. Spinalization results in loss of postural functions, including disappearance of PLRs. The aim of the present study was to characterize the effects of acute spinalization on two populations of spinal neurons (F and E) mediating PLRs, which we characterized previously. For this purpose, in decerebrate rabbits spinalized at T12, responses of L5 interneurons to stimulation that caused PLRs before spinalization were recorded. The results were compared to control data obtained in our previous study. We found that spinalization affected the distribution of F- and E-neurons across the spinal grey matter, caused a significant decrease in their activity, and produced disturbances in the processing of posture-related sensory inputs. A two-fold decrease in the proportion of F-neurons in the intermediate grey matter was observed. The locations of the populations of F- and E-neurons exhibiting a significant decrease in activity were determined. A dramatic decrease in the efficacy of sensory input from the ipsilateral limb to F-neurons, and from the contralateral limb to E-neurons, was found. These changes in operation of postural networks underlie the loss of postural control after spinalization, and represent a starting point for the development of spasticity. PMID:27302149

  13. Effects of acute spinalization on neurons of postural networks

    PubMed Central

    Zelenin, Pavel V.; Lyalka, Vladimir F.; Hsu, Li-Ju; Orlovsky, Grigori N.; Deliagina, Tatiana G.

    2016-01-01

    Postural limb reflexes (PLRs) represent a substantial component of postural corrections. Spinalization results in loss of postural functions, including disappearance of PLRs. The aim of the present study was to characterize the effects of acute spinalization on two populations of spinal neurons (F and E) mediating PLRs, which we characterized previously. For this purpose, in decerebrate rabbits spinalized at T12, responses of L5 interneurons to stimulation that caused PLRs before spinalization were recorded. The results were compared to control data obtained in our previous study. We found that spinalization affected the distribution of F- and E-neurons across the spinal grey matter, caused a significant decrease in their activity, and produced disturbances in the processing of posture-related sensory inputs. A two-fold decrease in the proportion of F-neurons in the intermediate grey matter was observed. The locations of the populations of F- and E-neurons exhibiting a significant decrease in activity were determined. A dramatic decrease in the efficacy of sensory input from the ipsilateral limb to F-neurons, and from the contralateral limb to E-neurons, was found. These changes in operation of postural networks underlie the loss of postural control after spinalization, and represent a starting point for the development of spasticity. PMID:27302149

  14. Network architecture underlying maximal separation of neuronal representations

    PubMed Central

    Jortner, Ron A.

    2011-01-01

    One of the most basic and general tasks faced by all nervous systems is extracting relevant information from the organism's surrounding world. While physical signals available to sensory systems are often continuous, variable, overlapping, and noisy, high-level neuronal representations used for decision-making tend to be discrete, specific, invariant, and highly separable. This study addresses the question of how neuronal specificity is generated. Inspired by experimental findings on network architecture in the olfactory system of the locust, I construct a highly simplified theoretical framework which allows for analytic solution of its key properties. For generalized feed-forward systems, I show that an intermediate range of connectivity values between source- and target-populations leads to a combinatorial explosion of wiring possibilities, resulting in input spaces which are, by their very nature, exquisitely sparsely populated. In particular, connection probability ½, as found in the locust antennal-lobe–mushroom-body circuit, serves to maximize separation of neuronal representations across the target Kenyon cells (KCs), and explains their specific and reliable responses. This analysis yields a function expressing response specificity in terms of lower network parameters; together with appropriate gain control this leads to a simple neuronal algorithm for generating arbitrarily sparse and selective codes and linking network architecture and neural coding. I suggest a straightforward way to construct ecologically meaningful representations from this code. PMID:23316159
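
    The intuition for why a connection probability near ½ maximizes the diversity of wiring patterns can be stated in one line of combinatorics; the following is a hedged paraphrase in standard notation, not the paper's own derivation.

      % Each target cell receives a Bernoulli(p) wiring vector from N sources.
      % The entropy of that vector, and hence the effective number of distinct
      % input patterns 2^{H(p)}, is maximized at p = 1/2:
      \[
        H(p) = N\bigl[-p\log_2 p - (1-p)\log_2(1-p)\bigr],
        \qquad
        \arg\max_p H(p) = \tfrac{1}{2},
        \qquad
        H\!\left(\tfrac{1}{2}\right) = N \ \text{bits}.
      \]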

  15. Multitasking attractor networks with neuronal threshold noise.

    PubMed

    Agliari, Elena; Barra, Adriano; Galluzzi, Andrea; Isopi, Marco

    2014-01-01

    We consider the multitasking associative network in the low-storage limit and we study its phase diagram with respect to the noise level T and the degree d of dilution in pattern entries. We find that the system is characterized by a rich variety of stable states, including pure states, parallel retrieval states, hierarchically organized states and symmetric mixtures (remarkably, both even and odd), whose complexity increases as the number of patterns P grows. The analysis is performed both analytically and numerically: Exploiting techniques based on partial differential equations, we are able to get the self-consistencies for the order parameters. Such self-consistency equations are then solved and the solutions are further checked through stability theory to catalog their organizations into the phase diagram, which is outlined at the end. This is a further step towards the understanding of spontaneous parallel processing in associative networks. PMID:24121044

  16. Complexity in neuronal noise depends on network interconnectivity.

    PubMed

    Serletis, Demitre; Zalay, Osbert C; Valiante, Taufik A; Bardakjian, Berj L; Carlen, Peter L

    2011-06-01

    "Noise," or noise-like activity (NLA), defines background electrical membrane potential fluctuations at the cellular level of the nervous system, comprising an important aspect of brain dynamics. Using whole-cell voltage recordings from fast-spiking stratum oriens interneurons and stratum pyramidale neurons located in the CA3 region of the intact mouse hippocampus, we applied complexity measures from dynamical systems theory (i.e., 1/f(γ) noise and correlation dimension) and found evidence for complexity in neuronal NLA, ranging from high- to low-complexity dynamics. Importantly, these high- and low-complexity signal features were largely dependent on gap junction and chemical synaptic transmission. Progressive neuronal isolation from the surrounding local network via gap junction blockade (abolishing gap junction-dependent spikelets) and then chemical synaptic blockade (abolishing excitatory and inhibitory post-synaptic potentials), or the reverse order of these treatments, resulted in emergence of high-complexity NLA dynamics. Restoring local network interconnectivity via blockade washout resulted in resolution to low-complexity behavior. These results suggest that the observed increase in background NLA complexity is the result of reduced network interconnectivity, thereby highlighting the potential importance of the NLA signal to the study of network state transitions arising in normal and abnormal brain dynamics (such as in epilepsy, for example). PMID:21347547

  17. The Geometry of Spontaneous Spiking in Neuronal Networks

    NASA Astrophysics Data System (ADS)

    Medvedev, Georgi S.; Zhuravytska, Svitlana

    2012-10-01

    The mathematical theory of pattern formation in electrically coupled networks of excitable neurons forced by small noise is presented in this work. Using the Freidlin-Wentzell large-deviation theory for randomly perturbed dynamical systems and elements of algebraic graph theory, we identify and analyze the main regimes in the network dynamics in terms of the key control parameters: excitability, coupling strength, and network topology. The analysis reveals the geometry of spontaneous dynamics in electrically coupled networks. Specifically, we show that the location of the minima of a certain continuous function on the surface of the unit n-cube encodes the most likely activity patterns generated by the network. By studying how the minima of this function evolve under the variation of the coupling strength, we describe the principal transformations in the network dynamics. The minimization problem is also used for the quantitative description of the main dynamical regimes and transitions between them. In particular, for the weak and strong coupling regimes, we present asymptotic formulas for the network activity rate as a function of the coupling strength and the degree of the network. The variational analysis is complemented by the stability analysis of the synchronous state in the strong coupling regime. The stability estimates reveal the contribution of the network connectivity and the properties of the cycle subspace associated with the graph of the network to its synchronization properties. This work is motivated by the experimental and modeling studies of the ensemble of neurons in the Locus Coeruleus, a nucleus in the brainstem involved in the regulation of cognitive performance and behavior.

  18. Oscillations in the bistable regime of neuronal networks

    NASA Astrophysics Data System (ADS)

    Roxin, Alex; Compte, Albert

    2016-07-01

    Bistability between attracting fixed points in neuronal networks has been hypothesized to underlie persistent activity observed in several cortical areas during working memory tasks. In network models this kind of bistability arises due to strong recurrent excitation, sufficient to generate a state of high activity created in a saddle-node (SN) bifurcation. On the other hand, canonical network models of excitatory and inhibitory neurons (E-I networks) robustly produce oscillatory states via a Hopf (H) bifurcation due to the E-I loop. This mechanism for generating oscillations has been invoked to explain the emergence of brain rhythms in the β to γ bands. Although both bistability and oscillatory activity have been intensively studied in network models, there has not been much focus on the coincidence of the two. Here we show that when oscillations emerge in E-I networks in the bistable regime, their phenomenology can be explained to a large extent by considering coincident SN and H bifurcations, known as a codimension two Takens-Bogdanov bifurcation. In particular, we find that such oscillations are not composed of a stable limit cycle, but rather are due to noise-driven oscillatory fluctuations. Furthermore, oscillations in the bistable regime can, in principle, have arbitrarily low frequency.

  19. Oscillations in the bistable regime of neuronal networks.

    PubMed

    Roxin, Alex; Compte, Albert

    2016-07-01

    Bistability between attracting fixed points in neuronal networks has been hypothesized to underlie persistent activity observed in several cortical areas during working memory tasks. In network models this kind of bistability arises due to strong recurrent excitation, sufficient to generate a state of high activity created in a saddle-node (SN) bifurcation. On the other hand, canonical network models of excitatory and inhibitory neurons (E-I networks) robustly produce oscillatory states via a Hopf (H) bifurcation due to the E-I loop. This mechanism for generating oscillations has been invoked to explain the emergence of brain rhythms in the β to γ bands. Although both bistability and oscillatory activity have been intensively studied in network models, there has not been much focus on the coincidence of the two. Here we show that when oscillations emerge in E-I networks in the bistable regime, their phenomenology can be explained to a large extent by considering coincident SN and H bifurcations, known as a codimension two Takens-Bogdanov bifurcation. In particular, we find that such oscillations are not composed of a stable limit cycle, but rather are due to noise-driven oscillatory fluctuations. Furthermore, oscillations in the bistable regime can, in principle, have arbitrarily low frequency. PMID:27575167
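
    The Takens-Bogdanov analysis itself does not fit in a snippet, but the class of model in which both the saddle-node and Hopf bifurcations arise can be sketched with a generic stochastic Wilson-Cowan-type E-I rate model integrated by Euler-Maruyama. The parameters and the sigmoidal rate function below are illustrative assumptions, not the equations used in the paper.

      import numpy as np

      def simulate_ei_rates(w_ee=12.0, w_ei=10.0, w_ie=10.0, w_ii=2.0,
                            i_e=1.0, i_i=0.0, tau_e=5.0, tau_i=10.0,
                            noise=0.05, dt=0.1, steps=20000, seed=0):
          """Noisy E-I rate model: r_e and r_i relax toward a sigmoid of their
          recurrent input; noise can sustain oscillatory fluctuations."""
          rng = np.random.default_rng(seed)
          f = lambda x: 1.0 / (1.0 + np.exp(-x))
          r_e, r_i = 0.1, 0.1
          out = np.empty((steps, 2))
          for t in range(steps):
              drive_e = w_ee * r_e - w_ei * r_i + i_e
              drive_i = w_ie * r_e - w_ii * r_i + i_i
              r_e += dt / tau_e * (f(drive_e) - r_e) + noise * np.sqrt(dt) * rng.standard_normal()
              r_i += dt / tau_i * (f(drive_i) - r_i) + noise * np.sqrt(dt) * rng.standard_normal()
              out[t] = (r_e, r_i)
          return out

      rates = simulate_ei_rates()   # inspect rates[:, 0] for noise-driven oscillations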

  20. GABA-A receptor antagonists increase firing, bursting and synchrony of spontaneous activity in neuronal networks grown on microelectrode arrays: a step towards chemical "fingerprinting"

    EPA Science Inventory

    Assessment of effects on spontaneous network activity in neurons grown on MEAs is a proposed method to screen chemicals for potential neurotoxicity. In addition, differential effects on network activity (chemical "fingerprints") could be used to classify chemical modes of action....

  1. Continuous network of endoplasmic reticulum in cerebellar Purkinje neurons.

    PubMed Central

    Terasaki, M; Slater, N T; Fein, A; Schmidek, A; Reese, T S

    1994-01-01

    Purkinje neurons in rat cerebellar slices injected with an oil drop saturated with 1,1'-dihexadecyl-3,3,3',3'-tetramethylindocarbocyanine perchlorate [DiIC16(3) or DiI] to label the endoplasmic reticulum were observed by confocal microscopy. DiI spread throughout the cell body and dendrites and into the axon. DiI spreading is due to diffusion in a continuous bilayer and is not due to membrane trafficking because it also spreads in fixed neurons. DiI stained such features of the endoplasmic reticulum as densities at branch points, reticular networks in the cell body and dendrites, the nuclear envelope, spines, and aggregates formed during anoxia in low extracellular Ca2+. In cultured rat hippocampal neurons, where optical conditions provide more detail, DiI labeled a clearly delineated network of endoplasmic reticulum in the cell body. We conclude that there is a continuous compartment of endoplasmic reticulum extending from the cell body throughout the dendrites. This compartment may coordinate and integrate neuronal functions. PMID:7519781

  2. Thermodynamics and signatures of criticality in a network of neurons.

    PubMed

    Tkačik, Gašper; Mora, Thierry; Marre, Olivier; Amodei, Dario; Palmer, Stephanie E; Berry, Michael J; Bialek, William

    2015-09-15

    The activity of a neural network is defined by patterns of spiking and silence from the individual neurons. Because spikes are (relatively) sparse, patterns of activity with increasing numbers of spikes are less probable, but, with more spikes, the number of possible patterns increases. This tradeoff between probability and numerosity is mathematically equivalent to the relationship between entropy and energy in statistical physics. We construct this relationship for populations of up to N = 160 neurons in a small patch of the vertebrate retina, using a combination of direct and model-based analyses of experiments on the response of this network to naturalistic movies. We see signs of a thermodynamic limit, where the entropy per neuron approaches a smooth function of the energy per neuron as N increases. The form of this function corresponds to the distribution of activity being poised near an unusual kind of critical point. We suggest further tests of criticality, and give a brief discussion of its functional significance. PMID:26330611
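
    The entropy-energy correspondence used in this analysis can be written compactly; the lines below are a hedged restatement of the standard construction in my own notation, not the paper's equations.

      % Assign each activity pattern sigma an "energy" from its probability,
      % count patterns per unit energy to get a microcanonical entropy, and
      % look for a smooth entropy-per-neuron function in the large-N limit.
      \[
        E(\sigma) = -\ln P(\sigma),
        \qquad
        S(E) = \ln \mathcal{N}(E),
        \qquad
        \frac{S(E)}{N} \;\xrightarrow[\,N\to\infty\,]{}\; s\!\left(\frac{E}{N}\right).
      \]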

  3. Midline thalamic neurons are differentially engaged during hippocampus network oscillations.

    PubMed

    Lara-Vásquez, Ariel; Espinosa, Nelson; Durán, Ernesto; Stockle, Marcelo; Fuentealba, Pablo

    2016-01-01

    The midline thalamus is reciprocally connected with the medial temporal lobe, where neural circuitry essential for spatial navigation and memory formation resides. Yet, little information is available on the dynamic relationship between activity patterns in the midline thalamus and medial temporal lobe. Here, we report on the functional heterogeneity of anatomically-identified thalamic neurons and the differential modulation of their activity with respect to dorsal hippocampal rhythms in the anesthetized mouse. Midline thalamic neurons expressing the calcium-binding protein calretinin, irrespective of their selective co-expression of calbindin, discharged at overall low levels, did not increase their activity during hippocampal theta oscillations, and their firing rates were inhibited during hippocampal sharp wave-ripples. Conversely, thalamic neurons lacking calretinin discharged at higher rates, increased their activity during hippocampal theta waves, but remained unaffected during sharp wave-ripples. Our results indicate that the midline thalamic system comprises at least two different classes of thalamic projection neuron, which can be partly defined by their differential engagement by hippocampal pathways during specific network oscillations that accompany distinct behavioral contexts. Thus, different midline thalamic neuronal populations might be selectively recruited to support distinct stages of memory processing, consistent with the thalamus being pivotal in the dialogue of cortical circuits. PMID:27411890

  4. Midline thalamic neurons are differentially engaged during hippocampus network oscillations

    PubMed Central

    Lara-Vásquez, Ariel; Espinosa, Nelson; Durán, Ernesto; Stockle, Marcelo; Fuentealba, Pablo

    2016-01-01

    The midline thalamus is reciprocally connected with the medial temporal lobe, where neural circuitry essential for spatial navigation and memory formation resides. Yet, little information is available on the dynamic relationship between activity patterns in the midline thalamus and medial temporal lobe. Here, we report on the functional heterogeneity of anatomically-identified thalamic neurons and the differential modulation of their activity with respect to dorsal hippocampal rhythms in the anesthetized mouse. Midline thalamic neurons expressing the calcium-binding protein calretinin, irrespective of their selective co-expression of calbindin, discharged at overall low levels, did not increase their activity during hippocampal theta oscillations, and their firing rates were inhibited during hippocampal sharp wave-ripples. Conversely, thalamic neurons lacking calretinin discharged at higher rates, increased their activity during hippocampal theta waves, but remained unaffected during sharp wave-ripples. Our results indicate that the midline thalamic system comprises at least two different classes of thalamic projection neuron, which can be partly defined by their differential engagement by hippocampal pathways during specific network oscillations that accompany distinct behavioral contexts. Thus, different midline thalamic neuronal populations might be selectively recruited to support distinct stages of memory processing, consistent with the thalamus being pivotal in the dialogue of cortical circuits. PMID:27411890

  5. Graph-based unsupervised segmentation algorithm for cultured neuronal networks' structure characterization and modeling.

    PubMed

    de Santos-Sierra, Daniel; Sendiña-Nadal, Irene; Leyva, Inmaculada; Almendral, Juan A; Ayali, Amir; Anava, Sarit; Sánchez-Ávila, Carmen; Boccaletti, Stefano

    2015-06-01

    Large-scale phase-contrast images taken at high resolution through the life of a cultured neuronal network are analyzed by a graph-based unsupervised segmentation algorithm with a very low computational cost, scaling linearly with the image size. The processing automatically retrieves the whole network structure, an object whose mathematical representation is a matrix in which nodes are identified neurons or clusters of neurons, and links are the reconstructed connections between them. The algorithm is also able to extract any other relevant morphological information characterizing neurons and neurites. More importantly, and at variance with other segmentation methods that require fluorescence imaging from immunocytochemistry techniques, our non-invasive measurements enable us to perform a longitudinal analysis during the maturation of a single culture. Such an analysis provides a way of identifying the main physical processes underlying the self-organization of the neuronal ensemble into a complex network, and drives the formulation of a phenomenological model that is nevertheless able to describe qualitatively the overall scenario observed during culture growth. PMID:25393432

  6. Translating network models to parallel hardware in NEURON

    PubMed Central

    Hines, M.L.; Carnevale, N.T.

    2008-01-01

    The increasing complexity of network models poses a growing computational burden. At the same time, computational neuroscientists are finding it easier to access parallel hardware, such as multiprocessor personal computers, workstation clusters, and massively parallel supercomputers. The practical question is how to move a working network model from a single processor to parallel hardware. Here we show how to make this transition for models implemented with NEURON, in such a way that the final result will run and produce numerically identical results on either serial or parallel hardware. This allows users to develop and debug models on readily available local resources, then run their code without modification on a parallel supercomputer. PMID:17997162
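
    The idiom the paper describes centers on NEURON's ParallelContext: gids are distributed round-robin across hosts, each gid is bound to a spike source, and connections are made by gid so that the same script runs serially or under MPI. The sketch below follows that idiom with a toy artificial cell; the cell choice, weights, and run time are illustrative assumptions rather than the paper's example model.

      # Run serially as `python net.py` or in parallel as `mpiexec -n 4 python net.py`.
      from neuron import h
      h.load_file("stdrun.hoc")

      pc = h.ParallelContext()
      rank, nhost = int(pc.id()), int(pc.nhost())

      ncell = 100
      cells = {}
      for gid in range(rank, ncell, nhost):      # round-robin ownership of gids
          pc.set_gid2node(gid, rank)             # this host owns this gid
          cell = h.IntFire1()                    # simple artificial spiking cell
          cells[gid] = cell
          pc.cell(gid, h.NetCon(cell, None))     # register the gid's spike source

      # Connect gid 0 -> gid 1 by gid, regardless of which hosts own them.
      if 1 in cells:
          nc = pc.gid_connect(0, cells[1])       # source may live on another host
          nc.weight[0] = 0.1
          nc.delay = 1.0

      pc.set_maxstep(10)                         # maximum spike-exchange interval (ms)
      h.stdinit()
      pc.psolve(100.0)                           # integrate all hosts to t = 100 ms
      pc.barrier()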

  7. Slow fluctuations in recurrent networks of spiking neurons

    NASA Astrophysics Data System (ADS)

    Wieland, Stefan; Bernardi, Davide; Schwalger, Tilo; Lindner, Benjamin

    2015-10-01

    Networks of fast nonlinear elements may display slow fluctuations if interactions are strong. We find a transition in the long-term variability of a sparse recurrent network of perfect integrate-and-fire neurons at which the Fano factor switches from zero to infinity and the correlation time is minimized. This corresponds to a bifurcation in a linear map arising from the self-consistency of temporal input and output statistics. More realistic neural dynamics with a leak current and refractory period lead to smoothed transitions and modified critical couplings that can be theoretically predicted.

  8. Slow fluctuations in recurrent networks of spiking neurons.

    PubMed

    Wieland, Stefan; Bernardi, Davide; Schwalger, Tilo; Lindner, Benjamin

    2015-10-01

    Networks of fast nonlinear elements may display slow fluctuations if interactions are strong. We find a transition in the long-term variability of a sparse recurrent network of perfect integrate-and-fire neurons at which the Fano factor switches from zero to infinity and the correlation time is minimized. This corresponds to a bifurcation in a linear map arising from the self-consistency of temporal input and output statistics. More realistic neural dynamics with a leak current and refractory period lead to smoothed transitions and modified critical couplings that can be theoretically predicted. PMID:26565154
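
    For readers wanting to reproduce the basic quantities, a single perfect (non-leaky) integrate-and-fire unit driven by white noise and the Fano factor of its spike counts can be computed as below; this is a hedged single-neuron illustration, not the recurrent-network calculation of the paper.

      import numpy as np

      def perfect_if_spike_counts(mu=0.8, sigma=1.0, threshold=1.0,
                                  dt=0.001, window=1.0, n_windows=500, seed=1):
          """Perfect integrate-and-fire neuron (no leak, no refractory period)
          driven by Gaussian white noise; returns spike counts per time window."""
          rng = np.random.default_rng(seed)
          steps = int(window / dt)
          v = 0.0
          counts = np.zeros(n_windows, dtype=int)
          for w in range(n_windows):
              for _ in range(steps):
                  v += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
                  if v >= threshold:
                      v = 0.0
                      counts[w] += 1
          return counts

      counts = perfect_if_spike_counts()
      print(counts.var() / counts.mean())        # Fano factor of the spike count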

  9. Emergence and robustness of target waves in a neuronal network

    NASA Astrophysics Data System (ADS)

    Xu, Ying; Jin, Wuyin; Ma, Jun

    2015-08-01

    Target waves in excitable media such as neuronal networks can regulate the spatial distribution and orderliness of activity as a continuous pacemaker. Three different schemes are used to develop stable target waves in the network, and the potential mechanism for the emergence of target waves in the excitable media is investigated. For example, local pacing driven by external periodic forcing can generate a stable target wave in the excitable media; heterogeneity and local feedback under self-feedback coupling are also effective in generating continuous target waves. To discern the differences between these target waves, a statistical synchronization factor is defined using mean-field theory, and artificial defects are introduced into the network to block the target wave, so that the robustness of these target waves can be assessed. However, the target waves developed from the above-mentioned schemes show different robustness to blocking by artificial defects. A regular network of Hindmarsh-Rose neurons is designed in a two-dimensional square array, target waves are induced in three different ways, and then artificial defects, which are associated with anatomical defects, are set in the network to detect the effect of defect blocking on the travelling waves. It is confirmed that the robustness of target waves to defect blocking depends on their intrinsic properties (the way the target wave is generated).

  10. Synchrony in stochastically driven neuronal networks with complex topologies

    NASA Astrophysics Data System (ADS)

    Newhall, Katherine A.; Shkarayev, Maxim S.; Kramer, Peter R.; Kovačič, Gregor; Cai, David

    2015-05-01

    We study the synchronization of a stochastically driven, current-based, integrate-and-fire neuronal model on a preferential-attachment network with scale-free characteristics and high clustering. The synchrony is induced by cascading total firing events where every neuron in the network fires at the same instant of time. We show that in the regime where the system remains in this highly synchronous state, the firing rate of the network is completely independent of the synaptic coupling, and depends solely on the external drive. On the other hand, the ability for the network to maintain synchrony depends on a balance between the fluctuations of the external input and the synaptic coupling strength. In order to accurately predict the probability of repeated cascading total firing events, we go beyond mean-field and treelike approximations and conduct a detailed second-order calculation taking into account local clustering. Our explicit analytical results are shown to give excellent agreement with direct numerical simulations for the particular preferential-attachment network model investigated.

  11. Modularity Induced Gating and Delays in Neuronal Networks.

    PubMed

    Shein-Idelson, Mark; Cohen, Gilad; Ben-Jacob, Eshel; Hanein, Yael

    2016-04-01

    Neural networks, despite their highly interconnected nature, exhibit distinctly localized and gated activation. Modularity, a distinctive feature of neural networks, has been recently proposed as an important parameter determining the manner by which networks support activity propagation. Here we use an engineered biological model, consisting of engineered rat cortical neurons, to study the role of modular topology in gating the activity between cell populations. We show that pairs of connected modules support conditional propagation (transmitting stronger bursts with higher probability), long delays and propagation asymmetry. Moreover, large modular networks manifest diverse patterns of both local and global activation. Blocking inhibition decreased activity diversity and replaced it with highly consistent transmission patterns. By independently controlling modularity and disinhibition, experimentally and in a model, we pose that modular topology is an important parameter affecting activation localization and is instrumental for population-level gating by disinhibition. PMID:27104350

  12. Modularity Induced Gating and Delays in Neuronal Networks

    PubMed Central

    Shein-Idelson, Mark; Cohen, Gilad; Hanein, Yael

    2016-01-01

    Neural networks, despite their highly interconnected nature, exhibit distinctly localized and gated activation. Modularity, a distinctive feature of neural networks, has been recently proposed as an important parameter determining the manner by which networks support activity propagation. Here we use an engineered biological model, consisting of engineered rat cortical neurons, to study the role of modular topology in gating the activity between cell populations. We show that pairs of connected modules support conditional propagation (transmitting stronger bursts with higher probability), long delays and propagation asymmetry. Moreover, large modular networks manifest diverse patterns of both local and global activation. Blocking inhibition decreased activity diversity and replaced it with highly consistent transmission patterns. By independently controlling modularity and disinhibition, experimentally and in a model, we pose that modular topology is an important parameter affecting activation localization and is instrumental for population-level gating by disinhibition. PMID:27104350

  13. Novel Method for Neuronal Nanosurgical Connection

    PubMed Central

    Katchinskiy, Nir; Goez, Helly R.; Dutta, Indrani; Godbout, Roseline; Elezzabi, Abdulhakem Y.

    2016-01-01

    Neuronal injury may cause irreversible damage to cellular, organ and organism function. While preventing neural injury is ideal, it is not always possible. There are multiple etiologies for neuronal injury including trauma, infection, inflammation, immune-mediated disorders, toxins and hereditary conditions. We describe a novel laser application, utilizing femtosecond laser pulses, to connect a neuronal axon to a neuronal soma. We were able to maintain cellular viability and demonstrate that this technique is universal, as it is applicable to multiple cell types and media. PMID:26846892

  14. Network feedback regulates motor output across a range of modulatory neuron activity.

    PubMed

    Spencer, Robert M; Blitz, Dawn M

    2016-06-01

    Modulatory projection neurons alter network neuron synaptic and intrinsic properties to elicit multiple different outputs. Sensory and other inputs elicit a range of modulatory neuron activity that is further shaped by network feedback, yet little is known regarding how the impact of network feedback on modulatory neurons regulates network output across a physiological range of modulatory neuron activity. Identified network neurons, a fully described connectome, and a well-characterized, identified modulatory projection neuron enabled us to address this issue in the crab (Cancer borealis) stomatogastric nervous system. The modulatory neuron modulatory commissural neuron 1 (MCN1) activates and modulates two networks that generate rhythms via different cellular mechanisms and at distinct frequencies. MCN1 is activated at rates of 5-35 Hz in vivo and in vitro. Additionally, network feedback elicits MCN1 activity time-locked to motor activity. We asked how network activation, rhythm speed, and neuron activity levels are regulated by the presence or absence of network feedback across a physiological range of MCN1 activity rates. There were both similarities and differences in responses of the two networks to MCN1 activity. Many parameters in both networks were sensitive to network feedback effects on MCN1 activity. However, for most parameters, MCN1 activity rate did not determine the extent to which network output was altered by the addition of network feedback. These data demonstrate that the influence of network feedback on modulatory neuron activity is an important determinant of network output and feedback can be effective in shaping network output regardless of the extent of network modulation. PMID:27030739

  15. Convergent neuromodulation onto a network neuron can have divergent effects at the network level.

    PubMed

    Kintos, Nickolas; Nusbaum, Michael P; Nadim, Farzan

    2016-04-01

    Different neuromodulators often target the same ion channel. When such modulators act on different neuron types, this convergent action can enable a rhythmic network to produce distinct outputs. Less clear are the functional consequences when two neuromodulators influence the same ion channel in the same neuron. We examine the consequences of this seeming redundancy using a mathematical model of the crab gastric mill (chewing) network. This network is activated in vitro by the projection neuron MCN1, which elicits a half-center bursting oscillation between the reciprocally-inhibitory neurons LG and Int1. We focus on two neuropeptides which modulate this network, including an MCN1 neurotransmitter and the hormone crustacean cardioactive peptide (CCAP). Both activate the same voltage-gated current (I_MI) in the LG neuron. However, I_MI-MCN1, resulting from MCN1-released neuropeptide, has phasic dynamics in its maximal conductance due to LG presynaptic inhibition of MCN1, while I_MI-CCAP retains the same maximal conductance in both phases of the gastric mill rhythm. Separation of time scales allows us to produce a 2D model from which phase-plane analysis shows that, as in the biological system, I_MI-MCN1 and I_MI-CCAP primarily influence the durations of opposing phases of this rhythm. Furthermore, I_MI-MCN1 influences the rhythmic output in a manner similar to the Int1-to-LG synapse, whereas I_MI-CCAP has an influence similar to the LG-to-Int1 synapse. These results show that distinct neuromodulators which target the same voltage-gated ion channel in the same network neuron can nevertheless produce distinct effects at the network level, providing divergent neuromodulator actions on network activity. PMID:26798029

  16. Realistic modeling of neurons and networks: towards brain simulation

    PubMed Central

    D’Angelo, Egidio; Solinas, Sergio; Garrido, Jesus; Casellato, Claudia; Pedrocchi, Alessandra; Mapelli, Jonathan; Gandolfi, Daniela; Prestori, Francesca

    Realistic modeling is a new advanced methodology for investigating brain functions. Realistic modeling is based on a detailed biophysical description of neurons and synapses, which can be integrated into microcircuits. The latter can, in turn, be further integrated to form large-scale brain networks and eventually to reconstruct complex brain systems. Here we provide a review of the realistic simulation strategy and use the cerebellar network as an example. This network has been carefully investigated at molecular and cellular level and has been the object of intense theoretical investigation. The cerebellum is thought to lie at the core of the forward controller operations of the brain and to implement timing and sensory prediction functions. The cerebellum is well described and provides a challenging field in which one of the most advanced realistic microcircuit models has been generated. We illustrate how these models can be elaborated and embedded into robotic control systems to gain insight into how the cellular properties of cerebellar neurons emerge in integrated behaviors. Realistic network modeling opens up new perspectives for the investigation of brain pathologies and for the neurorobotic field. PMID:24139652

  17. Pharmacodynamics of potassium channel openers in cultured neuronal networks.

    PubMed

    Wu, Calvin; V Gopal, Kamakshi; Lukas, Thomas J; Gross, Guenter W; Moore, Ernest J

    2014-06-01

    A novel class of drugs - potassium (K(+)) channel openers or activators - has recently been shown to cause anticonvulsive and neuroprotective effects by activating hyperpolarizing K(+) currents, and may therefore show efficacy for treating tinnitus. This study presents measurements of the modulatory effects of four K(+) channel openers on the spontaneous activity and action potential waveforms of neuronal networks. The networks were derived from mouse embryonic auditory cortices and grown on microelectrode arrays. Pentylenetetrazol was used to create hyperactivity states in the neuronal networks as a first approximation for mimicking tinnitus or tinnitus-like activity. We then compared the pharmacodynamics of the four channel activators: retigabine and flupirtine (voltage-gated K(+) channel KV7 activators), and NS1619 and isopimaric acid ("big potassium" BK channel activators). The EC50 values of retigabine, flupirtine, NS1619, and isopimaric acid were 8.0, 4.0, 5.8, and 7.8 µM, respectively. The reduction of hyperactivity compared to the reference activity was significant. The present results highlight the notion of re-purposing the K(+) channel activators for reducing hyperactivity of spontaneously active auditory networks, serving as a platform for these drugs to show efficacy toward target identification, prevention, and treatment of tinnitus. PMID:24681057
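
    For reference, the EC50 values quoted above are the half-maximal concentrations of the usual Hill-type concentration-response relation; this is the standard pharmacological expression, not a formula taken from the paper.

      % E(c): effect at drug concentration c; n_H: Hill coefficient.
      % At c = EC50 the effect is halfway between E_min and E_max.
      \[
        E(c) = E_{\min} + \frac{E_{\max} - E_{\min}}
               {1 + \left(\mathrm{EC_{50}} / c\right)^{n_H}}.
      \]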

  18. A Neuronal Network Model for Pitch Selectivity and Representation

    PubMed Central

    Huang, Chengcheng; Rinzel, John

    2016-01-01

    Pitch is a perceptual correlate of periodicity. Sounds with distinct spectra can elicit the same pitch. Despite the importance of pitch perception, understanding the cellular mechanism of pitch perception is still a major challenge and a mechanistic model of pitch is lacking. A multi-stage neuronal network model is developed for pitch frequency estimation using biophysically-based, high-resolution coincidence detector neurons. The neuronal units respond only to highly coincident input among convergent auditory nerve fibers across frequency channels. Their selectivity for only very fast rising slopes of convergent input enables these slope-detectors to distinguish the most prominent coincidences in multi-peaked input time courses. Pitch can then be estimated from the first-order interspike intervals of the slope-detectors. The regular firing patterns of the slope-detector neurons are similar for sounds sharing the same pitch despite their distinct timbres. The decoded pitch strengths also correlate well with the salience of pitch perception as reported by human listeners. Therefore, our model can serve as a neural representation for pitch. Our model performs successfully in estimating the pitch of missing fundamental complexes and reproducing the pitch variation with respect to the frequency shift of inharmonic complexes. It also accounts for the phase sensitivity of pitch perception in the cases of Schroeder phase, alternating phase and random phase relationships. Moreover, our model can also be applied to stochastic sound stimuli, such as iterated ripple noise, and accounts for their multiple pitch perceptions. PMID:27378900
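
    The readout stage described here estimates pitch from the first-order interspike intervals of the slope-detector population; a toy version of that readout (my own simplification, applied to a made-up spike train) is sketched below.

      import numpy as np

      def pitch_from_spike_times(spike_times, bins=200):
          """Estimate pitch (Hz) as the reciprocal of the most common
          first-order interspike interval (coarse histogram mode)."""
          isis = np.diff(np.sort(np.asarray(spike_times)))
          isis = isis[isis > 0]
          hist, edges = np.histogram(isis, bins=bins)
          i = int(np.argmax(hist))
          mode_isi = 0.5 * (edges[i] + edges[i + 1])
          return 1.0 / mode_isi

      # Spikes locked to a 5 ms period (200 Hz pitch) with small timing jitter.
      rng = np.random.default_rng(0)
      spikes = np.arange(0.0, 1.0, 0.005) + 0.0002 * rng.standard_normal(200)
      print(pitch_from_spike_times(spikes))      # close to 200.0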

  19. Collapse of ordered spatial pattern in neuronal network

    NASA Astrophysics Data System (ADS)

    Song, Xinlin; Wang, Chunni; Ma, Jun; Ren, Guodong

    2016-06-01

    Spatiotemporal systems can develop regular spatial patterns through self-organization or under external periodic pacing, while external attack or intrinsic collapse can destroy the regularity of the spatial system. For example, the electrical activities of neurons in the nervous system show regular spatial distributions under appropriate coupling and connection. It is believed that distinct regularity can be induced in the media by appropriate forcing or feedback, while a diffusive collapse driven by continuous destruction can cause breakdown of the media. In this paper, the collapse of ordered spatial distribution is investigated in a regular network of neurons (Morris-Lecar, Hindmarsh-Rose) on a two-dimensional array. A stable target wave develops and a regular spatial distribution emerges when appropriate external forcing with diversity is imposed or heterogeneity (parameter diversity in space) is generated. The diffusive invasion can be produced by continuous parameter collapse or switching in a local area; e.g., diffusive poisoning of potassium ion channels in Morris-Lecar neurons causes a breakdown of the channel conductance. It is found that the target wave-dominated regularity can be suppressed when the collapsed area spreads at random. Statistical correlation functions for sampled nodes (neurons) are defined to detect the collapse of the ordered state by time-series analysis.

  20. Mechanism of quasi-periodic lag jitter in bursting rhythms by a neuronal network

    NASA Astrophysics Data System (ADS)

    Barrio, R.; Rodríguez, Marcos; Serrano, S.; Shilnikov, Andrey

    2015-11-01

    We study a heteroclinic bifurcation leading to the onset of robust phase-lag jittering in bursting rhythms generated by a neuronal circuit. We show that the jitter phenomenon is associated with the occurrence of a stable invariant curve emerging through a torus bifurcation in 2D return maps for phase lags between three constituent bursters. To study biologically plausible and phenomenological models of rhythmic neuronal networks we have further developed parallel computational techniques for parameter continuations of all possible fixed points and invariant curves of such return maps. The method is based on a “fine” brute-force analysis of the large data set generated by the computational techniques.

  1. PyNN: A Common Interface for Neuronal Network Simulators

    PubMed Central

    Davison, Andrew P.; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN. PMID:19194529
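
    The passage describes writing a simulation script once and running it unchanged on any supported backend. The sketch below shows the general shape of such a script against the PyNN API (0.9-style); the cell model IF_cond_exp, the connectivity and the parameter values are generic choices for illustration, and exact call signatures may differ between PyNN versions and backends.

```python
import pyNN.nest as sim   # swap for pyNN.neuron, pyNN.brian2, ... to change backend

sim.setup(timestep=0.1)   # ms

# Two populations of leaky integrate-and-fire neurons with conductance-based synapses
exc = sim.Population(80, sim.IF_cond_exp(tau_m=20.0), label="excitatory")
inh = sim.Population(20, sim.IF_cond_exp(tau_m=20.0), label="inhibitory")
noise = sim.Population(80, sim.SpikeSourcePoisson(rate=20.0))

# Random connectivity; weights in µS, delays in ms (illustrative values)
sim.Projection(noise, exc, sim.OneToOneConnector(),
               synapse_type=sim.StaticSynapse(weight=0.01, delay=1.0))
sim.Projection(exc, inh, sim.FixedProbabilityConnector(0.1),
               synapse_type=sim.StaticSynapse(weight=0.005, delay=1.0))
sim.Projection(inh, exc, sim.FixedProbabilityConnector(0.1),
               synapse_type=sim.StaticSynapse(weight=0.02, delay=1.0),
               receptor_type="inhibitory")

exc.record("spikes")
sim.run(1000.0)           # ms

spikes = exc.get_data().segments[0].spiketrains
print(f"{sum(len(st) for st in spikes)} excitatory spikes recorded")
sim.end()
```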

  2. PyNN: A Common Interface for Neuronal Network Simulators.

    PubMed

    Davison, Andrew P; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN. PMID:19194529

  3. Information Transmission and Anderson Localization in two-dimensional networks of firing-rate neurons

    NASA Astrophysics Data System (ADS)

    Natale, Joseph; Hentschel, George

    Firing-rate networks offer a coarse model of signal propagation in the brain. Here we analyze sparse, 2D planar firing-rate networks with no synapses beyond a certain cutoff distance. Additionally, we impose Dale's Principle to ensure that each neuron makes only excitatory or only inhibitory outgoing connections. Using spectral methods, we find that the number of neurons participating in excitations of the network becomes insignificant whenever the connectivity cutoff is tuned to a value near or below the average interneuron separation. Further, neural activations exceeding a certain threshold stay confined to a small region of space. This behavior is an instance of Anderson localization, a disorder-induced phase transition by which an information channel is rendered unable to transmit signals. We discuss several potential implications of localization for both local and long-range computation in the brain. This work was supported in part by Grants JSMF/220020321 and NSF/IOS/1208126.
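
    One standard spectral diagnostic of the localization described here is the participation number of the connectivity matrix's eigenvectors (the reciprocal of the inverse participation ratio), which estimates how many neurons take part in each network mode. The sketch below builds a toy 2D distance-cutoff network obeying Dale's Principle and computes that quantity; the construction and parameter values are illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(1)
N, cutoff = 400, 0.06                      # neurons on the unit square; connection radius
pos = rng.uniform(0, 1, size=(N, 2))
sign = np.where(rng.uniform(size=N) < 0.8, 1.0, -1.0)   # Dale: each neuron all-E or all-I

# Synapses only between pairs closer than the cutoff distance
d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
W = (d < cutoff) & (d > 0)
J = W * sign[None, :] * rng.uniform(0.5, 1.5, size=(N, N))   # column j carries neuron j's sign

eigvals, eigvecs = np.linalg.eig(J)
# Participation number of each (normalized) eigenvector: ~N if extended, ~1 if localized
v = np.abs(eigvecs) ** 2
v /= v.sum(axis=0)
participation = 1.0 / (v ** 2).sum(axis=0)
print(f"median participating neurons: {np.median(participation):.1f} of {N}")
```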

  4. On the properties of input-to-output transformations in neuronal networks.

    PubMed

    Olypher, Andrey; Vaillant, Jean

    2016-06-01

    Information processing in neuronal networks in certain important cases can be considered as maps of binary vectors, where ones (spikes) and zeros (no spikes) of input neurons are transformed into spikes and no spikes of output neurons. A simple but fundamental characteristic of such a map is how it transforms distances between input vectors into distances between output vectors. We advanced earlier known results by finding an exact solution to this problem for McCulloch-Pitts neurons. The obtained explicit formulas allow for detailed analysis of how the network connectivity and neuronal excitability affect the transformation of distances in neurons. As an application, we explored a simple model of information processing in the hippocampus, a brain area critically implicated in learning and memory. We found network connectivity and neuronal excitability parameter values that optimize discrimination between similar and distinct inputs. A decrease of neuronal excitability, which in biological neurons may be associated with decreased inhibition, impaired the optimality of discrimination. PMID:27106188
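
    The result concerns how a layer of McCulloch-Pitts (threshold) units transforms Hamming distances between binary input vectors into distances between output vectors. The sketch below estimates this transformation empirically for a random network; the connection probability, threshold and input statistics are arbitrary illustrative choices, not the closed-form solution derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n_in, n_out, p_conn, theta = 100, 50, 0.2, 6   # inputs, outputs, connection prob., threshold
C = (rng.uniform(size=(n_out, n_in)) < p_conn).astype(float)

def output(x):
    """McCulloch-Pitts layer: a unit fires (1) if its summed binary input reaches the threshold."""
    return (C @ x >= theta).astype(int)

def mean_output_distance(d_in, trials=2000, p_active=0.3):
    """Average output Hamming distance for input pairs at Hamming distance d_in."""
    dists = []
    for _ in range(trials):
        x = (rng.uniform(size=n_in) < p_active).astype(int)
        y = x.copy()
        flip = rng.choice(n_in, size=d_in, replace=False)
        y[flip] = 1 - y[flip]                       # move exactly d_in bits away from x
        dists.append(np.sum(output(x) != output(y)))
    return np.mean(dists)

for d in (1, 5, 10, 20):
    print(f"input distance {d:2d} -> mean output distance {mean_output_distance(d):.2f}")
```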

  5. Neuronal response impedance mechanism implementing cooperative networks with low firing rates and μs precision.

    PubMed

    Vardi, Roni; Goldental, Amir; Marmari, Hagar; Brama, Haya; Stern, Edward A; Sardi, Shira; Sabo, Pinhas; Kanter, Ido

    2015-01-01

    Realizations of low firing rates in neural networks usually require globally balanced distributions among excitatory and inhibitory links, while feasibility of temporal coding is limited by neuronal millisecond precision. We show that cooperation, governing global network features, emerges through nodal properties, as opposed to link distributions. Using in vitro and in vivo experiments we demonstrate microsecond precision of neuronal response timings under low stimulation frequencies, whereas moderate frequencies result in a chaotic neuronal phase characterized by degraded precision. Above a critical stimulation frequency, which varies among neurons, response failures were found to emerge stochastically such that the neuron functions as a low pass filter, saturating the average inter-spike-interval. This intrinsic neuronal response impedance mechanism leads to cooperation on a network level, such that firing rates are suppressed toward the lowest neuronal critical frequency simultaneously with neuronal microsecond precision. Our findings open up opportunities of controlling global features of network dynamics through few nodes with extreme properties. PMID:26124707

  6. Neuronal response impedance mechanism implementing cooperative networks with low firing rates and μs precision

    PubMed Central

    Vardi, Roni; Goldental, Amir; Marmari, Hagar; Brama, Haya; Stern, Edward A.; Sardi, Shira; Sabo, Pinhas; Kanter, Ido

    2015-01-01

    Realizations of low firing rates in neural networks usually require globally balanced distributions among excitatory and inhibitory links, while feasibility of temporal coding is limited by neuronal millisecond precision. We show that cooperation, governing global network features, emerges through nodal properties, as opposed to link distributions. Using in vitro and in vivo experiments we demonstrate microsecond precision of neuronal response timings under low stimulation frequencies, whereas moderate frequencies result in a chaotic neuronal phase characterized by degraded precision. Above a critical stimulation frequency, which varies among neurons, response failures were found to emerge stochastically such that the neuron functions as a low pass filter, saturating the average inter-spike-interval. This intrinsic neuronal response impedance mechanism leads to cooperation on a network level, such that firing rates are suppressed toward the lowest neuronal critical frequency simultaneously with neuronal microsecond precision. Our findings open up opportunities of controlling global features of network dynamics through few nodes with extreme properties. PMID:26124707

  7. Echo state networks with filter neurons and a delay&sum readout.

    PubMed

    Holzmann, Georg; Hauser, Helmut

    2010-03-01

    Echo state networks (ESNs) are a novel approach to recurrent neural network training with the advantage of a very simple and linear learning algorithm. It has been demonstrated that ESNs outperform other methods on a number of benchmark tasks. Although the approach is appealing, there are still some inherent limitations in the original formulation. Here we suggest two enhancements of this network model. First, the previously proposed idea of filters in neurons is extended to arbitrary infinite impulse response (IIR) filter neurons. This enables such networks to learn multiple attractors and signals at different timescales, which is especially important for modeling real-world time series. Second, a delay&sum readout is introduced, which adds trainable delays in the synaptic connections of output neurons and therefore vastly improves the memory capacity of echo state networks. It is shown on commonly used benchmark tasks and real-world examples that this new structure is able to significantly outperform standard ESNs and other state-of-the-art models for nonlinear dynamical system modeling. PMID:19625164
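
    For orientation, the sketch below implements a plain echo state network trained with ridge regression on a one-step-ahead prediction task. It omits the two extensions proposed in the paper (IIR filter neurons and the delay&sum readout) and uses arbitrary reservoir size and scaling, so it should be read as the standard-ESN baseline rather than the proposed architecture.

```python
import numpy as np

rng = np.random.default_rng(3)
n_res, rho, ridge = 200, 0.9, 1e-6

# Reservoir scaled to spectral radius rho; random input weights
W = rng.normal(size=(n_res, n_res))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=n_res)

# Teacher signal: predict u(t+1) from u(t) for a slow sine wave
t = np.arange(4000)
u = np.sin(2 * np.pi * t / 100.0)

x = np.zeros(n_res)
states = []
for ut in u[:-1]:
    x = np.tanh(W @ x + W_in * ut)         # leak-free reservoir update
    states.append(x.copy())
X = np.array(states[200:])                 # discard a washout period
Y = u[201:]

# Linear (ridge-regression) readout: the "very simple and linear learning algorithm"
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
pred = X @ W_out
print(f"training NRMSE: {np.sqrt(np.mean((pred - Y) ** 2)) / np.std(Y):.4f}")
```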

  8. Self-Organized Criticality in Developing Neuronal Networks

    PubMed Central

    Tetzlaff, Christian; Okujeni, Samora; Egert, Ulrich; Wörgötter, Florentin; Butz, Markus

    2010-01-01

    Recently evidence has accumulated that many neural networks exhibit self-organized criticality. In this state, activity is similar across temporal scales and this is beneficial with respect to information flow. If subcritical, activity can die out; if supercritical, epileptiform patterns may occur. Little is known about how developing networks reach and stabilize criticality. Here we monitor the development between 13 and 95 days in vitro (DIV) of cortical cell cultures (n = 20) and find four different phases, related to their morphological maturation: An initial low-activity state (≈19 DIV) is followed by a supercritical (≈20 DIV) and then a subcritical one (≈36 DIV) until the network finally reaches stable criticality (≈58 DIV). Using network modeling and mathematical analysis we describe the dynamics of the emergent connectivity in such developing systems. Based on physiological observations, the synaptic development in the model is determined by the drive of the neurons to adjust their connectivity for reaching on average firing rate homeostasis. We predict a specific time course for the maturation of inhibition, with strong onset and delayed pruning, and that total synaptic connectivity should be strongly linked to the relative levels of excitation and inhibition. These results demonstrate that the interplay between activity and connectivity guides developing networks into criticality, suggesting that this may be a generic and stable state of many networks in vivo and in vitro. PMID:21152008
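
    Criticality in such cultures is usually assessed from neuronal avalanches: network activity is binned, runs of consecutive non-empty bins form avalanches, and their size distribution and branching parameter are examined (a branching parameter near 1 indicates criticality). The sketch below runs this analysis on surrogate spike data; the bin width, the Poisson surrogate and the crude branching estimate are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np

def avalanches(spike_times, bin_ms=4.0):
    """Split binned activity into avalanches (runs of non-empty bins); return their sizes."""
    counts, _ = np.histogram(spike_times,
                             bins=np.arange(0, spike_times.max() + bin_ms, bin_ms))
    sizes, current = [], 0
    for c in counts:
        if c > 0:
            current += c
        elif current > 0:
            sizes.append(current)
            current = 0
    if current > 0:
        sizes.append(current)
    return np.array(sizes), counts

def branching_parameter(counts):
    """Average ratio of activity in bin t+1 to bin t, over occupied bins (rough estimate)."""
    prev, nxt = counts[:-1], counts[1:]
    mask = prev > 0
    return np.mean(nxt[mask] / prev[mask])

# Surrogate recording: Poisson spiking from 60 electrodes over 10 minutes (times in ms)
rng = np.random.default_rng(4)
spikes = np.sort(rng.uniform(0, 600_000, size=60 * 600))
sizes, counts = avalanches(spikes)
print(f"{len(sizes)} avalanches, mean size {sizes.mean():.1f}, "
      f"branching parameter {branching_parameter(counts):.2f}")
```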

  9. Collective behavior of interacting locally synchronized oscillations in neuronal networks

    NASA Astrophysics Data System (ADS)

    Jalili, Mahdi

    2012-10-01

    Local circuits in the cortex and hippocampus are endowed with resonant, oscillatory firing properties which underlie oscillations in various frequency ranges (e.g. the gamma range) frequently observed in local field potentials and in electroencephalography. Synchronized oscillations are thought to play important roles in information binding in the brain. This paper addresses the collective behavior of interacting locally synchronized oscillations in realistic neural networks. A network of five neurons is proposed in order to produce locally synchronized oscillations. The neuron models are of Hindmarsh-Rose type with electrical and/or chemical couplings. We construct large-scale models using networks of such units which capture the essential features of the dynamics of cells and their connectivity patterns. The profile of spike synchronization is then investigated considering different model parameters such as the strength and ratio of excitatory/inhibitory connections. We also show that transmission time-delay might enhance spike synchrony. The influence of spike-timing-dependent plasticity on spike synchronization is also studied.
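
    The building block of the proposed unit is the Hindmarsh-Rose model with electrical (diffusive) coupling. The sketch below integrates two electrically coupled Hindmarsh-Rose neurons with the common textbook parameters using a simple Euler scheme and reports a crude synchrony measure; the coupling strength, integration step and initial conditions are arbitrary illustrative choices.

```python
import numpy as np

def hindmarsh_rose_pair(g_el=0.3, I=3.25, T=2000.0, dt=0.01):
    """Euler integration of two Hindmarsh-Rose neurons with diffusive (electrical) coupling."""
    a, b, c, d, r, s, x_r = 1.0, 3.0, 1.0, 5.0, 0.006, 4.0, -1.6
    x = np.array([-1.0, -1.2]); y = np.zeros(2); z = np.zeros(2)
    n = int(T / dt)
    trace = np.empty((n, 2))
    for k in range(n):
        coup = g_el * (x[::-1] - x)                 # electrical coupling: g*(x_other - x_self)
        dx = y - a * x**3 + b * x**2 - z + I + coup
        dy = c - d * x**2 - y
        dz = r * (s * (x - x_r) - z)
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        trace[k] = x
    return trace

v = hindmarsh_rose_pair()
# Crude synchrony measure: correlation of the two membrane variables after a transient
sync = np.corrcoef(v[50_000:, 0], v[50_000:, 1])[0, 1]
print(f"membrane-potential correlation: {sync:.3f}")
```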

  10. To Break or to Brake Neuronal Network Accelerated by Ammonium Ions?

    PubMed Central

    Dynnik, Vladimir V.; Kononov, Alexey V.; Sergeev, Alexander I.; Teplov, Iliya Y.; Tankanag, Arina V.; Zinchenko, Valery P.

    2015-01-01

    Purpose The aim of the present study was to investigate the effects of ammonium ions on in vitro neuronal network activity and to search for alternative methods of preventing acute ammonia neurotoxicity. Methods Rat hippocampal neuron and astrocyte co-cultures in vitro, fluorescent microscopy and perforated patch clamp were used to monitor the changes in intracellular Ca2+ and membrane potential produced by ammonium ions and various modulators in the cells implicated in neural networks. Results Low concentrations of NH4Cl (0.1–4 mM) produce short temporal effects on network activity. Application of 5–8 mM NH4Cl: invariably transforms diverse network firing regimens into identical burst patterns, characterized by substantial neuronal membrane depolarization at the plateau phase of the potential and high-amplitude Ca2+-oscillations; raises the frequency of oscillations and the period-averaged Ca2+ level in all cells implicated in the network; results in the appearance of a group of «run out» cells with high intracellular Ca2+ and steadily diminished amplitudes of oscillations; and increases astrocyte Ca2+-signalling, characterized by the appearance of groups of cells with increased intracellular Ca2+ level and/or chaotic Ca2+-oscillations. Accelerated network activity may be suppressed by the blockade of NMDA or AMPA/kainate receptors or by overactivation of AMPA/kainate receptors. Ammonia still activates neuronal firing in the presence of the GABA(A) receptor antagonist bicuculline, indicating that the «disinhibition phenomenon» is not implicated in the mechanisms of network acceleration. Network activity may also be slowed down by glycine, agonists of metabotropic inhibitory receptors, betaine, L-carnitine, L-arginine, etc. Conclusions The obtained results demonstrate that ammonium ions accelerate neuronal network firing, implicating ionotropic glutamate receptors, while preserving the activities of a group of inhibitory ionotropic and metabotropic receptors. This may mean that ammonia

  11. Dynamical state of the network determines the efficacy of single neuron properties in shaping the network activity

    PubMed Central

    Sahasranamam, Ajith; Vlachos, Ioannis; Aertsen, Ad; Kumar, Arvind

    2016-01-01

    Spike patterns are among the most common electrophysiological descriptors of neuron types. Surprisingly, it is not clear how the diversity in firing patterns of the neurons in a network affects its activity dynamics. Here, we introduce the state-dependent stochastic bursting neuron model allowing for a change in its firing patterns independent of changes in its input-output firing rate relationship. Using this model, we show that the effect of single neuron spiking on the network dynamics is contingent on the network activity state. While spike bursting can both generate and disrupt oscillations, these patterns are ineffective in large regions of the network state space in changing the network activity qualitatively. Finally, we show that when single-neuron properties are made dependent on the population activity, a hysteresis like dynamics emerges. This novel phenomenon has important implications for determining the network response to time-varying inputs and for the network sensitivity at different operating points. PMID:27212008

  12. Dynamical state of the network determines the efficacy of single neuron properties in shaping the network activity.

    PubMed

    Sahasranamam, Ajith; Vlachos, Ioannis; Aertsen, Ad; Kumar, Arvind

    2016-01-01

    Spike patterns are among the most common electrophysiological descriptors of neuron types. Surprisingly, it is not clear how the diversity in firing patterns of the neurons in a network affects its activity dynamics. Here, we introduce the state-dependent stochastic bursting neuron model allowing for a change in its firing patterns independent of changes in its input-output firing rate relationship. Using this model, we show that the effect of single neuron spiking on the network dynamics is contingent on the network activity state. While spike bursting can both generate and disrupt oscillations, these patterns are ineffective in large regions of the network state space in changing the network activity qualitatively. Finally, we show that when single-neuron properties are made dependent on the population activity, a hysteresis like dynamics emerges. This novel phenomenon has important implications for determining the network response to time-varying inputs and for the network sensitivity at different operating points. PMID:27212008

  13. Robust spatial memory maps in flickering neuronal networks: a topological model

    NASA Astrophysics Data System (ADS)

    Dabaghian, Yuri; Babichev, Andrey; Memoli, Facundo; Chowdhury, Samir; Rice University Collaboration; Ohio State University Collaboration

    It is widely accepted that the hippocampal place cells provide a substrate of the neuronal representation of the environment--the ``cognitive map''. However, the hippocampal network, like any other network in the brain, is transient: thousands of hippocampal neurons die every day and the connections formed by these cells constantly change due to various forms of synaptic plasticity. What then explains the remarkable reliability of our spatial memories? We propose a computational approach to answering this question based on a couple of insights. First, we propose that the hippocampal cognitive map is fundamentally topological, and hence amenable to analysis by topological methods. We then apply several novel methods from homology theory to understand how dynamic connections between cells influence the speed and reliability of spatial learning. We simulate the rat's exploratory movements through different environments and study how topological invariants of these environments arise in a network of simulated neurons with ``flickering'' connectivity. We find that despite transient connectivity the network of place cells produces a stable representation of the topology of the environment.

  14. Computation emerges from adaptive synchronization of networking neurons.

    PubMed

    Zanin, Massimiliano; Del Pozo, Francisco; Boccaletti, Stefano

    2011-01-01

    The activity of networking neurons is largely characterized by the alternation of synchronous and asynchronous spiking sequences. One of the most relevant challenges that scientists are facing today is, then, relating that evidence with the fundamental mechanisms through which the brain computes and processes information, as well as with the arousal (or progress) of a number of neurological illnesses. In other words, the problem is how to associate an organized dynamics of interacting neural assemblies to a computational task. Here we show that computation can be seen as a feature emerging from the collective dynamics of an ensemble of networking neurons, which interact by means of adaptive dynamical connections. Namely, by associating logical states to synchronous neuron's dynamics, we show how the usual Boolean logics can be fully recovered, and a universal Turing machine can be constructed. Furthermore, we show that, besides the static binary gates, a wider class of logical operations can be efficiently constructed as the fundamental computational elements interact within an adaptive network, each operation being represented by a specific motif. Our approach qualitatively differs from the past attempts to encode information and compute with complex systems, where computation was instead the consequence of the application of control loops enforcing a desired state into the specific system's dynamics. Being the result of an emergent process, the computation mechanism here described is not limited to a binary Boolean logic, but it can involve a much larger number of states. As such, our results can enlighten new concepts for the understanding of the real computing processes taking place in the brain. PMID:22073167

  15. Automated Detection of Soma Location and Morphology in Neuronal Network Cultures

    PubMed Central

    Ozcan, Burcin; Negi, Pooran; Laezza, Fernanda; Papadakis, Manos; Labate, Demetrio

    2015-01-01

    Automated identification of the primary components of a neuron and extraction of its sub-cellular features are essential steps in many quantitative studies of neuronal networks. The focus of this paper is the development of an algorithm for the automated detection of the location and morphology of somas in confocal images of neuronal network cultures. This problem is motivated by applications in high-content screenings (HCS), where the extraction of multiple morphological features of neurons on large data sets is required. Existing algorithms are not very efficient when applied to the analysis of confocal image stacks of neuronal cultures. In addition to the usual difficulties associated with the processing of fluorescent images, these types of stacks contain a small number of images so that only a small number of pixels are available along the z-direction and it is challenging to apply conventional 3D filters. The algorithm we present in this paper applies a number of innovative ideas from the theory of directional multiscale representations and involves the following steps: (i) image segmentation based on support vector machines with specially designed multiscale filters; (ii) soma extraction and separation of contiguous somas, using a combination of level set method and directional multiscale filters. We also present an approach to extract the soma’s surface morphology using the 3D shearlet transform. Extensive numerical experiments show that our algorithms are computationally efficient and highly accurate in segmenting the somas and separating contiguous ones. The algorithms presented in this paper will facilitate the development of a high-throughput quantitative platform for the study of neuronal networks for HCS applications. PMID:25853656

  16. Neuronal Networks during Burst Suppression as Revealed by Source Analysis

    PubMed Central

    Reinicke, Christine; Moeller, Friederike; Anwar, Abdul Rauf; Mideksa, Kidist Gebremariam; Pressler, Ronit; Deuschl, Günther; Stephani, Ulrich; Siniatchkin, Michael

    2015-01-01

    Introduction Burst-suppression (BS) is an electroencephalography (EEG) pattern consisting of alternant periods of slow waves of high amplitude (burst) and periods of so called flat EEG (suppression). It is generally associated with coma of various etiologies (hypoxia, drug-related intoxication, hypothermia, and childhood encephalopathies, but also anesthesia). Animal studies suggest that both the cortex and the thalamus are involved in the generation of BS. However, very little is known about mechanisms of BS in humans. The aim of this study was to identify the neuronal network underlying both burst and suppression phases using source reconstruction and analysis of functional and effective connectivity in EEG. Material/Methods Dynamic imaging of coherent sources (DICS) was applied to EEG segments of 13 neonates and infants with burst and suppression EEG pattern. The brain area with the strongest power in the analyzed frequency (1–4 Hz) range was defined as the reference region. DICS was used to compute the coherence between this reference region and the entire brain. The renormalized partial directed coherence (RPDC) was used to describe the informational flow between the identified sources. Results/Conclusion Delta activity during the burst phases was associated with coherent sources in the thalamus and brainstem as well as bilateral sources in cortical regions mainly frontal and parietal, whereas suppression phases were associated with coherent sources only in cortical regions. Results of the RPDC analyses showed an upwards informational flow from the brainstem towards the thalamus and from the thalamus to cortical regions, which was absent during the suppression phases. These findings may support the theory that a “cortical deafferentiation” between the cortex and sub-cortical structures exists especially in suppression phases compared to burst phases in burst suppression EEGs. Such a deafferentiation may play a role in the poor neurological outcome of

  17. Neuronal network disturbance after focal ischemia in rats

    SciTech Connect

    Kataoka, K.; Hayakawa, T.; Yamada, K.; Mushiroi, T.; Kuroda, R.; Mogami, H. )

    1989-09-01

    We studied functional disturbances following left middle cerebral artery occlusion in rats. Neuronal function was evaluated by (14C)2-deoxyglucose autoradiography 1 day after occlusion. We analyzed the mechanisms of change in glucose utilization outside the infarct using Fink-Heimer silver impregnation, axonal transport of wheat germ agglutinin-conjugated-horseradish peroxidase, and succinate dehydrogenase histochemistry. One day after occlusion, glucose utilization was remarkably reduced in the areas surrounding the infarct. There were many silver grains indicating degeneration of the synaptic terminals in the cortical areas surrounding the infarct and the ipsilateral cingulate cortex. Moreover, in the left thalamus where the left middle cerebral artery supplied no blood, glucose utilization significantly decreased compared with sham-operated rats. In the left thalamus, massive silver staining of degenerated synaptic terminals and decreases in succinate dehydrogenase activity were observed 4 and 5 days after occlusion. The absence of succinate dehydrogenase staining may reflect early changes in retrograde degeneration of thalamic neurons after ischemic injury of the thalamocortical pathway. Terminal degeneration even affected areas remote from the infarct: there were silver grains in the contralateral hemisphere transcallosally connected to the infarct and in the ipsilateral substantia nigra. Axonal transport study showed disruption of the corticospinal tract by subcortical ischemia; the transcallosal pathways in the cortex surrounding the infarct were preserved. The relation between neural function and the neuronal network in the area surrounding the focal cerebral infarct is discussed with regard to ischemic penumbra and diaschisis.

  18. Microstate description of stable chaos in networks of spiking neurons

    NASA Astrophysics Data System (ADS)

    Puelma Touzel, Maximilian; Monteforte, Michael; Wolf, Fred

    2014-03-01

    Dynamic instabilities have been proposed to explain the decorrelation of stimulus-driven activity observed in sensory areas such as the olfactory bulb, but are sensitive to noise. Simple neuron models coupled through inhibition can nevertheless exhibit a negative maximum Lyapunov exponent, despite displaying irregular and asynchronous (AI) activity and having an exponential instability to finite-sized perturbations above a critical strength that scales with the size, density and activity of the circuit. This stable chaos, a phenomenon first found in coupled-map lattices, produces a large, finite set of locally-attracting, yet mutually-repelling AI spike sequences ideally suited for discrete, high-dimensional coding. We analyze the effects of finite-sized perturbations on the spiking microstate and reveal the mechanism underlying the stable chaos. From this, we can analytically derive the aforementioned scaling relations and estimate the critical value of previously observed transitions to conventional chaos. This work highlights the features of intra-neuron dynamics and inter-neuron coupling that generate this phase space structure, which might serve as an attractor reservoir that downstream networks can use to decode sensory input.

  19. Entropy driven artificial neuronal networks and sensorial representation; A proposal

    SciTech Connect

    VanHulle, M.M. )

    1989-04-01

    A hierarchical Artificial Neuronal Network (ANN) is proposed as a model sensorium wherein feedback is allowed to modify the categorization abilities of the system. In this way, the original representation, being abstract and precategorical, is refined, yielding a more concrete representation. As thermodynamical entropy is a hierarchical invariant and an explicitly time-dependent and compact measure of state dynamics, it is chosen as the feedback measure. The main features of the network are shown to be plausible from the point of view of the physiology and anatomy of the visual system of cats and primates, and one of these, double-layered maps performing combinatorial processing and evaluation, respectively, is illustrated by simulations in the orientation domain.

  20. Fuzzy operators and cyclic behavior in formal neuronal networks

    NASA Technical Reports Server (NTRS)

    Labos, E.; Holden, A. V.; Laczko, J.; Orzo, L.; Labos, A. S.

    1992-01-01

    Formal neuronal networks (FNN), which are composed of threshold gates, make use of the unit step function. It is regarded as a degenerate distribution function (DDF) and will be referred to here as a non-fuzzy threshold operator (nFTO). Special networks of this kind generating long cycles of states are modified by the introduction of fuzzy threshold operators (FTO), i.e., non-degenerate distribution functions (nDDF). The cyclic behavior of the new nets is compared with that of the original ones. The interconnection matrix and threshold values are not modified. It is concluded that the original long cycles change into fixed points and short cycles and, as the computer simulations demonstrate, aperiodic motion associated with chaotic behavior appears. The emergence of the above changes depends on the steepness of the threshold operators.
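
    Because a network of threshold gates has a finite state space, its cycles of states can be found by exhaustive iteration. The sketch below enumerates the attractor cycle lengths of a small random formal neuronal network with hard (non-fuzzy) thresholds, then replaces the unit step by a probabilistic sigmoid as a stand-in for a fuzzy threshold operator; the weight matrix, thresholds and sigmoid steepness are arbitrary illustrative choices, not the networks studied in the paper.

```python
import itertools
import numpy as np

rng = np.random.default_rng(6)
n = 6
W = rng.choice([-1, 0, 1], size=(n, n))
theta = np.full(n, 0.5)

def step_hard(s):
    """Non-fuzzy threshold operator: unit step applied to the weighted input."""
    return tuple((W @ np.array(s) >= theta).astype(int))

def cycle_length(s0, step):
    """Iterate until a state repeats; return the period of the attractor reached."""
    seen, s, t = {}, s0, 0
    while s not in seen:
        seen[s] = t
        s, t = step(s), t + 1
    return t - seen[s]

lengths = {cycle_length(s, step_hard) for s in itertools.product([0, 1], repeat=n)}
print("cycle lengths with hard thresholds:", sorted(lengths))

def step_fuzzy(s, beta=3.0):
    """Fuzzy threshold operator: fire with probability given by a sigmoid of the input."""
    p = 1.0 / (1.0 + np.exp(-beta * (W @ np.array(s) - theta)))
    return tuple((rng.uniform(size=n) < p).astype(int))

# With stochastic updates there is no strict period; inspect a sample trajectory instead
s, traj = (1, 0, 1, 0, 1, 0), []
for _ in range(20):
    s = step_fuzzy(s)
    traj.append(s)
print("sample fuzzy trajectory (last 5 states):", traj[-5:])
```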

  1. Can Simple Rules Control Development of a Pioneer Vertebrate Neuronal Network Generating Behavior?

    PubMed Central

    Conte, Deborah; Hull, Mike; Merrison-Hort, Robert; al Azad, Abul Kalam; Buhl, Edgar; Borisyuk, Roman; Soffe, Stephen R.

    2014-01-01

    How do the pioneer networks in the axial core of the vertebrate nervous system first develop? Fundamental to understanding any full-scale neuronal network is knowledge of the constituent neurons, their properties, synaptic interconnections, and normal activity. Our novel strategy uses basic developmental rules to generate model networks that retain individual neuron and synapse resolution and are capable of reproducing correct, whole animal responses. We apply our developmental strategy to young Xenopus tadpoles, whose brainstem and spinal cord share a core vertebrate plan, but at a tractable complexity. Following detailed anatomical and physiological measurements to complete a descriptive library of each type of spinal neuron, we build models of their axon growth controlled by simple chemical gradients and physical barriers. By adding dendrites and allowing probabilistic formation of synaptic connections, we reconstruct network connectivity among up to 2000 neurons. When the resulting “network” is populated by model neurons and synapses, with properties based on physiology, it can respond to sensory stimulation by mimicking tadpole swimming behavior. This functioning model represents the most complete reconstruction of a vertebrate neuronal network that can reproduce the complex, rhythmic behavior of a whole animal. The findings validate our novel developmental strategy for generating realistic networks with individual neuron- and synapse-level resolution. We use it to demonstrate how early functional neuronal connectivity and behavior may in life result from simple developmental “rules,” which lay out a scaffold for the vertebrate CNS without specific neuron-to-neuron recognition. PMID:24403159

  2. In vitro neuronal network activity in NMDA receptor encephalitis

    PubMed Central

    2013-01-01

    Background Anti-NMDA-receptor encephalitis is caused by antibodies against the N-methyl-D-aspartate receptor (NMDAR) and characterized by a severe encephalopathy with psychosis, epileptic seizures and autonomic disturbances. It predominantly occurs in young women and is associated with an ovarian teratoma in 59% of cases. Results We describe effects of cerebrospinal fluid (CSF) from an anti-N-methyl-D-aspartate receptor (NMDAR) encephalitis patient on in vitro neuronal network activity (ivNNA). In vitro NNA of dissociated primary rat cortical populations was recorded by the microelectrode array (MEA) system. The 23-year-old patient was severely affected but showed an excellent recovery following multimodal immunomodulatory therapy and removal of an ovarian teratoma. Patient CSF (pCSF) taken during the initial weeks after disease onset suppressed global spike and burst rates of ivNNA, in contrast to pCSF sampled after clinical recovery and a decrease of NMDAR antibody titers. The synchrony of pCSF-affected ivNNA remained unaltered during the course of the disease. Conclusion Patient CSF directly suppresses global activity of neuronal networks recorded by the MEA system. In contrast, pCSF did not regulate the synchrony of ivNNA, suggesting that NMDAR antibodies selectively regulate distinct parameters of ivNNA while sparing their functional connectivity. Thus, assessing ivNNA could represent a new technique to evaluate functional consequences of autoimmune encephalitis-related CSF changes. PMID:23379293

  3. How effective delays shape oscillatory dynamics in neuronal networks

    NASA Astrophysics Data System (ADS)

    Roxin, Alex; Montbrió, Ernest

    2011-02-01

    Synaptic, dendritic and single-cell kinetics generate significant time delays that shape the dynamics of large networks of spiking neurons. Previous work has shown that such effective delays can be taken into account with a rate model through the addition of an explicit, fixed delay (Roxin et al. (2005,2006) [29,30]). Here we extend this work to account for arbitrary symmetric patterns of synaptic connectivity and generic nonlinear transfer functions. Specifically, we conduct a weakly nonlinear analysis of the dynamical states arising via primary instabilities of the asynchronous state. In this way we determine analytically how the nature and stability of these states depend on the choice of transfer function and connectivity. We arrive at two general observations of physiological relevance that could not be explained in previous work. These are: 1 - fast oscillations are always supercritical for realistic transfer functions and 2 - traveling waves are preferred over standing waves given plausible patterns of local connectivity. We finally demonstrate that these results show good agreement with those obtained performing numerical simulations of a network of Hodgkin-Huxley neurons.
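
    The framework described here treats the effective delay as an explicit, fixed delay in a firing-rate equation of the form tau*dr/dt = -r + phi(I_ext + J*r(t-D)). The sketch below integrates a single such delayed equation with a sigmoidal transfer function to show delay-induced fast oscillations; the parameter values are illustrative, and the spatially structured connectivity analyzed in the paper is omitted.

```python
import numpy as np

def simulate(J=-8.0, D=3.0, tau=1.0, I_ext=2.0, T=200.0, dt=0.01):
    """Euler integration of tau*dr/dt = -r + phi(I_ext + J*r(t-D)) with a fixed delay D."""
    phi = lambda x: 1.0 / (1.0 + np.exp(-x))        # sigmoidal transfer function
    n, lag = int(T / dt), int(D / dt)
    r = np.full(n, 0.2)                              # constant history as initial condition
    for k in range(lag, n - 1):
        r[k + 1] = r[k] + dt / tau * (-r[k] + phi(I_ext + J * r[k - lag]))
    return r

r = simulate()
late = r[-5000:]                                     # last 50 time units
print(f"late-time rate range: [{late.min():.3f}, {late.max():.3f}] "
      f"(a wide range indicates delay-induced oscillations)")
```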

  4. Simulation of Code Spectrum and Code Flow of Cultured Neuronal Networks.

    PubMed

    Tamura, Shinichi; Nishitani, Yoshi; Hosokawa, Chie; Miyoshi, Tomomitsu; Sawai, Hajime

    2016-01-01

    It has been shown that, in cultured neuronal networks on a multielectrode array, pseudorandom-like sequences (codes) are detected, and they flow with some spatial decay constant. Each cultured neuronal network is characterized by a specific spectrum curve. That is, we may consider the spectrum curve as a "signature" of its associated neuronal network that is dependent on the characteristics of neurons and network configuration, including the weight distribution. In the present study, we used an integrate-and-fire model of neurons with intrinsic and instantaneous fluctuations of characteristics to simulate a code spectrum from multielectrodes on a 2D mesh neural network. We showed that it is possible to estimate the characteristics of neurons such as the distribution of the number of neurons around each electrode and their refractory periods. Although this is an inverse problem and theoretically the solutions are not sufficiently guaranteed, the parameters seem to be consistent with those of neurons. That is, the proposed neural network model may adequately reflect the behavior of a cultured neuronal network. Furthermore, we discuss the prospect that code analysis will provide a basis for communication within a neural network and, in turn, a basis for natural intelligence. PMID:27239189

  5. Simulation of Code Spectrum and Code Flow of Cultured Neuronal Networks

    PubMed Central

    Tamura, Shinichi; Nishitani, Yoshi; Hosokawa, Chie; Miyoshi, Tomomitsu; Sawai, Hajime

    2016-01-01

    It has been shown that, in cultured neuronal networks on a multielectrode array, pseudorandom-like sequences (codes) are detected, and they flow with some spatial decay constant. Each cultured neuronal network is characterized by a specific spectrum curve. That is, we may consider the spectrum curve as a “signature” of its associated neuronal network that is dependent on the characteristics of neurons and network configuration, including the weight distribution. In the present study, we used an integrate-and-fire model of neurons with intrinsic and instantaneous fluctuations of characteristics to simulate a code spectrum from multielectrodes on a 2D mesh neural network. We showed that it is possible to estimate the characteristics of neurons such as the distribution of the number of neurons around each electrode and their refractory periods. Although this is an inverse problem and theoretically the solutions are not sufficiently guaranteed, the parameters seem to be consistent with those of neurons. That is, the proposed neural network model may adequately reflect the behavior of a cultured neuronal network. Furthermore, we discuss the prospect that code analysis will provide a basis for communication within a neural network and, in turn, a basis for natural intelligence. PMID:27239189

  6. Block-Based Neural Networks with Pulsed Neuron Model

    NASA Astrophysics Data System (ADS)

    Iguchi, Syota; Koakutsu, Seiichi; Okamoto, Takashi; Hirata, Hironori

    In recent years, the study of hardware implementations of Neural Networks (NN) has been getting more important. In particular, Block-Based Neural Networks (BBNN), which are one type of NN, have attracted attention. However, the conventional BBNN are analogue NN (ANN). The digital hardware implementation of ANN is very difficult, because the input and output signals are represented as analogue values. Pulsed Neural Networks (PNN), which adopt a pulsed neuron (PN) model instead of the AN model, have been proposed in order to solve this problem. The input and output signals of PNN are represented as a series of pulses, and thus the digital hardware implementation of PNN becomes easy. In this paper, we propose Block-Based Pulsed Neural Networks (BBPNN), introducing the PN model into BBNN in order to facilitate the implementation of NN on digital hardware. We use particle swarm optimization (PSO) for the optimization of the weights of BBPNN, because PSO can produce a globally optimal solution of nonlinear continuous optimization problems with high accuracy in practicable calculation time. To evaluate the proposed BBPNN, we apply them to the XOR problem and autonomous mobile robot control problems. Computational experiments indicate that the proposed BBPNN and the conventional BBNN produce about the same results.

  7. Neuron-Like Networks Between Ribosomal Proteins Within the Ribosome

    PubMed Central

    Poirot, Olivier; Timsit, Youri

    2016-01-01

    From the brain to the World Wide Web, information-processing networks share common scale-invariant properties. Here, we reveal the existence of neural-like networks at a molecular scale within the ribosome. We show that with their extensions, ribosomal proteins form complex assortative interaction networks through which they communicate through tiny interfaces. The analysis of the crystal structures of 50S eubacterial particles reveals that most of these interfaces involve key phylogenetically conserved residues. The systematic observation of interactions between basic and aromatic amino acids at the interfaces and along the extensions provides new structural insights that may contribute to deciphering the molecular mechanisms of signal transmission within or between the ribosomal proteins. Similar to neurons interacting through “molecular synapses”, ribosomal proteins form a network that suggests an analogy with a simple molecular brain in which the “sensory-proteins” innervate the functional ribosomal sites, while the “inter-proteins” interconnect them into circuits suitable to process the information flow that circulates during protein synthesis. It is likely that these circuits have evolved to coordinate both the complex macromolecular motions and the binding of the multiple factors during translation. This opens new perspectives on nanoscale information transfer and processing. PMID:27225526

  8. Analyzing neuronal networks using discrete-time dynamics

    NASA Astrophysics Data System (ADS)

    Ahn, Sungwoo; Smith, Brian H.; Borisyuk, Alla; Terman, David

    2010-05-01

    We develop mathematical techniques for analyzing detailed Hodgkin-Huxley like models for excitatory-inhibitory neuronal networks. Our strategy for studying a given network is to first reduce it to a discrete-time dynamical system. The discrete model is considerably easier to analyze, both mathematically and computationally, and parameters in the discrete model correspond directly to parameters in the original system of differential equations. While these networks arise in many important applications, a primary focus of this paper is to better understand mechanisms that underlie temporally dynamic responses in early processing of olfactory sensory information. The models presented here exhibit several properties that have been described for olfactory codes in an insect’s Antennal Lobe. These include transient patterns of synchronization and decorrelation of sensory inputs. By reducing the model to a discrete system, we are able to systematically study how properties of the dynamics, including the complex structure of the transients and attractors, depend on factors related to connectivity and the intrinsic and synaptic properties of cells within the network.

  9. Neuron-Like Networks Between Ribosomal Proteins Within the Ribosome.

    PubMed

    Poirot, Olivier; Timsit, Youri

    2016-01-01

    From the brain to the World Wide Web, information-processing networks share common scale-invariant properties. Here, we reveal the existence of neural-like networks at a molecular scale within the ribosome. We show that with their extensions, ribosomal proteins form complex assortative interaction networks through which they communicate through tiny interfaces. The analysis of the crystal structures of 50S eubacterial particles reveals that most of these interfaces involve key phylogenetically conserved residues. The systematic observation of interactions between basic and aromatic amino acids at the interfaces and along the extensions provides new structural insights that may contribute to deciphering the molecular mechanisms of signal transmission within or between the ribosomal proteins. Similar to neurons interacting through "molecular synapses", ribosomal proteins form a network that suggests an analogy with a simple molecular brain in which the "sensory-proteins" innervate the functional ribosomal sites, while the "inter-proteins" interconnect them into circuits suitable to process the information flow that circulates during protein synthesis. It is likely that these circuits have evolved to coordinate both the complex macromolecular motions and the binding of the multiple factors during translation. This opens new perspectives on nanoscale information transfer and processing. PMID:27225526

  10. Analyzing Neuronal Networks Using Discrete-Time Dynamics

    PubMed Central

    Ahn, Sungwoo; Smith, Brian H.; Borisyuk, Alla; Terman, David

    2010-01-01

    We develop mathematical techniques for analyzing detailed Hodgkin-Huxley like models for excitatory-inhibitory neuronal networks. Our strategy for studying a given network is to first reduce it to a discrete-time dynamical system. The discrete model is considerably easier to analyze, both mathematically and computationally, and parameters in the discrete model correspond directly to parameters in the original system of differential equations. While these networks arise in many important applications, a primary focus of this paper is to better understand mechanisms that underlie temporally dynamic responses in early processing of olfactory sensory information. The models presented here exhibit several properties that have been described for olfactory codes in an insect's Antennal Lobe. These include transient patterns of synchronization and decorrelation of sensory inputs. By reducing the model to a discrete system, we are able to systematically study how properties of the dynamics, including the complex structure of the transients and attractors, depend on factors related to connectivity and the intrinsic and synaptic properties of cells within the network. PMID:20454529

  11. Recent Developments in VSD Imaging of Small Neuronal Networks

    ERIC Educational Resources Information Center

    Hill, Evan S.; Bruno, Angela M.; Frost, William N.

    2014-01-01

    Voltage-sensitive dye (VSD) imaging is a powerful technique that can provide, in single experiments, a large-scale view of network activity unobtainable with traditional sharp electrode recording methods. Here we review recent work using VSDs to study small networks and highlight several results from this approach. Topics covered include circuit…

  12. Quantification of degeneracy in Hodgkin-Huxley neurons on Newman-Watts small world network.

    PubMed

    Man, Menghua; Zhang, Ya; Ma, Guilei; Friston, Karl; Liu, Shanghe

    2016-08-01

    Degeneracy is a fundamental source of biological robustness, complexity and evolvability in many biological systems. However, degeneracy is often confused with redundancy. Furthermore, the quantification of degeneracy has not been addressed for realistic neuronal networks. The objective of this paper is to characterize degeneracy in neuronal network models via quantitative mathematical measures. Firstly, we establish Hodgkin-Huxley neuronal networks with Newman-Watts small-world network architectures. Secondly, in order to calculate the degeneracy, redundancy and complexity in the ensuing networks, we use information entropy to quantify the information a neuronal response carries about the stimulus - and mutual information to measure the contribution of each subset of the neuronal network. Finally, we analyze the interdependency of degeneracy, redundancy and complexity - and how these three measures depend upon network architectures. Our results suggest that degeneracy can be applied to any neuronal network as a formal measure, and that degeneracy is distinct from redundancy. Qualitatively, degeneracy and complexity are more highly correlated over different network architectures than redundancy and complexity are. Quantitatively, the relationship of both degeneracy and redundancy to complexity depends on network coupling strength: both degeneracy and redundancy increase with complexity for small coupling strengths; however, as coupling strength increases, redundancy decreases with complexity (in contrast to degeneracy, which is relatively invariant). These results suggest that degeneracy is a general topological characteristic of neuronal networks, which could be applied quantitatively in neuroscience and connectomics. PMID:27155043
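
    The quantities in this study are built from entropies and mutual information between the stimulus and (subsets of) the network response. As a minimal illustration of that machinery, the sketch below computes a plug-in estimate of the mutual information I(S;R) between a binary stimulus and the binarized responses of a toy noisy network, for the full population and for its subsets; it is a simplified stand-in, not the degeneracy, redundancy and complexity measures used in the paper.

```python
import itertools
import numpy as np

rng = np.random.default_rng(7)
n_neurons, n_trials, noise = 4, 20000, 0.25

# Toy network: each neuron follows a binary stimulus S with some flip probability (noise)
S = rng.integers(0, 2, size=n_trials)
R = np.array([np.where(rng.uniform(size=n_trials) < noise, 1 - S, S)
              for _ in range(n_neurons)]).T          # shape (trials, neurons)

def entropy(rows):
    """Shannon entropy in bits of the empirical distribution over the given rows."""
    _, counts = np.unique(rows, return_counts=True, axis=0)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(S, R_subset):
    """Plug-in estimate of I(S; R_subset) = H(S) + H(R) - H(S, R) in bits."""
    joint = np.column_stack([S, R_subset])
    return entropy(S) + entropy(R_subset) - entropy(joint)

print(f"I(S; full network) = {mutual_information(S, R):.3f} bits")
for k in (1, 2, 3):
    subsets = list(itertools.combinations(range(n_neurons), k))
    mi = np.mean([mutual_information(S, R[:, list(c)]) for c in subsets])
    print(f"average I(S; subset of size {k}) = {mi:.3f} bits")
```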

  13. Propagation of Spiking and Burst-Spiking Synchronous States in a Feed-Forward Neuronal Network

    NASA Astrophysics Data System (ADS)

    Zhang, Xi; Huang, Hong-Bin; Li, Pei-Jun; Wu, Fang-Ping; Wu, Wang-Jie; Jiang, Min

    2012-12-01

    Neuronal firing that carries information can propagate stably in neuronal networks. One important feature of the stable states is their spatiotemporal correlation (STC) developed in the propagation. The propagation of synchronous states of spiking and burst-spiking neuronal activities in a feed-forward neuronal network with high STC is studied. Different dynamic regions and synchronous regions of the second layer are clarified for spiking and burst-spiking neuronal activities. By calculating correlation, it is found that five layers are needed for stable propagation. Synchronous regions of the 4th layer and the 10th layer are compared.

  14. Multiparametric characterisation of neuronal network activity for in vitro agrochemical neurotoxicity assessment.

    PubMed

    Alloisio, Susanna; Nobile, Mario; Novellino, Antonio

    2015-05-01

    The last few decades have seen the marketing of hundreds of new pesticide products, with a forecasted expansion of the global agrochemical industry. As several pesticides directly target nervous tissue as their mechanism of toxicity, alternative methods to routine in vivo animal testing, such as the Multi Electrode Array (MEA)-based approach, have been proposed as an in vitro tool to perform sensitive, quick and low-cost neurotoxicological screening. Here, we examined the effects of a training set of eleven active substances known to have neuronal or non-neuronal targets, contained in the most commonly used agrochemicals, on the spontaneous electrical activity of cortical neuronal networks grown on MEAs. A multiparametric characterisation of neuronal network firing and bursting was performed with the aim of investigating how this can contribute to the efficient evaluation of in vitro chemical-induced neurotoxicity. The analysis of the MFR, MBR, MBD, MISI_B and % Spikes_B parameters identified four different groups of chemicals: one in which only inhibition is observed (chlorpyrifos, deltamethrin, orysastrobin, dimoxystrobin); a second in which all parameters, except the MISI_B, are inhibited (carbaryl, quinmerac); a third in which increases at low chemical concentration are followed by decreases at high concentration, with the exception of MISI_B, which only decreased (fipronil); and a fourth in which no effects are observed (paraquat, glyphosate, imidacloprid, mepiquat). The overall results demonstrated that the multiparametric description of neuronal network activity makes the MEA-based screening platform an accurate and consistent tool for the evaluation of the toxic potential of chemicals. In particular, among the bursting parameters the MISI_B was the one that correlated best with potency and may help to better define chemical toxicity when the MFR is affected only at relatively high concentrations. PMID:25845298
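
    The parameters analysed (MFR, MBR, MBD, MISI_B, % Spikes_B) are per-channel firing and bursting descriptors computed from the recorded spike trains. The sketch below computes one plausible version of each from a single channel's spike times, using a simple maximum-ISI burst criterion; the burst-detection thresholds and the exact parameter definitions are assumptions and may differ from those used in the study.

```python
import numpy as np

def burst_parameters(spike_times_s, max_isi=0.1, min_spikes=5):
    """Firing/bursting descriptors for one channel; a burst is a run of >= min_spikes spikes
    whose consecutive inter-spike intervals are all below max_isi (seconds)."""
    t = np.sort(spike_times_s)
    duration = t[-1] - t[0]
    isi = np.diff(t)
    bursts, start = [], 0
    for i, run_ends in enumerate(np.append(isi > max_isi, True)):
        if run_ends:                              # the run of closely spaced spikes ends at i
            if i - start + 1 >= min_spikes:
                bursts.append((start, i))
            start = i + 1
    n_burst_spikes = sum(e - s + 1 for s, e in bursts)
    isis_in_bursts = np.concatenate([isi[s:e] for s, e in bursts]) if bursts else np.array([])
    return {
        "MFR (Hz)":         len(t) / duration,
        "MBR (bursts/min)": 60.0 * len(bursts) / duration,
        "MBD (s)":          np.mean([t[e] - t[s] for s, e in bursts]) if bursts else 0.0,
        "MISI_B (s)":       isis_in_bursts.mean() if isis_in_bursts.size else 0.0,
        "% Spikes_B":       100.0 * n_burst_spikes / len(t),
    }

# Synthetic channel: sparse background spikes plus a few dense bursts (times in seconds)
rng = np.random.default_rng(8)
background = rng.uniform(0, 300, 600)
bursty = np.concatenate([b0 + np.cumsum(rng.uniform(0.01, 0.05, 20)) for b0 in (30, 90, 150)])
for name, value in burst_parameters(np.concatenate([background, bursty])).items():
    print(f"{name:18s} {value:.3f}")
```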

  15. From artificial neural networks to spiking neuron populations and back again.

    PubMed

    de Kamps, M; van der Velde, F

    2001-01-01

    In this paper, we investigate the relation between Artificial Neural Networks (ANNs) and networks of populations of spiking neurons. The activity of an artificial neuron is usually interpreted as the firing rate of a neuron or neuron population. Using a model of the visual cortex, we will show that this interpretation runs into serious difficulties. We propose to interpret the activity of an artificial neuron as the steady state of a cross-inhibitory circuit, in which one population codes for 'positive' artificial neuron activity and another for 'negative' activity. We will show that with this interpretation it is possible, under certain circumstances, to transform conventional ANNs (e.g. trained with 'back-propagation') into biologically plausible networks of spiking populations. However, in general, the use of biologically motivated spike response functions introduces artificial neurons that behave differently from the ones used in the classical ANN paradigm. PMID:11665784

  16. Patterning human neuronal networks on photolithographically engineered silicon dioxide substrates functionalized with glial analogues.

    PubMed

    Hughes, Mark A; Brennan, Paul M; Bunting, Andrew S; Cameron, Katherine; Murray, Alan F; Shipston, Mike J

    2014-05-01

    Interfacing neurons with silicon semiconductors is a challenge being tackled through various bioengineering approaches. Such constructs inform our understanding of neuronal coding and learning and ultimately guide us toward creating intelligent neuroprostheses. A fundamental prerequisite is to dictate the spatial organization of neuronal cells. We sought to pattern neurons using photolithographically defined arrays of polymer parylene-C, activated with fetal calf serum. We used a purified human neuronal cell line [Lund human mesencephalic (LUHMES)] to establish whether neurons remain viable when isolated on-chip or whether they require a supporting cell substrate. When cultured in isolation, LUHMES neurons failed to pattern and did not show any morphological signs of differentiation. We therefore sought a cell type with which to prepattern parylene regions, hypothesizing that this cellular template would enable secondary neuronal adhesion and network formation. From a range of cell lines tested, human embryonal kidney (HEK) 293 cells patterned with highest accuracy. LUHMES neurons adhered to pre-established HEK 293 cell clusters and this coculture environment promoted morphological differentiation of neurons. Neurites extended between islands of adherent cell somata, creating an orthogonally arranged neuronal network. HEK 293 cells appear to fulfill a role analogous to glia, dictating cell adhesion, and generating an environment conducive to neuronal survival. We next replaced HEK 293 cells with slower growing glioma-derived precursors. These primary human cells patterned accurately on parylene and provided a similarly effective scaffold for neuronal adhesion. These findings advance the use of this microfabrication-compatible platform for neuronal patterning. PMID:23733444

  17. Search for periodicity in observational data by means of artificial neural networks

    NASA Astrophysics Data System (ADS)

    Baluev, R.

    2012-05-01

    We consider the applicability of artificial neural networks to two classical problems of observational data reduction: (i) detecting a periodic oscillation in a noisy time series and (ii) estimating the frequency of that oscillation from the same time series. The values of the time series are fed to the network inputs, and the output yields either an indicator of the presence of a signal (from 0 to 1) or an estimate of its frequency. It is shown that the theoretical limit which a neural network can attain when trained on such problems corresponds to the Bayesian theory of estimation and hypothesis testing. The networks were trained with the open-source package FANN. The best results were achieved with the Cascade2 algorithm, which finds the optimal number of neurons in the network rather than only the connection weights between them. In contrast to traditional periodogram-based methods, which require lengthy calculations, the trained neural network works almost instantly, making artificial neural networks very promising for processing large data sets. However, the signal-detection threshold has so far not reached the Bayesian theoretical limit, and it has not yet been possible to train a network to analyse time series with an arbitrarily uneven distribution of observations. This indicates that further work is needed to improve the efficiency of the method.
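
    The original study trained its networks with the open-source FANN package and the Cascade2 algorithm. As a rough, hedged analogue in Python, the sketch below trains scikit-learn's MLPClassifier to flag the presence of a periodic signal in short, evenly sampled, noisy time series; the architecture, signal amplitude and frequency range are illustrative choices only.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        n_points, n_series = 64, 4000
        t = np.arange(n_points)

        def make_series(periodic):
            x = rng.normal(0.0, 1.0, n_points)            # white noise
            if periodic:
                f = rng.uniform(0.02, 0.4)                # cycles per sample
                x += np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))
            return (x - x.mean()) / x.std()               # normalise each series

        y = rng.integers(0, 2, n_series)
        X = np.array([make_series(label) for label in y])

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
        clf.fit(X_tr, y_tr)
        print("detection accuracy on held-out series:", clf.score(X_te, y_te))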

  18. Energy-efficient population coding constrains network size of a neuronal array system

    PubMed Central

    Yu, Lianchun; Zhang, Chi; Liu, Liwei; Yu, Yuguo

    2016-01-01

    We consider the open issue of how the energy efficiency of the neural information transmission process, in a general neuronal array, constrains the network size, and how well this network size ensures the reliable transmission of neural information in a noisy environment. By direct mathematical analysis, we have obtained general solutions proving that there exists an optimal number of neurons in the network, where the average coding energy cost (defined as energy consumption divided by mutual information) per neuron passes through a global minimum for both subthreshold and suprathreshold signals. With increases in background noise intensity, the optimal neuronal number decreases for subthreshold signals and increases for suprathreshold signals. The existence of an optimal number of neurons in an array network reveals a general rule for population coding that states that the neuronal number should be large enough to ensure reliable information transmission that is robust to the noisy environment but small enough to minimize energy cost. PMID:26781354
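
    The qualitative U-shape of the coding energy cost can be reproduced with a toy calculation, sketched below. The functional forms (a fixed maintenance cost plus a per-neuron cost, and mutual information that saturates with network size) and all constants are assumptions made for illustration; they are not the model analysed in the paper.

        import numpy as np

        def coding_cost(N, e_fix=5.0, e_per_neuron=1.0, i_max=1.0, n_sat=50.0):
            """Toy energy cost per bit: assumed fixed maintenance cost plus a
            per-neuron cost, divided by a saturating mutual-information curve."""
            energy = e_fix + e_per_neuron * N                 # total energy consumption
            info = i_max * (1.0 - np.exp(-N / n_sat))         # mutual information (bits)
            return energy / info

        N = np.arange(1, 1001)
        cost = coding_cost(N)
        print("toy optimal neuron number:", N[np.argmin(cost)])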

  1. Integration of neuroblasts into a two-dimensional small world neuronal network

    NASA Astrophysics Data System (ADS)

    Schneider-Mizell, Casey; Zochowski, Michal; Sander, Leonard

    2009-03-01

    Neurogenesis in the adult brain has been suggested to be important for learning and for functional robustness to neuronal death. New neurons integrate themselves into existing neuronal networks by moving into a target destination, extending axonal and dendritic processes, and inducing synaptogenesis to connect to active neurons. We hypothesize that increased plasticity of the network to novel stimuli can arise from activity-dependent cell and process motility rules. Complementing a similar in vitro model, we investigate a computational model of a two-dimensional small-world network of integrate-and-fire neurons. After steady-state activity is reached in the extant network, we introduce new neurons which move, stop, and connect themselves through rules governed by position and firing rate.

  2. Neuronal oscillations and functional interactions between resting state networks.

    PubMed

    Lei, Xu; Wang, Yulin; Yuan, Hong; Mantini, Dante

    2014-07-01

    Functional magnetic resonance imaging (fMRI) studies have shown that resting state activity in the healthy brain is organized into multiple large-scale networks encompassing distant regions. A key finding of resting state fMRI studies is the anti-correlation typically observed between the dorsal attention network (DAN) and the default mode network (DMN), which - during task performance - are activated and deactivated, respectively. Previous studies have suggested that alcohol administration modulates the balance of activation/deactivation in brain networks and induces significant changes in oscillatory activity measured by electroencephalography (EEG). However, our knowledge of alcohol-induced changes in band-limited EEG power and their potential link with the functional interactions between DAN and DMN is still very limited. Here we address this issue, examining the neuronal effects of alcohol administration during resting state by using simultaneous EEG-fMRI. Our findings show increased EEG power in the theta frequency band (4-8 Hz) after administration of alcohol compared to placebo, which was prominent over the frontal cortex. More interestingly, increased frontal tonic EEG activity in this band was associated with greater anti-correlation between the DAN and the frontal component of the DMN. Furthermore, EEG theta power and DAN-DMN anti-correlation were relatively greater in subjects who reported a feeling of euphoria after alcohol administration, which may result from a diminished inhibition exerted by the prefrontal cortex. Overall, our findings suggest that slow brain rhythms are responsible for dynamic functional interactions between brain networks. They also confirm the applicability and potential usefulness of EEG-fMRI for central nervous system drug research. PMID:25050432

  3. Numbers And Gains Of Neurons In Winner-Take-All Networks

    NASA Technical Reports Server (NTRS)

    Brown, Timothy X.

    1993-01-01

    Report presents theoretical study of gains required in neurons to implement winner-take-all electronic neural network of given size and related question of maximum size of winner-take-all network in which neurons have specified sigmoid transfer or response function with specified gain.

  4. Analysis of connectivity map: Control to glutamate injured and phenobarbital treated neuronal network

    NASA Astrophysics Data System (ADS)

    Kamal, Hassan; Kanhirodan, Rajan; Srinivas, Kalyan V.; Sikdar, Sujit K.

    2010-04-01

    We study the responses of a cultured neural network exposed to epileptogenic glutamate injury, which causes epilepsy, and to subsequent treatment with phenobarbital, by constructing a connectivity map of the neurons from the correlation matrix. This study is particularly useful for understanding the changes that pharmaceutical drugs induce in neuronal network properties, with insights into changes at the systems biology level.
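
    A minimal sketch of the correlation-matrix approach is shown below: spike trains are binned, the pairwise Pearson correlation matrix is computed, and a binary connectivity map is obtained by thresholding. The bin width and correlation threshold are illustrative values, not those used in the study.

        import numpy as np

        def connectivity_map(spike_trains, t_max, bin_size=0.05, threshold=0.3):
            """Bin each neuron's spikes, compute the pairwise Pearson correlation
            matrix, and threshold it into a binary connectivity map."""
            edges = np.arange(0.0, t_max + bin_size, bin_size)
            counts = np.array([np.histogram(s, bins=edges)[0] for s in spike_trains])
            corr = np.corrcoef(counts)
            np.fill_diagonal(corr, 0.0)
            return corr, (np.abs(corr) >= threshold).astype(int)

        # two correlated neurons and one independent neuron (synthetic data)
        rng = np.random.default_rng(2)
        base = np.sort(rng.uniform(0, 100, 400))
        trains = [base,
                  base + rng.normal(0, 0.005, base.size),   # jittered copy -> correlated
                  np.sort(rng.uniform(0, 100, 400))]        # independent
        corr, adjacency = connectivity_map(trains, t_max=100.0)
        print(adjacency)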

  5. Connectivity, excitability and activity patterns in neuronal networks

    NASA Astrophysics Data System (ADS)

    le Feber, Joost; Stoyanova, Irina I.; Chiappalone, Michela

    2014-06-01

    Extremely synchronized firing patterns such as those observed in brain diseases like epilepsy may result from excessive network excitability. Although network excitability is closely related to (excitatory) connectivity, a direct measure for network excitability remains unavailable. Several methods currently exist for estimating network connectivity, most of which are related to cross-correlation. An example is the conditional firing probability (CFP) analysis which calculates the pairwise probability (CFP_ij) that electrode j records an action potential at time t = τ, given that electrode i recorded a spike at t = 0. However, electrode i often records multiple spikes within the analysis interval, and CFP values are biased by the on-going dynamic state of the network. Here we show that in a linear approximation this bias may be removed by deconvoluting CFP_ij with the autocorrelation of i (i.e. CFP_ii), to obtain the single pulse response (SPR_ij)—the average response at electrode j to a single spike at electrode i. Thus, in a linear system SPRs would be independent of the dynamic network state. Nonlinear components of synaptic transmission, such as facilitation and short term depression, will however still affect SPRs. Therefore SPRs provide a clean measure of network excitability. We used carbachol and ghrelin to moderately activate cultured cortical networks to affect their dynamic state. Both neuromodulators transformed the bursting firing patterns of the isolated networks into more dispersed firing. We show that the influence of the dynamic state on SPRs is much smaller than the effect on CFPs, but not zero. The remaining difference reflects the alteration in network excitability. We conclude that SPRs are less contaminated by the dynamic network state and that mild excitation may decrease network excitability, possibly through short term synaptic depression.
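
    The deconvolution step can be sketched as follows: CFP_ij(tau) is estimated by histogramming the latencies of spikes on electrode j after each spike on electrode i, and SPR_ij is then recovered by a regularised frequency-domain division of CFP_ij by CFP_ii. The bin width, lag range and regularisation constant below are illustrative and do not reproduce the authors' exact procedure.

        import numpy as np

        def cfp(spikes_i, spikes_j, max_lag=0.5, bin_size=0.005):
            """Conditional firing probability: probability that electrode j fires
            in the bin at lag tau after a spike on electrode i (tau = 0 .. max_lag)."""
            lags = np.arange(0.0, max_lag, bin_size)
            counts = np.zeros(lags.size)
            for t0 in spikes_i:
                rel = spikes_j - t0
                rel = rel[(rel >= 0.0) & (rel < max_lag)]
                counts += np.histogram(rel, bins=np.append(lags, max_lag))[0]
            return counts / max(len(spikes_i), 1)

        def spr(cfp_ij, cfp_ii, eps=1e-2):
            """Regularised frequency-domain deconvolution of CFP_ij by CFP_ii."""
            F_ij, F_ii = np.fft.rfft(cfp_ij), np.fft.rfft(cfp_ii)
            return np.fft.irfft(F_ij * np.conj(F_ii) / (np.abs(F_ii) ** 2 + eps),
                                n=cfp_ij.size)

        # usage with spike-time arrays of two electrodes i and j (hypothetical data):
        # response_ij = spr(cfp(spk_i, spk_j), cfp(spk_i, spk_i))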

  6. Communication Network Analysis Methods.

    ERIC Educational Resources Information Center

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  7. Simulation of restricted neural networks with reprogrammable neurons

    SciTech Connect

    Hartline, D.K.

    1989-05-01

    This paper describes a network model composed of reprogrammable neurons. It incorporates the following design features: spikes can be generated by a model representing repetitive firing at axon (and dendritic) trigger zones; active responses (plateau potentials; delaying mechanisms) are simulated with Hodgkin-Huxley-type kinetics; synaptic interactions, both spike-mediated and non-spiking chemical ('chemotonic'), simulate transmitter release and binding to postsynaptic receptors. Facilitation and antifacilitation of spike-mediated postsynaptic potentials (PSPs) are included. Chemical pools are used to simulate second messenger systems, trapping of ions in extracellular spaces, and electrogenic pumps, as well as biochemical reaction chains of quite general character. Modulation of any of the parameters of any compartment can be effected through the pools. Intracellular messengers of three kinds are simulated explicitly: those produced by voltage-gated processes (e.g. Ca); those dependent on transmitter (or hormone) binding; and those dependent on other internal messengers (e.g., internally released Ca; enzymatically activated pathways).

  8. A Simple Neuron Network Based on Hebb's Rule

    NASA Astrophysics Data System (ADS)

    Zhang, Gui-Qing; Yu, Zi; Yang, Qiu-Ying; Chen, Tian-Lun

    A weighted mechanism in neural networks is studied. This paper focuses on the behavior of neurons in an area of the brain. Our model can reproduce the power-law behavior and finite-size effects of neural avalanches. The probability density functions (PDFs) of the neural avalanche size at different times (lattice sizes) have fat tails with a q-Gaussian shape and the same value of the parameter q in the thermodynamic limit. These two kinds of behavior show that our neural model can well exhibit self-organized critical behavior. The robustness of the PDFs shows the stability of the self-organized criticality. Meanwhile, the avalanche scaling relation of the waiting time has also been found.

  9. FPGA implementation of a biological neural network based on the Hodgkin-Huxley neuron model

    PubMed Central

    Yaghini Bonabi, Safa; Asgharian, Hassan; Safari, Saeed; Nili Ahmadabadi, Majid

    2014-01-01

    A set of techniques for the efficient implementation of a Hodgkin-Huxley-based (H-H) neural network model on an FPGA (Field Programmable Gate Array) is presented. The central implementation challenge is the complexity of the H-H model, which puts limits on the network size and on the execution speed. However, the basics of the original model cannot be compromised when the effect of synaptic specifications on network behavior is the subject of study. To solve the problem, we used computational techniques such as the CORDIC (Coordinate Rotation Digital Computer) algorithm and step-by-step integration in the implementation of the arithmetic circuits. In addition, we employed techniques such as resource sharing to preserve the details of the model and increase the network size while keeping the network execution speed close to real time with high precision. The implementation of a two-mini-column network with 120 excitatory and 30 inhibitory neurons is provided to investigate the characteristics of our method in practice. The implementation techniques provide an opportunity to construct large FPGA-based network models to investigate the effect of different neurophysiological mechanisms, like voltage-gated channels and synaptic activities, on the behavior of a neural network in an appropriate execution time. In addition to inherent properties of FPGAs, like parallelism and re-configurability, our approach makes the FPGA-based system a proper candidate for studies on the neural control of cognitive robots and systems as well. PMID:25484854
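
    For reference, a minimal software counterpart of the 'step-by-step integration' is a forward-Euler simulation of the standard Hodgkin-Huxley point neuron, sketched below with the classic squid-axon parameter set. Nothing FPGA- or CORDIC-specific is implied; the constant input current and the spike-detection threshold are illustrative.

        import numpy as np

        # Classic Hodgkin-Huxley point neuron, forward-Euler integration.
        C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3          # uF/cm^2, mS/cm^2
        ENa, EK, EL = 50.0, -77.0, -54.4                # mV

        # rate functions (1/ms); removable singularities at V = -40 and -55 mV
        def a_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
        def b_m(V): return 4.0 * np.exp(-(V + 65.0) / 18.0)
        def a_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
        def b_h(V): return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
        def a_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
        def b_n(V): return 0.125 * np.exp(-(V + 65.0) / 80.0)

        dt, T, I_ext = 0.01, 50.0, 10.0                 # ms, ms, uA/cm^2
        V, m, h, n = -65.0, 0.05, 0.6, 0.32             # resting initial conditions
        spikes = 0
        for step in range(int(T / dt)):
            I_ion = (gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK) + gL * (V - EL))
            V_new = V + dt / C * (I_ext - I_ion)
            m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
            h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
            n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
            if V < 0.0 <= V_new:                        # crude spike count at 0 mV crossing
                spikes += 1
            V = V_new
        print("spikes in", T, "ms:", spikes)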

  11. Ca^2+ Dynamics and Propagating Waves in Neural Networks with Excitatory and Inhibitory Neurons.

    NASA Astrophysics Data System (ADS)

    Bondarenko, Vladimir E.

    2008-03-01

    The dynamics of neural spikes, intracellular Ca^2+, and Ca^2+ in intracellular stores were investigated both in isolated Chay neurons and in neurons coupled in networks. Three types of neural networks were studied: a purely excitatory neural network, with only excitatory (AMPA) synapses; a purely inhibitory neural network, with only inhibitory (GABA) synapses; and a hybrid neural network, with both AMPA and GABA synapses. In the hybrid neural network, the ratio of excitatory to inhibitory neurons was 4:1. For each case, we considered two types of connections, "all-with-all" and 20 connections per neuron. Each neural network contained 100 neurons with randomly distributed connection strengths. In the neural networks with "all-with-all" connections and AMPA/GABA synapses, an increase in average synaptic strength yielded bursting activity with an increased/decreased number of spikes per burst. The neural bursts and Ca^2+ transients were synchronous at relatively large connection strengths despite the random connection strengths. Simulations of the neural networks with 20 connections per neuron and with only AMPA synapses showed synchronous oscillations, while the neural networks with GABA or hybrid synapses generated propagating waves of membrane potential and Ca^2+ transients.

  12. Self-organization of synchronous activity propagation in neuronal networks driven by local excitation.

    PubMed

    Bayati, Mehdi; Valizadeh, Alireza; Abbassian, Abdolhossein; Cheng, Sen

    2015-01-01

    Many experimental and theoretical studies have suggested that the reliable propagation of synchronous neural activity is crucial for neural information processing. The propagation of synchronous firing activity in so-called synfire chains has been studied extensively in feed-forward networks of spiking neurons. However, it remains unclear how such neural activity could emerge in recurrent neuronal networks through synaptic plasticity. In this study, we investigate whether local excitation, i.e., neurons that fire at a higher frequency than the other, spontaneously active neurons in the network, can shape a network to allow for synchronous activity propagation. We use two-dimensional, locally connected and heterogeneous neuronal networks with spike-timing dependent plasticity (STDP). We find that, in our model, local excitation drives profound network changes within seconds. In the emergent network, neural activity propagates synchronously through the network. This activity originates from the site of the local excitation and propagates through the network. The synchronous activity propagation persists, even when the local excitation is removed, since it derives from the synaptic weight matrix. Importantly, once this connectivity is established it remains stable even in the presence of spontaneous activity. Our results suggest that synfire-chain-like activity can emerge in a relatively simple way in realistic neural networks by locally exciting the desired origin of the neuronal sequence. PMID:26089794
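
    The plasticity rule at the heart of such models can be sketched as a pair-based STDP update with exponential windows, shown below; the amplitudes and time constants are generic textbook values rather than those of the paper's two-dimensional, locally connected network.

        import numpy as np

        def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
            """Pair-based STDP: delta_t = t_post - t_pre (ms).
            Pre-before-post (delta_t >= 0) potentiates, post-before-pre depresses."""
            delta_t = np.asarray(delta_t, dtype=float)
            return np.where(delta_t >= 0.0,
                            a_plus * np.exp(-delta_t / tau_plus),
                            -a_minus * np.exp(delta_t / tau_minus))

        def total_dw(pre_spikes, post_spikes):
            """Accumulate weight changes over all pre/post spike pairs of one synapse."""
            dt_pairs = np.subtract.outer(post_spikes, pre_spikes).ravel()
            return stdp_dw(dt_pairs).sum()

        print(total_dw(np.array([10.0, 30.0]), np.array([12.0, 29.0])))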

  13. Experiments in clustered neuronal networks: A paradigm for complex modular dynamics

    NASA Astrophysics Data System (ADS)

    Teller, Sara; Soriano, Jordi

    2016-06-01

    Uncovering the interplay between activity and connectivity is one of the major challenges in neuroscience. To deepen the understanding of how a neuronal circuit shapes network dynamics, neuronal cultures have emerged as remarkable systems given their accessibility and easy manipulation. An attractive configuration of these in vitro systems consists of an ensemble of interconnected clusters of neurons. Using calcium fluorescence imaging to monitor spontaneous activity in these clustered neuronal networks, we were able to draw functional maps and reveal their topological features. We also observed that these networks exhibit a hierarchical modular dynamics, in which clusters fire in small groups that shape characteristic communities in the network. The structure and stability of these communities are sensitive to chemical or physical action, and therefore their analysis may serve as a proxy for network health. Indeed, the combination of all these approaches is helping to develop models to quantify damage upon network degradation, with promising applications for the study of neurological disorders in vitro.

  14. Microbial light-activatable proton pumps as neuronal inhibitors to functionally dissect neuronal networks in C. elegans.

    PubMed

    Husson, Steven J; Liewald, Jana F; Schultheis, Christian; Stirman, Jeffrey N; Lu, Hang; Gottschalk, Alexander

    2012-01-01

    Essentially any behavior in simple and complex animals depends on neuronal network function. Currently, the best-defined system to study neuronal circuits is the nematode Caenorhabditis elegans, as the connectivity of its 302 neurons is exactly known. Individual neurons can be activated by photostimulation of Channelrhodopsin-2 (ChR2) using blue light, allowing one to directly probe the importance of a particular neuron for the respective behavioral output of the network under study. Analogously, other excitable cells can be inhibited by expressing Halorhodopsin from Natronomonas pharaonis (NpHR) and subsequent illumination with yellow light. However, inhibiting C. elegans neurons using NpHR is difficult. Recently, proton pumps from various sources were established as valuable alternative hyperpolarizers. Here we show that archaerhodopsin-3 (Arch) from Halorubrum sodomense and a proton pump from the fungus Leptosphaeria maculans (Mac) can be utilized to effectively inhibit excitable cells in C. elegans. Arch is the most powerful hyperpolarizer when illuminated with yellow or green light, while the action spectrum of Mac is more blue-shifted, as analyzed by light-evoked behaviors and electrophysiology. This allows these tools to be combined in various ways with ChR2 to analyze different subsets of neurons within a circuit. We exemplify this by means of the polymodal aversive sensory ASH neurons, and the downstream command interneurons to which ASH neurons signal to trigger a reversal followed by a directional turn. Photostimulating ASH and subsequently inhibiting the command interneurons, using two-color illumination of different body segments, allows us to investigate temporal aspects of signaling downstream of ASH. PMID:22815873

  15. The role of electrical coupling in generating and modulating oscillations in a neuronal network.

    PubMed

    Mouser, Christina; Bose, Amitabha; Nadim, Farzan

    2016-08-01

    A simplified model of the crustacean gastric mill network is considered. Rhythmic activity in this network has largely been attributed to half center oscillations driven by mutual inhibition. We use mathematical modeling and dynamical systems theory to show that rhythmic oscillations in this network may also depend on, or even arise from, a voltage-dependent electrical coupling between one of the cells in the half-center network and a projection neuron that lies outside of the network. This finding uncovers a potentially new mechanism for the generation of oscillations in neuronal networks. PMID:27188714

  16. The effect of noise on a neural network with spiking neurons

    NASA Astrophysics Data System (ADS)

    Inchiosa, Mario E.

    1993-08-01

    We study a class of neural network associative memories which include noise and transmission delays, code information in the timing of spikes, use long-range Hebbian couplings plus local, inhibitory couplings, and feature low, biologically realistic neuronal activity. Recall of a pattern consists of a synchronized, periodic firing of neurons. We find a Lyapunov functional for the noiseless network dynamics, and using statistical mechanics and numerical simulation, we find that noisy dynamics improves the network's ability to discriminate stored from unknown patterns.

  17. Autaptic self-feedback-induced synchronization transitions in Newman-Watts neuronal network with time delays

    NASA Astrophysics Data System (ADS)

    Wang, Qi; Gong, Yubing; Wu, Yanan

    2015-04-01

    An autapse is a special synapse that connects a neuron to itself. In this work, we numerically study the effect of chemical autapses on the synchronization of a Newman-Watts Hodgkin-Huxley neuron network with time delays. It is found that the neurons exhibit synchronization transitions as the autaptic self-feedback delay is varied, and this phenomenon is enhanced when the autaptic self-feedback strength increases. Moreover, the phenomenon becomes strongest when the network time delay or the coupling strength is optimal. It is also found that the synchronization transitions induced by the network time delay can be enhanced by autaptic activity and become strongest when the autaptic delay is optimal. These results show that autaptic delayed self-feedback can intermittently enhance and reduce the synchronization of the neuronal network and hence plays an important role in regulating the synchronization of the neurons. These findings may have implications for information processing and transmission in neural systems.

  18. Effects of glial release and somatic receptors on bursting in synchronized neuronal networks

    NASA Astrophysics Data System (ADS)

    Zhan, Xuan; Lai, Pik-Yin; Chan, C. K.

    2011-07-01

    A model is constructed to study the phenomenon of bursting in cultured neuronal networks by considering the effects of glial release and the extrasynaptic receptors on neurons. In the frequently observed situations of synchronized bursting, the whole neuronal network can be described by a mean-field model. In this model, the dynamics of the synchronized network in the presence of glia is represented by an effective two-compartment neuron with stimulations on both the dendrite and soma. Numerical simulations of this model show that most of the experimental observations in bursting, in particular the high plateau and the slow repolarization, can be reproduced. Our findings suggest that the effects of glial release and extrasynaptic receptors, which are usually neglected in neuronal models, can become important during intense network activity. Furthermore, simulations of the model are also performed for the case of glia-suppressed cultures to compare with recent experimental results.

  19. Effects of channel noise on firing coherence of small-world Hodgkin-Huxley neuronal networks

    NASA Astrophysics Data System (ADS)

    Sun, X. J.; Lei, J. Z.; Perc, M.; Lu, Q. S.; Lv, S. J.

    2011-01-01

    We investigate the effects of channel noise on the firing coherence of Watts-Strogatz small-world networks consisting of biophysically realistic Hodgkin-Huxley (HH) neurons in which a fraction of the voltage-gated sodium and potassium ion channels embedded in their membranes is blocked. The intensity of channel noise is determined by the number of non-blocked ion channels, which depends on the fraction of working ion channels and on the membrane patch size, under the assumption of a homogeneous ion channel density. We find that the firing coherence of the neuronal network can be either enhanced or reduced depending on the source of channel noise. As shown in this paper, sodium channel noise reduces the firing coherence of neuronal networks; in contrast, potassium channel noise enhances it. Furthermore, compared with potassium channel noise, sodium channel noise plays a dominant role in affecting the firing coherence of the neuronal network. Moreover, we find that the observed phenomena are independent of the rewiring probability.
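
    The dependence of channel-noise intensity on patch size and on the fraction of working channels can be sketched as below. The assumed homogeneous channel densities are common textbook figures for HH-type membrane, and the 1/sqrt(N) scaling of relative conductance fluctuations is a standard heuristic; neither is necessarily the exact parameterisation used in the study.

        import numpy as np

        # Assumed homogeneous channel densities (channels per um^2); common textbook
        # figures for HH-type membrane, not taken from the paper.
        RHO_NA, RHO_K = 60.0, 18.0

        def working_channels(patch_area_um2, x_na=1.0, x_k=1.0):
            """Numbers of non-blocked Na+ and K+ channels for a membrane patch,
            given the working fractions x_na, x_k (1.0 = no channels blocked)."""
            n_na = x_na * RHO_NA * patch_area_um2
            n_k = x_k * RHO_K * patch_area_um2
            return n_na, n_k

        def relative_noise(n_channels):
            """Heuristic: relative conductance fluctuation scales as 1/sqrt(N)."""
            return 1.0 / np.sqrt(n_channels)

        for area in (1.0, 10.0, 100.0):
            n_na, n_k = working_channels(area, x_na=0.5, x_k=1.0)   # half of Na channels blocked
            print(f"area {area:6.1f} um^2: N_Na={n_na:8.0f} (noise~{relative_noise(n_na):.3f}), "
                  f"N_K={n_k:8.0f} (noise~{relative_noise(n_k):.3f})")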

  20. Human Neuron Cultures: Micropatterning Facilitates the Long-Term Growth and Analysis of iPSC-Derived Individual Human Neurons and Neuronal Networks (Adv. Healthcare Mater. 15/2016).

    PubMed

    Burbulla, Lena F; Beaumont, Kristin G; Mrksich, Milan; Krainc, Dimitri

    2016-08-01

    Dimitri Krainc, Milan Mrksich, and co-workers demonstrate the utility of microcontact printing technology for culturing of human neurons in defined patterns over extended periods of time on page 1894. This approach facilitates studies of neuronal development, cellular trafficking, and related mechanisms that require assessment of individual neurons and neuronal networks. PMID:27511952

  1. Parametric Anatomical Modeling: a method for modeling the anatomical layout of neurons and their projections.

    PubMed

    Pyka, Martin; Klatt, Sebastian; Cheng, Sen

    2014-01-01

    Computational models of neural networks can be based on a variety of different parameters. These parameters include, for example, the 3d shape of neuron layers, the neurons' spatial projection patterns, spiking dynamics and neurotransmitter systems. While many well-developed approaches are available to model, for example, the spiking dynamics, there is a lack of approaches for modeling the anatomical layout of neurons and their projections. We present a new method, called Parametric Anatomical Modeling (PAM), to fill this gap. PAM can be used to derive network connectivities and conduction delays from anatomical data, such as the position and shape of the neuronal layers and the dendritic and axonal projection patterns. Within the PAM framework, several mapping techniques between layers can account for a large variety of connection properties between pre- and post-synaptic neuron layers. PAM is implemented as a Python tool and integrated in the 3d modeling software Blender. We demonstrate on a 3d model of the hippocampal formation how PAM can help reveal complex properties of the synaptic connectivity and conduction delays, properties that might be relevant to uncover the function of the hippocampus. Based on these analyses, two experimentally testable predictions arose: (i) the number of neurons and the spread of connections is heterogeneously distributed across the main anatomical axes, (ii) the distribution of connection lengths in CA3-CA1 differ qualitatively from those between DG-CA3 and CA3-CA3. Models created by PAM can also serve as an educational tool to visualize the 3d connectivity of brain regions. The low-dimensional, but yet biologically plausible, parameter space renders PAM suitable to analyse allometric and evolutionary factors in networks and to model the complexity of real networks with comparatively little effort. PMID:25309338
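
    Independent of PAM's Blender-based interface, the core idea of deriving conduction delays from anatomical layout can be sketched as below: delays between two neuron layers are computed from Euclidean distances between 3d positions, an assumed conduction velocity and a fixed synaptic delay. Function names, the velocity and the delay offset are illustrative and are not PAM's API.

        import numpy as np

        def layer_to_layer_delays(pre_positions, post_positions, velocity_mm_ms=0.5,
                                  synaptic_delay_ms=1.0):
            """Conduction delay matrix (ms) between two neuron layers, computed from
            Euclidean distances (mm) along an assumed straight projection path."""
            diff = pre_positions[:, None, :] - post_positions[None, :, :]
            dist = np.linalg.norm(diff, axis=-1)
            return dist / velocity_mm_ms + synaptic_delay_ms

        rng = np.random.default_rng(3)
        pre = rng.uniform(0.0, 2.0, (5, 3))     # 5 presynaptic neurons in a 2 mm cube
        post = rng.uniform(3.0, 5.0, (4, 3))    # 4 postsynaptic neurons in a distant cube
        print(np.round(layer_to_layer_delays(pre, post), 2))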

  3. Intermittent chaos, self-organization, and learning from synchronous synaptic activity in model neuron networks.

    PubMed Central

    Hoppensteadt, F C

    1989-01-01

    Self-organization of frequencies is studied by using model neurons called VCONs (voltage-controlled oscillator neuron models). These models give direct access to frequency information, in contrast to all-or-none neuron models, and they generate voltage spikes that phase-lock to oscillatory stimulation, similar to phase-locking of action potentials to oscillatory voltage stimulation observed in Hodgkin-Huxley preparations of squid axons. The rotation vector method is described and used to study how networks synchronize, even in the presence of noise or when damaged; the entropy of ratios of phases is used to construct an energy function that characterizes organized behavior. Computer simulations show that rotation numbers (output frequency/input frequency) describe both chaotic and nonchaotic behavior. Learning occurs when synaptic connections strengthen in response to stimulation that is synchronous with cell activity. It is shown that intermittent chaotic firing is suppressed and simple stable responses are enhanced by such learning in VCON networks. This analysis provides a rigorous basis for further investigation of the ideas of Wiener [Wiener, N. (1961) Cybernetics (MIT Press, Cambridge, MA), p. 191] on the origin of slow brain waves due to "the pulling together of frequencies." PMID:2717606

  4. Granger Causality Network Reconstruction of Conductance-Based Integrate-and-Fire Neuronal Systems

    PubMed Central

    Zhou, Douglas; Xiao, Yanyang; Zhang, Yaoyu; Xu, Zhiqin; Cai, David

    2014-01-01

    Reconstruction of anatomical connectivity from measured dynamical activities of coupled neurons is one of the fundamental issues in the understanding of structure-function relationship of neuronal circuitry. Many approaches have been developed to address this issue based on either electrical or metabolic data observed in experiment. The Granger causality (GC) analysis remains one of the major approaches to explore the dynamical causal connectivity among individual neurons or neuronal populations. However, it is yet to be clarified how such causal connectivity, i.e., the GC connectivity, can be mapped to the underlying anatomical connectivity in neuronal networks. We perform the GC analysis on the conductance-based integrate-and-fire (IF) neuronal networks to obtain their causal connectivity. Through numerical experiments, we find that the underlying synaptic connectivity amongst individual neurons or subnetworks, can be successfully reconstructed by the GC connectivity constructed from voltage time series. Furthermore, this reconstruction is insensitive to dynamical regimes and can be achieved without perturbing systems and prior knowledge of neuronal model parameters. Surprisingly, the synaptic connectivity can even be reconstructed by merely knowing the raster of systems, i.e., spike timing of neurons. Using spike-triggered correlation techniques, we establish a direct mapping between the causal connectivity and the synaptic connectivity for the conductance-based IF neuronal networks, and show the GC is quadratically related to the coupling strength. The theoretical approach we develop here may provide a framework for examining the validity of the GC analysis in other settings. PMID:24586285
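
    A minimal pairwise Granger-causality estimate on voltage-like time series can be sketched as below: for each direction, an autoregressive model of the target using only its own past (restricted) is compared with one that also uses the driver's past (full), and GC is taken as the log ratio of residual variances. The model order and the synthetic test data are illustrative; this is not the conductance-based IF pipeline of the paper.

        import numpy as np

        def granger_xy(x, y, order=5):
            """Granger causality from x to y: compare the residual variance of an AR
            model of y using only y's past (restricted) with one that also uses x's
            past (full). GC = log(var_restricted / var_full)."""
            n = len(y)
            Y = y[order:]
            lag = lambda s: np.array([s[t - order:t][::-1] for t in range(order, n)])
            A_r = np.column_stack([np.ones(n - order), lag(y)])
            A_f = np.column_stack([A_r, lag(x)])
            res_r = Y - A_r @ np.linalg.lstsq(A_r, Y, rcond=None)[0]
            res_f = Y - A_f @ np.linalg.lstsq(A_f, Y, rcond=None)[0]
            return np.log(np.var(res_r) / np.var(res_f))

        # synthetic test: x drives y with a one-step lag, y does not drive x
        rng = np.random.default_rng(4)
        x = rng.normal(size=5000)
        y = np.zeros_like(x)
        for t in range(1, len(x)):
            y[t] = 0.8 * x[t - 1] + 0.3 * y[t - 1] + 0.1 * rng.normal()
        print("GC x->y:", round(granger_xy(x, y), 3), " GC y->x:", round(granger_xy(y, x), 3))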

  5. The Role of Adult-Born Neurons in the Constantly Changing Olfactory Bulb Network

    PubMed Central

    Malvaut, Sarah; Saghatelyan, Armen

    2016-01-01

    The adult mammalian brain is remarkably plastic and constantly undergoes structurofunctional modifications in response to environmental stimuli. In many regions plasticity is manifested by modifications in the efficacy of existing synaptic connections or synapse formation and elimination. In a few regions, however, plasticity is brought by the addition of new neurons that integrate into established neuronal networks. This type of neuronal plasticity is particularly prominent in the olfactory bulb (OB) where thousands of neuronal progenitors are produced on a daily basis in the subventricular zone (SVZ) and migrate along the rostral migratory stream (RMS) towards the OB. In the OB, these neuronal precursors differentiate into local interneurons, mature, and functionally integrate into the bulbar network by establishing output synapses with principal neurons. Despite continuous progress, it is still not well understood how normal functioning of the OB is preserved in the constantly remodelling bulbar network and what role adult-born neurons play in odor behaviour. In this review we will discuss different levels of morphofunctional plasticity effected by adult-born neurons and their functional role in the adult OB and also highlight the possibility that different subpopulations of adult-born cells may fulfill distinct functions in the OB neuronal network and odor behaviour. PMID:26839709

  6. Developmental Transcriptional Networks Are Required to Maintain Neuronal Subtype Identity in the Mature Nervous System

    PubMed Central

    Eade, Kevin T.; Fancher, Hailey A.; Ridyard, Marc S.; Allan, Douglas W.

    2012-01-01

    During neurogenesis, transcription factors combinatorially specify neuronal fates and then differentiate subtype identities by inducing subtype-specific gene expression profiles. But how is neuronal subtype identity maintained in mature neurons? Modeling this question in two Drosophila neuronal subtypes (Tv1 and Tv4), we test whether the subtype transcription factor networks that direct differentiation during development are required persistently for long-term maintenance of subtype identity. By conditional transcription factor knockdown in adult Tv neurons after normal development, we find that most transcription factors within the Tv1/Tv4 subtype transcription networks are indeed required to maintain Tv1/Tv4 subtype-specific gene expression in adults. Thus, gene expression profiles are not simply “locked-in,” but must be actively maintained by persistent developmental transcription factor networks. We also examined the cross-regulatory relationships between all transcription factors that persisted in adult Tv1/Tv4 neurons. We show that certain critical cross-regulatory relationships that had existed between these transcription factors during development were no longer present in the mature adult neuron. This points to key differences between developmental and maintenance transcriptional regulatory networks in individual neurons. Together, our results provide novel insight showing that the maintenance of subtype identity is an active process underpinned by persistently active, combinatorially-acting, developmental transcription factors. These findings have implications for understanding the maintenance of all long-lived cell types and the functional degeneration of neurons in the aging brain. PMID:22383890

  7. Transition from double coherence resonances to single coherence resonance in a neuronal network with phase noise.

    PubMed

    Jia, Yanbing; Gu, Huaguang

    2015-12-01

    The effect of phase noise on the coherence dynamics of a neuronal network composed of FitzHugh-Nagumo (FHN) neurons is investigated. Phase noise can induce dissimilar coherence resonance (CR) effects for different coupling strength regimes. When the coupling strength is small, phase noise can induce double CRs. One corresponds to the average frequency of phase noise, and the other corresponds to the intrinsic firing frequency of the FHN neuron. When the coupling strength is large enough, phase noise can only induce single CR, and the CR corresponds to the intrinsic firing frequency of the FHN neuron. The results show a transition from double CRs to single CR with the increase in the coupling strength. The transition can be well interpreted based on the dynamics of a single neuron stimulated by both phase noise and the coupling current. When the coupling strength is small, the coupling current is weak, and phase noise mainly determines the dynamics of the neuron. Moreover, the phase-noise-induced double CRs in the neuronal network are similar to the phase-noise-induced double CRs in an isolated FHN neuron. When the coupling strength is large enough, the coupling current is strong and plays a key role in the occurrence of the single CR in the network. The results provide a novel phenomenon and may have important implications in understanding the dynamics of neuronal networks. PMID:26723163

  8. Micro-electrode array recordings reveal reductions in both excitation and inhibition in cultured cortical neuron networks lacking Shank3.

    PubMed

    Lu, C; Chen, Q; Zhou, T; Bozic, D; Fu, Z; Pan, J Q; Feng, G

    2016-02-01

    Numerous risk genes have recently been implicated in susceptibility to autism and schizophrenia. Translating such genetic findings into disease-relevant neurobiological mechanisms is challenging due to the lack of high-throughput assays that can be used to assess their functions on an appropriate scale. To address this issue, we explored the feasibility of using a micro-electrode array (MEA) as a potentially scalable assay to identify the electrical network phenotypes associated with risk genes. We first characterized local and global network firing in cortical neurons with MEAs, and then developed methods to analyze the alternation between the network active period (NAP) and the network inactive period (NIP), each of which lasts tens of seconds. We then evaluated the electric phenotypes of neurons derived from Shank3 knockout (KO) mice. Cortical neurons cultured on MEAs displayed a rich repertoire of spontaneous firing, and Shank3 deletion led to reduced firing activity. Enhancing excitation with CX546 rescued the deficit in the spike rate in the Shank3 KO network. In addition, the Shank3 KO network produced a shorter NIP, and this altered network firing pattern was normalized by clonazepam, a positive modulator of the GABAA receptor. MEA recordings revealed electric phenotypes that displayed altered excitation and inhibition in the network lacking Shank3. Thus, our study highlights MEAs as an experimental framework for measuring multiple robust neurobiological end points in dynamic networks and as an assay system that could be used to identify electric phenotypes in cultured neuronal networks and to analyze additional risk genes identified in psychiatric genetics. PMID:26598066
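
    One simple way to segment a recording into network active periods (NAPs) and network inactive periods (NIPs), in the spirit of the analysis described above, is to smooth the pooled population firing rate and threshold it, as sketched below. The bin width, smoothing window and rate threshold are illustrative assumptions, not the authors' criteria.

        import numpy as np

        def nap_nip_segments(all_spike_times, t_max, bin_size=1.0, smooth_bins=5,
                             rate_threshold=2.0):
            """Segment a recording into network active/inactive periods from the
            pooled spike times of all electrodes (times and t_max in seconds)."""
            edges = np.arange(0.0, t_max + bin_size, bin_size)
            rate = np.histogram(all_spike_times, bins=edges)[0] / bin_size
            kernel = np.ones(smooth_bins) / smooth_bins
            rate = np.convolve(rate, kernel, mode="same")        # moving-average smoothing
            active = rate >= rate_threshold
            changes = np.flatnonzero(np.diff(active.astype(int))) + 1
            bounds = np.concatenate(([0], changes, [active.size]))
            segments = [(active[a], (b - a) * bin_size) for a, b in zip(bounds[:-1], bounds[1:])]
            naps = [d for is_act, d in segments if is_act]
            nips = [d for is_act, d in segments if not is_act]
            return naps, nips

        # synthetic example: bursts of activity every 60 s lasting ~20 s
        rng = np.random.default_rng(5)
        spikes = np.concatenate([onset + rng.uniform(0, 20, 2000)
                                 for onset in np.arange(0, 600, 60)])
        naps, nips = nap_nip_segments(spikes, t_max=600.0)
        print("mean NAP (s):", np.mean(naps), " mean NIP (s):", np.mean(nips))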

  9. On controllability of neuronal networks with constraints on the average of control gains.

    PubMed

    Tang, Yang; Wang, Zidong; Gao, Huijun; Qiao, Hong; Kurths, Jürgen

    2014-12-01

    Control gains play an important role in the control of a natural or a technical system since they reflect how much resource is required to optimize a certain control objective. This paper is concerned with the controllability of neuronal networks with constraints on the average value of the control gains injected in driver nodes, which are in accordance with engineering and biological backgrounds. In order to deal with the constraints on control gains, the controllability problem is transformed into a constrained optimization problem (COP). The introduction of the constraints on the control gains unavoidably leads to substantial difficulty in finding feasible as well as refining solutions. As such, a modified dynamic hybrid framework (MDyHF) is developed to solve this COP, based on an adaptive differential evolution and the concept of Pareto dominance. By comparing with statistical methods and several recently reported constrained optimization evolutionary algorithms (COEAs), we show that our proposed MDyHF is competitive and promising in studying the controllability of neuronal networks. Based on the MDyHF, we proceed to show the controlling regions under different levels of constraints. It is revealed that we should allocate the control gains economically when strong constraints are considered. In addition, it is found that as the constraints become more restrictive, the driver nodes are more likely to be selected from the nodes with a large degree. The results and methods presented in this paper will provide useful insights into developing new techniques to control a realistic complex network efficiently. PMID:24733036

  10. Intrinsic Neuronal Properties Switch the Mode of Information Transmission in Networks

    PubMed Central

    Gjorgjieva, Julijana; Mease, Rebecca A.; Moody, William J.; Fairhall, Adrienne L.

    2014-01-01

    Diverse ion channels and their dynamics endow single neurons with complex biophysical properties. These properties determine the heterogeneity of cell types that make up the brain, as constituents of neural circuits tuned to perform highly specific computations. How do biophysical properties of single neurons impact network function? We study a set of biophysical properties that emerge in cortical neurons during the first week of development, eventually allowing these neurons to adaptively scale the gain of their response to the amplitude of the fluctuations they encounter. During the same time period, these same neurons participate in large-scale waves of spontaneously generated electrical activity. We investigate the potential role of experimentally observed changes in intrinsic neuronal properties in determining the ability of cortical networks to propagate waves of activity. We show that such changes can strongly affect the ability of multi-layered feedforward networks to represent and transmit information on multiple timescales. With properties modeled on those observed at early stages of development, neurons are relatively insensitive to rapid fluctuations and tend to fire synchronously in response to wave-like events of large amplitude. Following developmental changes in voltage-dependent conductances, these same neurons become efficient encoders of fast input fluctuations over few layers, but lose the ability to transmit slower, population-wide input variations across many layers. Depending on the neurons' intrinsic properties, noise plays different roles in modulating neuronal input-output curves, which can dramatically impact network transmission. The developmental change in intrinsic properties supports a transformation of a network's function from the propagation of network-wide information to one in which computations are scaled to local activity. This work underscores the significance of simple changes in conductance parameters in governing how neurons

  12. Building a Large-Scale Computational Model of a Cortical Neuronal Network

    NASA Astrophysics Data System (ADS)

    Zemanová, Lucia; Zhou, Changsong; Kurths, Jürgen

    We introduce the general framework of the large-scale neuronal model used in the 5th Helmholtz Summer School — Complex Brain Networks. The main aim is to build a universal large-scale model of a cortical neuronal network, structured as a network of networks, which is flexible enough to implement different kinds of topology and neuronal models and which exhibits behavior in various dynamical regimes. First, we describe important biological aspects of brain topology and use them in the construction of a large-scale cortical network. Second, the general dynamical model is presented together with explanations of the major dynamical properties of neurons. Finally, we discuss the implementation of the model into parallel code and its possible modifications and improvements.

  13. Cryopreserved rat cortical cells develop functional neuronal networks on microelectrode arrays.

    PubMed

    Otto, Frauke; Görtz, Philipp; Fleischer, Wiebke; Siebler, Mario

    2003-09-30

    Neurons growing on microelectrode arrays (MEAs) are promising tools to investigate principal neuronal network mechanisms and network responses to pharmaceutical substances. However, broad application of these tools, e.g. in pharmaceutical substance screening, requires neuronal cells that provide stable activity on MEAs. Cryopreserved cortical neurons (CCx) from embryonic rats were cultured on MEAs and their immunocytochemical and electrophysiological properties were compared with those of acutely dissociated neurons (Cx). Both cell types formed neuritic networks and expressed the neuron-specific markers microtubule-associated protein 2, synaptophysin, neurofilament and gamma-aminobutyric acid (GABA). Spontaneous spike activity (SSA) was recorded from 9 up to 74 days in vitro (DIV) in CCx and from 5 to 30 DIV in Cx. Cx and CCx exhibited synchronized burst activity with similar spiking characteristics. Tetrodotoxin (TTX) abolished the SSA of both cell types reversibly. In CCx, SSA inhibition occurred with an IC50 of 1.1 nM for TTX, 161 microM for magnesium, 18 microM for D,L-2-amino-5-phosphonovaleric acid (APV) and 1 microM for GABA. CCx cells were easy to handle and developed long-lived, stable and active neuronal networks on MEAs with characteristics similar to those of Cx. Thus, these neurochips seem to be suitable for studying neuronal network properties and for screening in pharmaceutical research. PMID:12948560
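
    Concentration-response parameters such as the IC50 values quoted above are typically obtained by fitting a four-parameter logistic (Hill) curve to the measured SSA; a minimal sketch using scipy.optimize.curve_fit on synthetic data is shown below. The data points and initial guesses are illustrative only.

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(c, top, bottom, ic50, n):
            """Four-parameter logistic concentration-response curve."""
            return bottom + (top - bottom) / (1.0 + (c / ic50) ** n)

        # synthetic SSA (% of control) versus blocker concentration (nM)
        conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
        ssa = np.array([97.0, 88.0, 52.0, 18.0, 5.0, 1.0])

        p0 = [100.0, 0.0, 1.0, 1.0]        # initial guesses: top, bottom, IC50, Hill slope
        params, _ = curve_fit(hill, conc, ssa, p0=p0, maxfev=10000)
        print("fitted IC50 (nM): %.2f, Hill slope: %.2f" % (params[2], params[3]))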

  14. Spontaneous Neuronal Activity in Developing Neocortical Networks: From Single Cells to Large-Scale Interactions

    PubMed Central

    Luhmann, Heiko J.; Sinning, Anne; Yang, Jenq-Wei; Reyes-Puerta, Vicente; Stüttgen, Maik C.; Kirischuk, Sergei; Kilb, Werner

    2016-01-01

    Neuronal activity has been shown to be essential for the proper formation of neuronal circuits, affecting developmental processes like neurogenesis, migration, programmed cell death, cellular differentiation, formation of local and long-range axonal connections, synaptic plasticity or myelination. Accordingly, neocortical areas reveal distinct spontaneous and sensory-driven neuronal activity patterns already at early phases of development. At embryonic stages, when immature neurons start to develop voltage-dependent channels, spontaneous activity is highly synchronized within small neuronal networks and governed by electrical synaptic transmission. Subsequently, spontaneous activity patterns become more complex, involve larger networks and propagate over several neocortical areas. The developmental shift from local to large-scale network activity is accompanied by a gradual shift from electrical to chemical synaptic transmission with an initial excitatory action of chloride-gated channels activated by GABA, glycine and taurine. Transient neuronal populations in the subplate (SP) support temporary circuits that play an important role in tuning early neocortical activity and the formation of mature neuronal networks. Thus, early spontaneous activity patterns control the formation of developing networks in sensory cortices, and disturbances of these activity patterns may lead to long-lasting neuronal deficits. PMID:27252626

  16. Investigating local and long-range neuronal network dynamics by simultaneous optogenetics, reverse microdialysis and silicon probe recordings in vivo

    PubMed Central

    Taylor, Hannah; Schmiedt, Joscha T.; Çarçak, Nihan; Onat, Filiz; Di Giovanni, Giuseppe; Lambert, Régis; Leresche, Nathalie; Crunelli, Vincenzo; David, Francois

    2014-01-01

    Background The advent of optogenetics has given neuroscientists the opportunity to excite or inhibit neuronal population activity with high temporal resolution and cellular selectivity. Thus, when combined with recordings of neuronal ensemble activity in freely moving animals, optogenetics can provide an unprecedented snapshot of the contribution of neuronal assemblies to (patho)physiological conditions in vivo. Still, the combination of optogenetic and silicon probe (or tetrode) recordings does not allow investigation of the role played by voltage- and transmitter-gated channels of the opsin-transfected neurons and/or other adjacent neurons in controlling neuronal activity. New method and results We demonstrate that optogenetics and silicon probe recordings can be combined with intracerebral reverse microdialysis for the long-term delivery of neuroactive drugs around the optic fiber and silicon probe. In particular, we show the effect of antagonists of T-type Ca2+ channels, hyperpolarization-activated cyclic nucleotide-gated channels and metabotropic glutamate receptors on silicon probe-recorded activity of the local opsin-transfected neurons in the ventrobasal thalamus, and demonstrate the changes that the block of these thalamic channels/receptors brings about in the network dynamics of distant somatotopic cortical neuronal ensembles. Comparison with existing methods This is the first demonstration of successfully combining optogenetics and neuronal ensemble recordings with reverse microdialysis. This combination of techniques overcomes some of the disadvantages that are associated with the use of intracerebral injection of a drug-containing solution at the site of laser activation. Conclusions The combination of reverse microdialysis, silicon probe recordings and optogenetics can unravel the short- and long-term effects of specific transmitter- and voltage-gated channels on laser-modulated firing at the site of optogenetic stimulation and the actions that

  17. Effects of distance-dependent delay on small-world neuronal networks

    NASA Astrophysics Data System (ADS)

    Zhu, Jinjie; Chen, Zhen; Liu, Xianbin

    2016-04-01

    We study firing behaviors and the transitions among them in small-world noisy neuronal networks with electrical synapses and information transmission delay. Each neuron is modeled by a two-dimensional Rulkov map neuron. The distance between neurons, which is a main source of the time delay, is taken into consideration. The collective behaviors are revealed through spatiotemporal patterns and interspike intervals as well as interburst intervals. It is found that the networks switch from a resting state into an intermittent firing state under Gaussian noise excitation. Initially, noise-induced firing behaviors are disturbed by small time delays. Periodic firing behaviors with irregular zigzag patterns emerge as the delay increases and become progressively regular after a critical value is exceeded. More interestingly, once the patterns become regular, the spiking frequency of the spiking neuronal network doubles compared with the former stage. The frequency continues to grow for larger delays, and a transition to antiphase synchronization is observed. Furthermore, these transitions are shown to be generic for the bursting neuronal network and the FitzHugh-Nagumo neuronal network as well. We show that these delay-induced transitions are robust to the noise strength, coupling strength, network size, and rewiring probability.
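
    For readers who want to experiment with this class of model, the following minimal sketch simulates Rulkov map neurons on a Watts-Strogatz small-world graph with electrical coupling, Gaussian noise, and a transmission delay proportional to ring distance. All parameter values and the delay rule are assumptions for illustration, not those of the record above.

```python
# Sketch (assumed parameters): noisy Rulkov map neurons on a small-world graph
# with delayed diffusive (electrical) coupling; delays scale with ring distance.
import numpy as np
import networkx as nx

N, k, p_rewire = 100, 4, 0.1
alpha, mu, sigma = 4.3, 0.001, 0.14      # Rulkov map parameters (bursting-like regime)
g, noise_sd, v = 0.05, 0.01, 5.0         # coupling gain, noise strength, "conduction speed"
steps = 5000

G = nx.watts_strogatz_graph(N, k, p_rewire, seed=1)
delay = {(i, j): max(1, int(min(abs(i - j), N - abs(i - j)) / v))
         for i, j in G.edges}            # delay (in map steps) per edge
size = max(delay.values()) + 1

rng = np.random.default_rng(0)
x_hist = -1.5 * np.ones((size, N))       # circular buffer of past fast variables
y = -3.5 * np.ones(N)

for n in range(steps):
    x = x_hist[n % size]                 # current fast variable x_n
    coupling = np.zeros(N)
    for (i, j), d in delay.items():
        x_del = x_hist[(n - d) % size]   # delayed state of the coupled neuron
        coupling[i] += g * (x_del[j] - x[i])
        coupling[j] += g * (x_del[i] - x[j])
    x_new = (alpha / (1.0 + x**2) + y + coupling
             + noise_sd * rng.standard_normal(N))
    y = y - mu * (x + 1.0) + mu * sigma  # slow variable update
    x_hist[(n + 1) % size] = x_new
```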

  18. Low-Density Neuronal Networks Cultured using Patterned Poly-L-Lysine on Microelectrode Arrays

    PubMed Central

    Jun, Sang Beom; Hynd, Matthew R.; Dowell-Mesfin, Natalie; Smith, Karen L.; Turner, James N.; Shain, William; Kim, Sung June

    2009-01-01

    Synaptic activity from low-density networks of cultured rat hippocampal neurons was recorded using microelectrode arrays (MEAs). Neuronal networks were patterned with poly-L-lysine (PLL) using microcontact printing (µCP). Polydimethylsiloxane (PDMS) stamps were fabricated with relief structures resulting in patterns of 2 µm-wide lines for directing process growth and 20 µm-diameter circles for cell soma attachment. These circles were aligned to electrode sites. Different densities of neurons were plated in order to assess the minimal neuron density required for development of an active network. Spontaneous activity was observed at 10–14 days in networks using neuron densities as low as 200 cells/mm². Immunocytochemistry demonstrated the distribution of dendrites along the lines and the location of foci of the presynaptic protein, synaptophysin, on neuron somas and dendrites. Scanning electron microscopy demonstrated that single fluorescent tracks contained multiple processes. Evoked responses of selected portions of the networks were produced by stimulation of specific electrode sites. In addition, the neuronal excitability of the network was increased by the bath application of high K+ (10–12 mM). Application of DNQX, an AMPA antagonist, blocked all spontaneous activity, suggesting that the activity is excitatory and mediated through glutamate receptors. PMID:17049614
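
    A common first-pass analysis for spontaneous activity in such MEA recordings is inter-spike-interval (ISI) burst detection. The sketch below is a generic illustration with assumed thresholds (100 ms ISI, 3-spike minimum), not the analysis used in the study.

```python
# Simple ISI-threshold burst detector for spike times from one MEA electrode.
# Threshold values are assumptions chosen for illustration.
import numpy as np

def detect_bursts(spike_times, max_isi=0.1, min_spikes=3):
    """Return (start, end) times of bursts in a sorted array of spike times (s)."""
    bursts, start = [], 0
    isis = np.diff(spike_times)
    for i, isi in enumerate(isis):
        if isi > max_isi:
            if i + 1 - start >= min_spikes:
                bursts.append((spike_times[start], spike_times[i]))
            start = i + 1
    if len(spike_times) - start >= min_spikes:
        bursts.append((spike_times[start], spike_times[-1]))
    return bursts

# Toy spike train: one burst near t = 0 s and one near t = 1 s.
spikes = np.sort(np.concatenate([np.arange(0, 0.2, 0.02),
                                 1.0 + np.arange(0, 0.15, 0.03)]))
print(detect_bursts(spikes))
```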

  19. Detection of neuron membranes in electron microscopy images using a serial neural network architecture.

    PubMed

    Jurrus, Elizabeth; Paiva, Antonio R C; Watanabe, Shigeki; Anderson, James R; Jones, Bryan W; Whitaker, Ross T; Jorgensen, Erik M; Marc, Robert E; Tasdizen, Tolga

    2010-12-01

    Study of nervous systems via the connectome, the map of connectivities of all neurons in that system, is a challenging problem in neuroscience. Towards this goal, neurobiologists are acquiring large electron microscopy datasets. However, the sheer volume of these datasets renders manual analysis infeasible. Hence, automated image analysis methods are required for reconstructing the connectome from these very large image collections. Segmentation of neurons in these images, an essential step of the reconstruction pipeline, is challenging because of noise, anisotropic shapes and brightness, and the presence of confounding structures. The method described in this paper uses a series of artificial neural networks (ANNs) in a framework combined with a feature vector that is composed of image intensities sampled over a stencil neighborhood. Several ANNs are applied in series, allowing each ANN to use the classification context provided by the previous network to improve detection accuracy. We develop the method of serial ANNs and show that the learned context does improve detection over traditional ANNs. We also demonstrate advantages over previous membrane detection methods. The results are a significant step towards an automated system for the reconstruction of the connectome. PMID:20598935

  20. Detection of Neuron Membranes in Electron Microscopy Images using a Serial Neural Network Architecture

    PubMed Central

    Jurrus, Elizabeth; Paiva, Antonio R. C.; Watanabe, Shigeki; Anderson, James R.; Jones, Bryan W.; Whitaker, Ross T.; Jorgensen, Erik M.; Marc, Robert E.; Tasdizen, Tolga

    2010-01-01

    Study of nervous systems via the connectome, the map of connectivities of all neurons in that system, is a challenging problem in neuroscience. Towards this goal, neurobiologists are acquiring large electron microscopy datasets. However, the sheer volume of these datasets renders manual analysis infeasible. Hence, automated image analysis methods are required for reconstructing the connectome from these very large image collections. Segmentation of neurons in these images, an essential step of the reconstruction pipeline, is challenging because of noise, anisotropic shapes and brightness, and the presence of confounding structures. The method described in this paper uses a series of artificial neural networks (ANNs) in a framework combined with a feature vector that is composed of image intensities sampled over a stencil neighborhood. Several ANNs are applied in series, allowing each ANN to use the classification context provided by the previous network to improve detection accuracy. We develop the method of serial ANNs and show that the learned context does improve detection over traditional ANNs. We also demonstrate advantages over previous membrane detection methods. The results are a significant step towards an automated system for the reconstruction of the connectome. PMID:20598935
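
    The serial-ANN idea can be prototyped compactly: a second classifier receives the first classifier's per-pixel membrane probabilities, sampled over the same stencil, as additional context. The sketch below is a toy illustration using synthetic data and scikit-learn MLPs; it is not the authors' implementation, and the image, labels and stencil offsets are placeholders.

```python
# Toy sketch of serial per-pixel classification with stencil-sampled features.
import numpy as np
from sklearn.neural_network import MLPClassifier

def stencil_features(img, offsets):
    """Sample values of `img` at `offsets` around every pixel (edge-padded)."""
    pad = max(max(abs(dy), abs(dx)) for dy, dx in offsets)
    padded = np.pad(img, pad, mode="edge")
    h, w = img.shape
    cols = [padded[pad + dy:pad + dy + h, pad + dx:pad + dx + w].ravel()
            for dy, dx in offsets]
    return np.stack(cols, axis=1)

offsets = [(dy, dx) for dy in (-2, 0, 2) for dx in (-2, 0, 2)]  # toy stencil

rng = np.random.default_rng(0)
img = rng.random((64, 64))                     # stand-in EM image
labels = (img > 0.7).astype(int).ravel()       # stand-in membrane labels

# Stage 1: classify each pixel from raw intensities over the stencil.
X1 = stencil_features(img, offsets)
ann1 = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500).fit(X1, labels)
prob1 = ann1.predict_proba(X1)[:, 1].reshape(img.shape)

# Stage 2: intensities plus stage-1 probabilities over the stencil as context.
X2 = np.hstack([X1, stencil_features(prob1, offsets)])
ann2 = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500).fit(X2, labels)
```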

  1. In Vitro Reconstruction of Neuronal Networks Derived from Human iPS Cells Using Microfabricated Devices

    PubMed Central

    Takayama, Yuzo; Kida, Yasuyuki S.

    2016-01-01

    Morphology and function of the nervous system are maintained via well-coordinated processes in both central and peripheral nervous tissues, which govern the homeostasis of organs/tissues. Impairments of the nervous system induce neuronal disorders such as peripheral neuropathy or cardiac arrhythmia. Although further investigation is warranted to reveal the molecular mechanisms of progression in such diseases, appropriate model systems mimicking the patient-specific communication between neurons and organs have not yet been established. In this study, we reconstructed the neuronal network in vitro either between neurons of the human induced pluripotent stem (iPS) cell-derived peripheral nervous system (PNS) and central nervous system (CNS), or between PNS neurons and cardiac cells in a morphologically and functionally compartmentalized manner. Networks were constructed in photolithographically microfabricated devices with two culture compartments connected by 20 microtunnels. We confirmed that PNS and CNS neurons connected via synapses and formed a network. Additionally, calcium-imaging experiments showed that the bundles originating from the PNS neurons were functionally active and responded reproducibly to external stimuli. Next, to demonstrate the formation of functional cell-cell interactions, we confirmed that CNS neurons showed an increase in calcium activity during electrical stimulation of networked bundles from PNS neurons. We also confirmed the formation of synapses between PNS neurons and mature cardiac cells. These results indicate that compartmentalized culture devices are promising tools for reconstructing network-wide connections between PNS neurons and various organs, and might help to understand patient-specific molecular and functional mechanisms under normal and pathological conditions. PMID:26848955

  2. In Vitro Reconstruction of Neuronal Networks Derived from Human iPS Cells Using Microfabricated Devices.

    PubMed

    Takayama, Yuzo; Kida, Yasuyuki S

    2016-01-01

    Morphology and function of the nervous system are maintained via well-coordinated processes in both central and peripheral nervous tissues, which govern the homeostasis of organs/tissues. Impairments of the nervous system induce neuronal disorders such as peripheral neuropathy or cardiac arrhythmia. Although further investigation is warranted to reveal the molecular mechanisms of progression in such diseases, appropriate model systems mimicking the patient-specific communication between neurons and organs have not yet been established. In this study, we reconstructed the neuronal network in vitro either between neurons of the human induced pluripotent stem (iPS) cell-derived peripheral nervous system (PNS) and central nervous system (CNS), or between PNS neurons and cardiac cells in a morphologically and functionally compartmentalized manner. Networks were constructed in photolithographically microfabricated devices with two culture compartments connected by 20 microtunnels. We confirmed that PNS and CNS neurons connected via synapses and formed a network. Additionally, calcium-imaging experiments showed that the bundles originating from the PNS neurons were functionally active and responded reproducibly to external stimuli. Next, to demonstrate the formation of functional cell-cell interactions, we confirmed that CNS neurons showed an increase in calcium activity during electrical stimulation of networked bundles from PNS neurons. We also confirmed the formation of synapses between PNS neurons and mature cardiac cells. These results indicate that compartmentalized culture devices are promising tools for reconstructing network-wide connections between PNS neurons and various organs, and might help to understand patient-specific molecular and functional mechanisms under normal and pathological conditions. PMID:26848955
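
    On the analysis side, stimulus-evoked calcium responses of the kind described are typically quantified as dF/F relative to a pre-stimulus baseline. The sketch below illustrates this on a synthetic ROI trace; the window lengths, the 3-SD criterion and the imaging rate are assumptions, not the study's parameters.

```python
# Basic dF/F computation and a simple evoked-response criterion on a synthetic trace.
import numpy as np

fs = 10.0                                   # assumed imaging rate (frames per second)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
f_trace = 100 + rng.standard_normal(t.size)            # synthetic ROI fluorescence
stim_time = 30.0
f_trace[t >= stim_time] += 20 * np.exp(-(t[t >= stim_time] - stim_time) / 2.0)

f0 = np.median(f_trace[t < stim_time])      # baseline fluorescence F0
dff = (f_trace - f0) / f0

baseline = dff[t < stim_time]
response = dff[(t >= stim_time) & (t < stim_time + 5)]
responded = response.max() > baseline.mean() + 3 * baseline.std()
print("evoked response detected:", responded)
```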

  3. Physical and Biological Regulation of Neuron Regenerative Growth and Network Formation on Recombinant Dragline Silks

    PubMed Central

    Huang, Wenwen; He, Jiuyang; Jones, Justin; Lewis, Randolph V.; Kaplan, David L.

    2015-01-01

    Recombinant spider silks produced in transgenic goat milk were studied as cell culture matrices for neuronal growth. Major ampullate spidroin 1 (MaSp1) supported neuronal growth, axon extension and network connectivity, with cell morphology comparable to the gold standard poly-lysine. In addition, neurons growing on MaSp1 films had increased neural cell adhesion molecule (NCAM) expression at both mRNA and protein levels. The results indicate that MaSp1 films present useful surface charge and substrate stiffness to support the growth of primary rat cortical neurons. Moreover, a putative neuron-specific surface binding sequence GRGGL within MaSp1 may contribute to the biological regulation of neuron growth. These findings indicate that MaSp1 could regulate neuron growth through its physical and biological features. This dual regulation mode of MaSp1 could provide an alternative strategy for generating functional silk materials for neural tissue engineering. PMID:25701039

  4. Size-dependent regulation of synchronized activity in living neuronal networks

    NASA Astrophysics Data System (ADS)

    Yamamoto, Hideaki; Kubota, Shigeru; Chida, Yudai; Morita, Mayu; Moriya, Satoshi; Akima, Hisanao; Sato, Shigeo; Hirano-Iwata, Ayumi; Tanii, Takashi; Niwano, Michio

    2016-07-01

    We study the effect of network size on synchronized activity in living neuronal networks. Dissociated cortical neurons form synaptic connections in culture and generate synchronized spontaneous activity within 10 days in vitro. Using micropatterned surfaces to extrinsically control the size of neuronal networks, we show that synchronized activity can emerge in a network as small as 12 cells. Furthermore, a detailed comparison of small (∼20 cells), medium (∼100 cells), and large (∼400 cells) networks reveals that synchronized activity becomes destabilized in the small networks. Computational modeling of neural activity is then employed to explore the underlying mechanism responsible for the size effect. We find that the generation and maintenance of the synchronized activity can be minimally described by: (1) the stochastic firing of each neuron in the network, (2) enhancement in the network activity in a positive feedback loop of excitatory synapses, and (3) Ca-dependent suppression of bursting activity. The model further shows that the decrease in total synaptic input to a neuron that drives the positive feedback amplification of correlated activity is a key factor underlying the destabilization of synchrony in smaller networks. Spontaneous neural activity plays a critical role in cortical information processing, and our work constructively clarifies an aspect of the structural basis behind this.

  5. Size-dependent regulation of synchronized activity in living neuronal networks.

    PubMed

    Yamamoto, Hideaki; Kubota, Shigeru; Chida, Yudai; Morita, Mayu; Moriya, Satoshi; Akima, Hisanao; Sato, Shigeo; Hirano-Iwata, Ayumi; Tanii, Takashi; Niwano, Michio

    2016-07-01

    We study the effect of network size on synchronized activity in living neuronal networks. Dissociated cortical neurons form synaptic connections in culture and generate synchronized spontaneous activity within 10 days in vitro. Using micropatterned surfaces to extrinsically control the size of neuronal networks, we show that synchronized activity can emerge in a network as small as 12 cells. Furthermore, a detailed comparison of small (∼20 cells), medium (∼100 cells), and large (∼400 cells) networks reveals that synchronized activity becomes destabilized in the small networks. Computational modeling of neural activity is then employed to explore the underlying mechanism responsible for the size effect. We find that the generation and maintenance of the synchronized activity can be minimally described by: (1) the stochastic firing of each neuron in the network, (2) enhancement in the network activity in a positive feedback loop of excitatory synapses, and (3) Ca-dependent suppression of bursting activity. The model further shows that the decrease in total synaptic input to a neuron that drives the positive feedback amplification of correlated activity is a key factor underlying the destabilization of synchrony in smaller networks. Spontaneous neural activity plays a critical role in cortical information processing, and our work constructively clarifies an aspect of the structural basis behind this. PMID:27575164
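
    The three ingredients listed in the abstract can be caricatured in a few lines of code. The toy model below (all parameters assumed, not the authors' model) illustrates why recurrent drive that grows with the number of connected partners makes synchronized bursts less reliable in small networks.

```python
# Conceptual toy model: (1) stochastic firing, (2) positive feedback through
# recurrent excitation with a fixed per-connection weight, so total drive grows
# with network size, and (3) a slow Ca-like variable that suppresses bursting.
import numpy as np

def simulate(N, steps=20000, p_spont=0.0005, w=0.01, ca_gain=0.05,
             ca_decay=0.995, seed=0):
    rng = np.random.default_rng(seed)
    ca = np.zeros(N)                    # slow Ca-like suppression variable
    act = np.zeros(N)                   # spikes in the previous time step
    sync_steps = 0
    for _ in range(steps):
        p_fire = np.clip(p_spont + w * act.sum() - ca_gain * ca, 0.0, 1.0)
        act = (rng.random(N) < p_fire).astype(float)
        ca = ca_decay * ca + act
        sync_steps += act.mean() > 0.5  # count network-wide burst steps
    return sync_steps / steps

for N in (20, 100, 400):
    print(N, round(simulate(N), 4))
```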

  6. Coherent and intermittent ensemble oscillations emerge from networks of irregular spiking neurons.

    PubMed

    Hoseini, Mahmood S; Wessel, Ralf

    2016-01-01

    Local field potential (LFP) recordings from spatially distant cortical circuits reveal episodes of coherent gamma oscillations that are intermittent, and of variable peak frequency and duration. Concurrently, single neuron spiking remains largely irregular and of low rate. The underlying potential mechanisms of this emergent network activity have long been debated. Here we reproduce such intermittent ensemble oscillations in a model network, consisting of excitatory and inhibitory model neurons with the characteristics of regular-spiking (RS) pyramidal neurons, and fast-spiking (FS) and low-threshold spiking (LTS) interneurons. We find that fluctuations in the external inputs trigger reciprocally connected and irregularly spiking RS and FS neurons in episodes of ensemble oscillations, which are terminated by the recruitment of the LTS population with concurrent accumulation of inhibitory conductance in both RS and FS neurons. The model qualitatively reproduces experimentally observed phase drift, oscillation episode duration distributions, variation in the peak frequency, and the concurrent irregular single-neuron spiking at low rate. Furthermore, consistent with previous experimental studies using optogenetic manipulation, periodic activation of FS, but not RS, model neurons causes enhancement of gamma oscillations. In addition, increasing the coupling between two model networks from low to high reveals a transition from independent intermittent oscillations to coherent intermittent oscillations. In conclusion, the model network suggests biologically plausible mechanisms for the generation of episodes of coherent intermittent ensemble oscillations with irregular spiking neurons in cortical circuits. PMID:26561602
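
    Intermittent oscillation episodes of the kind described are often detected from an LFP by band-pass filtering and thresholding the Hilbert amplitude envelope. The sketch below is a generic example on synthetic data; the band edges, threshold and sampling rate are assumptions, not the authors' analysis.

```python
# Detect intermittent gamma episodes in a synthetic LFP via band-pass filtering
# (30-80 Hz), Hilbert amplitude envelope, and a mean + 2*SD threshold.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0                                    # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
lfp = rng.standard_normal(t.size)              # noise background
lfp[(t > 3) & (t < 4)] += 2.0 * np.sin(2 * np.pi * 45 * t[(t > 3) & (t < 4)])

b, a = butter(4, [30 / (fs / 2), 80 / (fs / 2)], btype="band")
gamma = filtfilt(b, a, lfp)
envelope = np.abs(hilbert(gamma))

thresh = envelope.mean() + 2 * envelope.std()
in_episode = envelope > thresh
edges = np.flatnonzero(np.diff(in_episode.astype(int)))
print("episode boundaries (s):", t[edges])
```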

  7. Extracellular Recording from Neuronal Networks Cultured on Hydrogel-coated Microelectrode Array

    NASA Astrophysics Data System (ADS)

    Goto, Miho; Moriguchi, Hiroyuki; Takayama, Yuzo; Saito, Aki; Kotani, Kiyoshi; Jimbo, Yasuhiko

    The microelectrode array (MEA) has been widely used for ensemble recording. One of the advantages of MEA recording is its capability of studying the correlation between network structures and ensemble activity patterns. Simple neuronal networks, in which the activities of individual cells can be identified, are promising for this purpose. We have developed a mask-free cell-patterning method named “micropipette drawing”. In this method, a thin hydrogel layer is formed on the surface of MEA substrates, which acts as the support for growth-guidance patterns. In this work, we tested whether electrical signals could be detected through this gel layer. Rat cortical neurons were cultured on substrates with guiding patterns. Electrical activities could be detected after 7 days in vitro (DIV) in both patterned and normal cell cultures, though the signal-to-noise ratio in the normal culture was clearly higher than that in the patterned culture. Frequency analysis demonstrated that the difference in power spectra between these cultures was particularly significant in high-frequency regions. Decreases in high-frequency components were more prominent in the signals obtained from the patterned cultures. This result suggested that the hydrogel layer acted as a low-pass filter, probably due to its capacitive properties. The next step is to establish a method to form hydrogel layers that maintain growth-guidance properties and have better frequency characteristics.
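
    The low-pass interpretation can be illustrated with a first-order RC filter. In the sketch below the resistance and capacitance values are hypothetical, chosen only to show how spike-band components around 1 kHz would be attenuated relative to low frequencies.

```python
# Magnitude response of a first-order RC low-pass filter with hypothetical
# values standing in for a capacitive hydrogel layer over the electrodes.
import numpy as np

R = 1e6        # assumed series resistance of the gel layer (ohm)
C = 1e-9       # assumed capacitance to the electrode (farad)
f = np.array([10, 100, 300, 1000, 3000])           # frequency (Hz)
gain = 1.0 / np.sqrt(1.0 + (2 * np.pi * f * R * C) ** 2)

fc = 1.0 / (2 * np.pi * R * C)
print(f"cutoff frequency = {fc:.0f} Hz")
for fi, g in zip(f, gain):
    print(f"{int(fi):5d} Hz: |H| = {g:.2f} ({20 * np.log10(g):.1f} dB)")
```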

  8. Patterning human neuronal networks on photolithographically engineered silicon dioxide substrates functionalized with glial analogues

    PubMed Central

    Hughes, Mark A; Brennan, Paul M; Bunting, Andrew S; Cameron, Katherine; Murray, Alan F; Shipston, Mike J

    2014-01-01

    Interfacing neurons with silicon semiconductors is a challenge being tackled through various bioengineering approaches. Such constructs inform our understanding of neuronal coding and learning and ultimately guide us toward creating intelligent neuroprostheses. A fundamental prerequisite is to dictate the spatial organization of neuronal cells. We sought to pattern neurons using photolithographically defined arrays of polymer parylene-C, activated with fetal calf serum. We used a purified human neuronal cell line [Lund human mesencephalic (LUHMES)] to establish whether neurons remain viable when isolated on-chip or whether they require a supporting cell substrate. When cultured in isolation, LUHMES neurons failed to pattern and did not show any morphological signs of differentiation. We therefore sought a cell type with which to prepattern parylene regions, hypothesizing that this cellular template would enable secondary neuronal adhesion and network formation. From a range of cell lines tested, human embryonal kidney (HEK) 293 cells patterned with highest accuracy. LUHMES neurons adhered to pre-established HEK 293 cell clusters and this coculture environment promoted morphological differentiation of neurons. Neurites extended between islands of adherent cell somata, creating an orthogonally arranged neuronal network. HEK 293 cells appear to fulfill a role analogous to glia, dictating cell adhesion, and generating an environment conducive to neuronal survival. We next replaced HEK 293 cells with slower growing glioma-derived precursors. These primary human cells patterned accurately on parylene and provided a similarly effective scaffold for neuronal adhesion. These findings advance the use of this microfabrication-compatible platform for neuronal patterning. © 2013 The Authors. Journal of Biomedical Materials Research Part A published by Wiley Periodicals, Inc. J Biomed Mater Res Part A 102A: 1350–1360, 2014. PMID:23733444

  9. Autapse-induced target wave, spiral wave in regular network of neurons

    NASA Astrophysics Data System (ADS)

    Qin, HuiXin; Ma, Jun; Wang, ChunNi; Chu, RunTong

    2014-10-01

    An autapse is a type of synapse that connects the axon and dendrites of the same neuron, and its effect is often described as a closed-loop feedback of the axonal action potential onto the neuron's own dendritic tree. An artificial autapse was introduced into the Hindmarsh-Rose neuron model, and a regular network was designed to detect the regular pattern formation induced by the autapse. It was found that a target wave emerged in the network even when only a single autapse was considered. By increasing the autapse density, i.e. the number of neurons with autapses (for example, over a regular area of 2×2, 3×3, 4×4, or 5×5 neurons), target waves were induced by selecting the feedback gain and time delay in the autapses. Spiral waves were also observed under optimized feedback gain and time delay, owing to coherence-like resonance induced in the network by electric autapses connected to some of the neurons. This confirmed that the electric autapse has a critical role in exciting and regulating the collective behaviors of neurons by generating stable regular waves (target waves, spiral waves) in the network. The wavelength of the induced travelling wave (target or spiral wave), which results from the local effect of the autapse, was also calculated to characterize the wave profile in the network of neurons.
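
    A single Hindmarsh-Rose neuron with an electrical autapse reduces to delayed diffusive self-feedback. The sketch below integrates this with an Euler scheme and a circular delay buffer; the feedback gain, delay, external current and initial conditions are assumed values for illustration, not those of the record above.

```python
# Hindmarsh-Rose neuron with an electrical autapse: delayed diffusive
# self-feedback g*(x(t - tau) - x(t)), integrated with explicit Euler steps.
import numpy as np

a, b, c, d = 1.0, 3.0, 1.0, 5.0
r, s, x_rest, I_ext = 0.006, 4.0, -1.6, 3.0
g, tau, dt, T = 0.5, 20.0, 0.01, 1000.0     # gain, delay, step, duration (dimensionless time)

n_steps = int(T / dt)
delay_steps = int(tau / dt)
size = delay_steps + 1
x_buf = -1.6 * np.ones(size)                # circular buffer of past membrane variables
x, y, z = -1.6, -10.0, 2.0
trace = np.empty(n_steps)

for n in range(n_steps):
    x_del = x_buf[(n - delay_steps) % size]  # x(t - tau)
    autapse = g * (x_del - x)                # electrical (diffusive) self-feedback
    dx = y - a * x**3 + b * x**2 - z + I_ext + autapse
    dy = c - d * x**2 - y
    dz = r * (s * (x - x_rest) - z)
    x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    x_buf[(n + 1) % size] = x
    trace[n] = x
```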

  10. Synchronization and Partial Synchronization Experiments with Networks of Time-Delay Coupled Hindmarsh-Rose Neurons

    NASA Astrophysics Data System (ADS)

    Steur, Erik; Murguia, Carlos; Fey, Rob H. B.; Nijmeijer, Henk

    2016-06-01

    We experimentally study synchronization and partial synchronization in networks of Hindmarsh-Rose model neurons that interact through linear time-delay couplings. Our experimental setup consists of electronic circuit-board realizations of the Hindmarsh-Rose model neuron and a coupling interface in which the interactions between the circuits are defined. With this setup we test the predictive value of theoretical results on synchronization and partial synchronization in such networks.
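
    A simple post-hoc check for full or partial synchronization in such experiments is to compute time-averaged pairwise errors between the recorded membrane-potential traces and group circuits whose mutual error is small. The sketch below uses placeholder traces and an assumed tolerance; it is a generic illustration, not the authors' analysis.

```python
# Group neurons/circuits into synchronized clusters by pairwise RMS error.
import numpy as np

def sync_clusters(traces, tol=0.1):
    """traces: array (n_neurons, n_samples). Returns clusters of neuron indices."""
    n = traces.shape[0]
    err = np.sqrt(((traces[:, None, :] - traces[None, :, :]) ** 2).mean(axis=2))
    clusters, assigned = [], set()
    for i in range(n):
        if i in assigned:
            continue
        cluster = [i] + [j for j in range(i + 1, n)
                         if j not in assigned and err[i, j] < tol]
        assigned.update(cluster)
        clusters.append(cluster)
    return clusters

# Toy example: traces 0 and 2 nearly identical (one cluster), trace 1 separate.
t = np.linspace(0, 1, 1000)
traces = np.vstack([np.sin(2 * np.pi * 5 * t),
                    np.sin(2 * np.pi * 7 * t),
                    np.sin(2 * np.pi * 5 * t) + 0.01])
print(sync_clusters(traces))   # expected: [[0, 2], [1]]
```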