Learning only when necessary: better memories of correlated patterns in networks with bounded synapses.

Abstract:

Learning in a neuronal network is often thought of as a linear superposition of synaptic modifications induced by individual stimuli. However, since biological synapses are naturally bounded, a linear superposition would cause fast forgetting of previously acquired memories. Here we show that this forgetting can be avoided by introducing additional constraints on the synaptic and neural dynamics. We consider Hebbian plasticity of excitatory synapses. A synapse is modified only if the postsynaptic response does not match the desired output. With this learning rule, the memory performance achieved with unbounded weights is regained, provided that (1) there is some global inhibition, (2) the learning rate is small, and (3) the neurons can discriminate small differences in the total synaptic input (e.g., by making the neuronal threshold small compared to the total postsynaptic input). We prove in the form of a generalized perceptron convergence theorem that under these constraints, a neuron learns to classify any linearly separable set of patterns, including a wide class of highly correlated random patterns. During the learning process, excitation becomes roughly balanced by inhibition, and the neuron classifies the patterns on the basis of small differences around this balance. The fact that synapses saturate has the additional benefit that nonlinearly separable patterns, such as similar patterns with contradicting outputs, eventually generate a subthreshold response, and therefore silence neurons that cannot provide any information.
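
The abstract describes the learning scheme in algorithmic terms: bounded excitatory weights, global inhibition, a small learning rate, and synaptic updates only when the neuron's output mismatches the desired one. The Python sketch below illustrates that scheme under assumed parameter values, a placeholder inhibition term, and randomly generated patterns; it is not the authors' implementation.

import numpy as np

# Minimal sketch (not the authors' code) of a perceptron-like neuron with
# bounded excitatory weights, global inhibition, and a "learn only on error"
# condition, as described in the abstract. All numbers are illustrative.
rng = np.random.default_rng(0)

n_inputs = 100            # number of excitatory afferents
w_min, w_max = 0.0, 1.0   # synaptic bounds
learning_rate = 0.01      # small learning rate (condition 2 in the abstract)
inhibition = 0.5          # assumed global inhibition per active input
theta = 0.0               # threshold, small relative to the total input

w = rng.uniform(w_min, w_max, n_inputs)
patterns = rng.integers(0, 2, size=(50, n_inputs))  # random binary patterns
targets = rng.integers(0, 2, size=50)               # desired binary outputs

for epoch in range(200):
    errors = 0
    for x, t in zip(patterns, targets):
        h = w @ x - inhibition * x.sum()   # excitation minus global inhibition
        y = int(h > theta)
        if y != t:                         # update only on output mismatch
            w = np.clip(w + learning_rate * (t - y) * x, w_min, w_max)
            errors += 1
    if errors == 0:                        # stop once all patterns are classified
        break

Because the weights are clipped to [w_min, w_max], contradictory updates cannot grow without bound; they push the response toward the threshold instead, which loosely mirrors the silencing of uninformative neurons mentioned at the end of the abstract.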

journal_name

Neural Comput

journal_title

Neural computation

authors

Senn W,Fusi S

doi

10.1162/0899766054615644

pub_date

2005-10-01 00:00:00

pages

2106-38

issue

10

eissn

1530-888X

issn

0899-7667

journal_volume

17

pub_type

Journal Article

  • Bayesian Filtering with Multiple Internal Models: Toward a Theory of Social Intelligence.

    abstract::To exhibit social intelligence, animals have to recognize whom they are communicating with. One way to make this inference is to select among internal generative models of each conspecific who may be encountered. However, these models also have to be learned via some form of Bayesian belief updating. This induces an i...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01239

    authors: Isomura T,Parr T,Friston K

    update_date:2019-12-01 00:00:00

  • Scalable hybrid computation with spikes.

    abstract::We outline a hybrid analog-digital scheme for computing with three important features that enable it to scale to systems of large complexity: First, like digital computation, which uses several one-bit precise logical units to collectively compute a precise answer to a computation, the hybrid scheme uses several moder...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976602320263971

    authors: Sarpeshkar R,O'Halloran M

    update_date:2002-09-01 00:00:00

  • Modeling short-term synaptic depression in silicon.

    abstract::We describe a model of short-term synaptic depression that is derived from a circuit implementation. The dynamics of this circuit model is similar to the dynamics of some theoretical models of short-term depression except that the recovery dynamics of the variable describing the depression is nonlinear and it also dep...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976603762552942

    authors: Boegerhausen M,Suter P,Liu SC

    update_date:2003-02-01 00:00:00

  • Visual Categorization with Random Projection.

    abstract::Humans learn categories of complex objects quickly and from a few examples. Random projection has been suggested as a means to learn and categorize efficiently. We investigate how random projection affects categorization by humans and by very simple neural networks on the same stimuli and categorization tasks, and how...

    journal_title:Neural computation

    pub_type: Letter

    doi:10.1162/NECO_a_00769

    authors: Arriaga RI,Rutter D,Cakmak M,Vempala SS

    update_date:2015-10-01 00:00:00

  • Estimating functions of independent component analysis for temporally correlated signals.

    abstract::This article studies a general theory of estimating functions of independent component analysis when the independent source signals are temporally correlated. Estimating functions are used for deriving both batch and on-line learning algorithms, and they are applicable to blind cases where spatial and temporal probab...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976600300015079

    authors: Amari S

    update_date:2000-09-01 00:00:00

  • Topographic mapping of large dissimilarity data sets.

    abstract::Topographic maps such as the self-organizing map (SOM) or neural gas (NG) constitute powerful data mining techniques that allow simultaneously clustering data and inferring their topological structure, such that additional features, for example, browsing, become available. Both methods have been introduced for vectori...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00012

    authors: Hammer B,Hasenfuss A

    update_date:2010-09-01 00:00:00

  • Neural coding: higher-order temporal patterns in the neurostatistics of cell assemblies.

    abstract::Recent advances in the technology of multiunit recordings make it possible to test Hebb's hypothesis that neurons do not function in isolation but are organized in assemblies. This has created the need for statistical approaches to detecting the presence of spatiotemporal patterns of more than two neurons in neuron sp...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976600300014872

    authors: Martignon L,Deco G,Laskey K,Diamond M,Freiwald W,Vaadia E

    update_date:2000-11-01 00:00:00

  • Time-varying perturbations can distinguish among integrate-to-threshold models for perceptual decision making in reaction time tasks.

    abstract::Several integrate-to-threshold models with differing temporal integration mechanisms have been proposed to describe the accumulation of sensory evidence to a prescribed level prior to motor response in perceptual decision-making tasks. An experiment and simulation studies have shown that the introduction of time-varyi...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2009.07-08-817

    authors: Zhou X,Wong-Lin K,Holmes P

    update_date:2009-08-01 00:00:00

  • Scalable Semisupervised Functional Neurocartography Reveals Canonical Neurons in Behavioral Networks.

    abstract::Large-scale data collection efforts to map the brain are underway at multiple spatial and temporal scales, but all face fundamental problems posed by high-dimensional data and intersubject variability. Even seemingly simple problems, such as identifying a neuron/brain region across animals/subjects, become exponential...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00852

    authors: Frady EP,Kapoor A,Horvitz E,Kristan WB Jr

    update_date:2016-08-01 00:00:00

  • State-Space Representations of Deep Neural Networks.

    abstract::This letter deals with neural networks as dynamical systems governed by finite difference equations. It shows that the introduction of k-many skip connections into network architectures, such as residual networks and additive dense n...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01165

    authors: Hauser M,Gunn S,Saab S Jr,Ray A

    update_date:2019-03-01 00:00:00

  • The Ornstein-Uhlenbeck process does not reproduce spiking statistics of neurons in prefrontal cortex.

    abstract::Cortical neurons of behaving animals generate irregular spike sequences. Recently, there has been a heated discussion about the origin of this irregularity. Softky and Koch (1993) pointed out the inability of standard single-neuron models to reproduce the irregularity of the observed spike sequences when the model par...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976699300016511

    authors: Shinomoto S,Sakai Y,Funahashi S

    update_date:1999-05-15 00:00:00

  • Some sampling properties of common phase estimators.

    abstract::The instantaneous phase of neural rhythms is important to many neuroscience-related studies. In this letter, we show that the statistical sampling properties of three instantaneous phase estimators commonly employed to analyze neuroscience data share common features, allowing an analytical investigation into their beh...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00422

    authors: Lepage KQ,Kramer MA,Eden UT

    update_date:2013-04-01 00:00:00

  • A general probability estimation approach for neural comp.

    abstract::We describe an analytical framework for the adaptations of neural systems that adapt their internal structure on the basis of subjective probabilities constructed by computation of randomly received input signals. A principled approach is provided with the key property that it defines a probability density model that al...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976600300015862

    authors: Khaikine M,Holthausen K

    update_date:2000-02-01 00:00:00

  • The neuronal replicator hypothesis.

    abstract::We propose that replication (with mutation) of patterns of neuronal activity can occur within the brain using known neurophysiological processes. Thereby evolutionary algorithms implemented by neuronal circuits can play a role in cognition. Replication of structured neuronal representations is assumed in several cog...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00031

    authors: Fernando C,Goldstein R,Szathmáry E

    update_date:2010-11-01 00:00:00

  • Bayesian model assessment and comparison using cross-validation predictive densities.

    abstract::In this work, we discuss practical methods for the assessment, comparison, and selection of complex hierarchical Bayesian models. A natural way to assess the goodness of the model is to estimate its future predictive capability by estimating expected utilities. Instead of just making a point estimate, it is important ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/08997660260293292

    authors: Vehtari A,Lampinen J

    update_date:2002-10-01 00:00:00

  • Computing sparse representations of multidimensional signals using Kronecker bases.

    abstract::Recently there has been great interest in sparse representations of signals under the assumption that signals (data sets) can be well approximated by a linear combination of few elements of a known basis (dictionary). Many algorithms have been developed to find such representations for one-dimensional signals (vectors...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00385

    authors: Caiafa CF,Cichocki A

    update_date:2013-01-01 00:00:00

  • Making the error-controlling algorithm of observable operator models constructive.

    abstract::Observable operator models (OOMs) are a class of models for stochastic processes that properly subsumes the class that can be modeled by finite-dimensional hidden Markov models (HMMs). One of the main advantages of OOMs over HMMs is that they admit asymptotically correct learning algorithms. A series of learning algor...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2009.10-08-878

    authors: Zhao MJ,Jaeger H,Thon M

    update_date:2009-12-01 00:00:00

  • A Mathematical Analysis of Memory Lifetime in a Simple Network Model of Memory.

    abstract::We study the learning of an external signal by a neural network and the time to forget it when this network is submitted to noise. The presentation of an external stimulus to the recurrent network of binary neurons may change the state of the synapses. Multiple presentations of a unique signal lead to its learning. Th...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01286

    authors: Helson P

    update_date:2020-07-01 00:00:00

  • Fast recursive filters for simulating nonlinear dynamic systems.

    abstract::A fast and accurate computational scheme for simulating nonlinear dynamic systems is presented. The scheme assumes that the system can be represented by a combination of components of only two different types: first-order low-pass filters and static nonlinearities. The parameters of these filters and nonlinearities ma...

    journal_title:Neural computation

    pub_type: Letter

    doi:10.1162/neco.2008.04-07-506

    authors: van Hateren JH

    update_date:2008-07-01 00:00:00

  • Gaussian process approach to spiking neurons for inhomogeneous Poisson inputs.

    abstract::This article presents a new theoretical framework to consider the dynamics of a stochastic spiking neuron model with general membrane response to input spike. We assume that the input spikes obey an inhomogeneous Poisson process. The stochastic process of the membrane potential then becomes a gaussian process. When a ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976601317098529

    authors: Amemori KI,Ishii S

    update_date:2001-12-01 00:00:00

  • Formal modeling of robot behavior with learning.

    abstract::We present formal specification and verification of a robot moving in a complex network, using temporal sequence learning to avoid obstacles. Our aim is to demonstrate the benefit of using a formal approach to analyze such a system as a complementary approach to simulation. We first describe a classical closed-loop si...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00493

    authors: Kirwan R,Miller A,Porr B,Di Prodi P

    update_date:2013-11-01 00:00:00

  • Transmission of population-coded information.

    abstract::As neural activity is transmitted through the nervous system, neuronal noise degrades the encoded information and limits performance. It is therefore important to know how information loss can be prevented. We study this question in the context of neural population codes. Using Fisher information, we show how informat...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00227

    authors: Renart A,van Rossum MC

    update_date:2012-02-01 00:00:00

  • Active Learning for Enumerating Local Minima Based on Gaussian Process Derivatives.

    abstract::We study active learning (AL) based on gaussian processes (GPs) for efficiently enumerating all of the local minimum solutions of a black-box function. This problem is challenging because local solutions are characterized by their zero gradient and positive-definite Hessian properties, but those derivatives cannot be ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01307

    authors: Inatsu Y,Sugita D,Toyoura K,Takeuchi I

    update_date:2020-10-01 00:00:00

  • Asynchronous Event-Based Motion Processing: From Visual Events to Probabilistic Sensory Representation.

    abstract::In this work, we propose a two-layered descriptive model for motion processing from retina to the cortex, with an event-based input from the asynchronous time-based image sensor (ATIS) camera. Spatial and spatiotemporal filtering of visual scenes by motion energy detectors has been implemented in two steps in a simple...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01191

    authors: Khoei MA,Ieng SH,Benosman R

    update_date:2019-06-01 00:00:00

  • The Deterministic Information Bottleneck.

    abstract::Lossy compression and clustering fundamentally involve a decision about which features are relevant and which are not. The information bottleneck method (IB) by Tishby, Pereira, and Bialek (1999) formalized this notion as an information-theoretic optimization problem and proposed an optimal trade-off between throwin...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00961

    authors: Strouse DJ,Schwab DJ

    update_date:2017-06-01 00:00:00

  • Dynamic Neural Turing Machine with Continuous and Discrete Addressing Schemes.

    abstract::We extend the neural Turing machine (NTM) model into a dynamic neural Turing machine (D-NTM) by introducing trainable address vectors. This addressing scheme maintains for each memory cell two separate vectors, content and address vectors. This allows the D-NTM to learn a wide variety of location-based addressing stra...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01060

    authors: Gulcehre C,Chandar S,Cho K,Bengio Y

    update_date:2018-04-01 00:00:00

  • Patterns of synchrony in neural networks with spike adaptation.

    abstract::We study the emergence of synchronized burst activity in networks of neurons with spike adaptation. We show that networks of tonically firing adapting excitatory neurons can evolve to a state where the neurons burst in a synchronized manner. The mechanism leading to this burst activity is analyzed in a network of inte...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/08997660151134280

    authors: van Vreeswijk C,Hansel D

    update_date:2001-05-01 00:00:00

  • An extended analytic expression for the membrane potential distribution of conductance-based synaptic noise.

    abstract::Synaptically generated subthreshold membrane potential (Vm) fluctuations can be characterized within the framework of stochastic calculus. It is possible to obtain analytic expressions for the steady-state Vm distribution, even in the case of conductance-based synaptic currents. However, as we show here, the analytic ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/0899766054796932

    authors: Rudolph M,Destexhe A

    update_date:2005-11-01 00:00:00

  • Supervised Determined Source Separation with Multichannel Variational Autoencoder.

    abstract::This letter proposes a multichannel source separation technique, the multichannel variational autoencoder (MVAE) method, which uses a conditional VAE (CVAE) to model and estimate the power spectrograms of the sources in a mixture. By training the CVAE using the spectrograms of training examples with source-class label...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01217

    authors: Kameoka H,Li L,Inoue S,Makino S

    update_date:2019-09-01 00:00:00

  • Classification of temporal patterns in dynamic biological networks.

    abstract::A general method is presented to classify temporal patterns generated by rhythmic biological networks when synaptic connections and cellular properties are known. The method is discrete in nature and relies on algebraic properties of state transitions and graph theory. Elements of the set of rhythms generated by a net...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976698300017160

    authors: Roberts PD

    update_date:1998-10-01 00:00:00