A Reservoir Computing Model of Reward-Modulated Motor Learning and Automaticity.

Abstract:

Reservoir computing is a biologically inspired class of learning algorithms in which the intrinsic dynamics of a recurrent neural network are mined to produce target time series. Most existing reservoir computing algorithms rely on fully supervised learning rules, which require access to an exact copy of the target response, greatly reducing the utility of the system. Reinforcement learning rules have been developed for reservoir computing, but we find that they fail to converge on complex motor tasks. Current theories of biological motor learning posit that early learning is controlled by dopamine-modulated plasticity in the basal ganglia, which trains parallel cortical pathways through unsupervised plasticity as a motor task becomes well learned. We developed a novel learning algorithm for reservoir computing that models the interaction between reinforcement and unsupervised learning observed in experiments. This novel learning algorithm converges on simulated motor tasks on which previous reservoir computing algorithms fail and reproduces experimental findings that relate Parkinson's disease and its treatments to motor learning. Hence, incorporating biological theories of motor learning improves the effectiveness and biological relevance of reservoir computing models.
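
For readers unfamiliar with the setting, the sketch below illustrates the kind of learning the abstract contrasts with fully supervised rules: a reservoir readout trained from a scalar reward signal rather than from an exact copy of the target. It is a rough, assumption-laden illustration of a node-perturbation-style reward-modulated rule, not the algorithm proposed in the paper; the network size, time constants, learning rate, and sine-wave target below are all hypothetical choices.

```python
import numpy as np

# Minimal sketch of reward-modulated reservoir learning: a rate-based
# reservoir whose readout weights are trained with a node-perturbation
# style rule driven by a single scalar reward per trial. This is an
# illustration of the general idea, not the paper's algorithm; all
# parameter values and the sine-wave target are assumptions.

rng = np.random.default_rng(0)

N = 100                 # reservoir neurons
dt, tau = 0.01, 0.1     # integration step and neuronal time constant
g = 1.5                 # recurrent gain (rich spontaneous dynamics)
T = 1000                # time steps per trial
n_trials = 100

J = g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # fixed recurrent weights
w_out = np.zeros(N)                                  # trained readout weights
w_fb = rng.uniform(-1.0, 1.0, N)                     # readout-to-reservoir feedback

t = np.arange(T) * dt
target = np.sin(2.0 * np.pi * t)    # hypothetical target motor trajectory

sigma = 0.05   # exploration noise on the readout
eta = 0.05     # learning rate
r_bar = 0.0    # running reward baseline

for trial in range(n_trials):
    x = rng.normal(0.0, 0.5, N)     # reservoir state
    r = np.tanh(x)
    elig = np.zeros(N)              # eligibility: noise-activity correlation
    z_hist = np.zeros(T)
    for k in range(T):
        xi = sigma * rng.normal()
        z = w_out @ r + xi                        # perturbed readout ("motor command")
        x += dt / tau * (-x + J @ r + w_fb * z)   # leaky rate dynamics with feedback
        r = np.tanh(x)
        elig += xi * r
        z_hist[k] = z
    # The learning rule sees only a scalar reward per trial,
    # never a point-by-point copy of the target.
    reward = -np.mean((z_hist - target) ** 2)
    if trial == 0:
        r_bar = reward
    w_out += eta * (reward - r_bar) * elig / T    # reward-modulated update
    r_bar += 0.1 * (reward - r_bar)               # track the reward baseline

print(f"reward after {n_trials} trials: {reward:.4f}")
```

The design point the sketch tries to convey is that the weight update correlates the exploration noise with reservoir activity and scales that correlation by reward relative to a running baseline, so no supervised error signal is ever needed.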

journal_name: Neural Comput
journal_title: Neural computation
authors: Pyle R,Rosenbaum R
doi: 10.1162/neco_a_01198
pub_date: 2019-07-01 00:00:00
pages: 1430-1461
issue: 7
issn: 0899-7667
eissn: 1530-888X
journal_volume: 31
pub_type: Journal Article

Other articles from Neural Computation:
  • Incremental active learning for optimal generalization.

    abstract::The problem of designing input signals for optimal generalization is called active learning. In this article, we give a two-stage sampling scheme for reducing both the bias and variance, and based on this scheme, we propose two active learning methods. One is the multipoint search method applicable to arbitrary models...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976600300014773

    authors: Sugiyama M,Ogawa H

    Update date: 2000-12-01 00:00:00

  • The Deterministic Information Bottleneck.

    abstract::Lossy compression and clustering fundamentally involve a decision about which features are relevant and which are not. The information bottleneck method (IB) by Tishby, Pereira, and Bialek ( 1999 ) formalized this notion as an information-theoretic optimization problem and proposed an optimal trade-off between throwin...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00961

    authors: Strouse DJ,Schwab DJ

    Update date: 2017-06-01 00:00:00

  • A neurocomputational model for cocaine addiction.

    abstract::Based on the dopamine hypotheses of cocaine addiction and the assumption of decrement of brain reward system sensitivity after long-term drug exposure, we propose a computational model for cocaine addiction. Utilizing average reward temporal difference reinforcement learning, we incorporate the elevation of basal rewa...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2009.10-08-882

    authors: Dezfouli A,Piray P,Keramati MM,Ekhtiari H,Lucas C,Mokri A

    Update date: 2009-10-01 00:00:00

  • Statistical procedures for spatiotemporal neuronal data with applications to optical recording of the auditory cortex.

    abstract::This article presents new procedures for multisite spatiotemporal neuronal data analysis. A new statistical model - the diffusion model - is considered, whose parameters can be estimated from experimental data thanks to mean-field approximations. This work has been applied to optical recording of the guinea pig's audi...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976600300015150

    authors: François O,Abdallahi LM,Horikawa J,Taniguchi I,Hervé T

    Update date: 2000-08-01 00:00:00

  • Neuronal assembly dynamics in supervised and unsupervised learning scenarios.

    abstract::The dynamic formation of groups of neurons--neuronal assemblies--is believed to mediate cognitive phenomena at many levels, but their detailed operation and mechanisms of interaction are still to be uncovered. One hypothesis suggests that synchronized oscillations underpin their formation and functioning, with a focus...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00502

    authors: Moioli RC,Husbands P

    Update date: 2013-11-01 00:00:00

  • Abstract stimulus-specific adaptation models.

    abstract::Many neurons that initially respond to a stimulus stop responding if the stimulus is presented repeatedly but recover their response if a different stimulus is presented. This phenomenon is referred to as stimulus-specific adaptation (SSA). SSA has been investigated extensively using oddball experiments, which measure...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00077

    authors: Mill R,Coath M,Wennekers T,Denham SL

    Update date: 2011-02-01 00:00:00

  • Positive Neural Networks in Discrete Time Implement Monotone-Regular Behaviors.

    abstract::We study the expressive power of positive neural networks. The model uses positive connection weights and multiple input neurons. Different behaviors can be expressed by varying the connection weights. We show that in discrete time and in the absence of noise, the class of positive neural networks captures the so-call...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00789

    authors: Ameloot TJ,Van den Bussche J

    Update date: 2015-12-01 00:00:00

  • Distributed control of uncertain systems using superpositions of linear operators.

    abstract::Control in the natural environment is difficult in part because of uncertainty in the effect of actions. Uncertainty can be due to added motor or sensory noise, unmodeled dynamics, or quantization of sensory feedback. Biological systems are faced with further difficulties, since control must be performed by networks o...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00151

    authors: Sanger TD

    Update date: 2011-08-01 00:00:00

  • Spiking neural P systems with a generalized use of rules.

    abstract::Spiking neural P systems (SN P systems) are a class of distributed parallel computing devices inspired by spiking neurons, where the spiking rules are usually used in a sequential way (an applicable rule is applied one time at a step) or an exhaustive way (an applicable rule is applied as many times as possible at a s...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00665

    authors: Zhang X,Wang B,Pan L

    Update date: 2014-12-01 00:00:00

  • Gaussian process approach to spiking neurons for inhomogeneous Poisson inputs.

    abstract::This article presents a new theoretical framework to consider the dynamics of a stochastic spiking neuron model with general membrane response to input spike. We assume that the input spikes obey an inhomogeneous Poisson process. The stochastic process of the membrane potential then becomes a gaussian process. When a ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976601317098529

    authors: Amemori KI,Ishii S

    Update date: 2001-12-01 00:00:00

  • Neutral stability, rate propagation, and critical branching in feedforward networks.

    abstract::Recent experimental and computational evidence suggests that several dynamical properties may characterize the operating point of functioning neural networks: critical branching, neutral stability, and production of a wide range of firing patterns. We seek the simplest setting in which these properties emerge, clarify...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00461

    authors: Cayco-Gajic NA,Shea-Brown E

    Update date: 2013-07-01 00:00:00

  • Temporal sequence learning, prediction, and control: a review of different models and their relation to biological mechanisms.

    abstract::In this review, we compare methods for temporal sequence learning (TSL) across the disciplines machine-control, classical conditioning, neuronal models for TSL as well as spike-timing-dependent plasticity (STDP). This review introduces the most influential models and focuses on two questions: To what degree are reward...

    journal_title:Neural computation

    pub_type: Journal Article, Review

    doi:10.1162/0899766053011555

    authors: Wörgötter F,Porr B

    Update date: 2005-02-01 00:00:00

  • Similarity, connectionism, and the problem of representation in vision.

    abstract::A representational scheme under which the ranking between represented similarities is isomorphic to the ranking between the corresponding shape similarities can support perfectly correct shape classification because it preserves the clustering of shapes according to the natural kinds prevailing in the external world. ...

    journal_title:Neural computation

    pub_type: Journal Article, Review

    doi:10.1162/neco.1997.9.4.701

    authors: Edelman S,Duvdevani-Bar S

    Update date: 1997-05-15 00:00:00

  • Variations on the Theme of Synaptic Filtering: A Comparison of Integrate-and-Express Models of Synaptic Plasticity for Memory Lifetimes.

    abstract::Integrate-and-express models of synaptic plasticity propose that synapses integrate plasticity induction signals before expressing synaptic plasticity. By discerning trends in their induction signals, synapses can control destabilizing fluctuations in synaptic strength. In a feedforward perceptron framework with binar...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00889

    authors: Elliott T

    Update date: 2016-11-01 00:00:00

  • Adaptive Learning Algorithm Convergence in Passive and Reactive Environments.

    abstract::Although the number of artificial neural network and machine learning architectures is growing at an exponential pace, more attention needs to be paid to theoretical guarantees of asymptotic convergence for novel, nonlinear, high-dimensional adaptive learning algorithms. When properly understood, such guarantees can g...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01117

    authors: Golden RM

    Update date: 2018-10-01 00:00:00

  • Analysis of cluttered scenes using an elastic matching approach for stereo images.

    abstract::We present a system for the automatic interpretation of cluttered scenes containing multiple partly occluded objects in front of unknown, complex backgrounds. The system is based on an extended elastic graph matching algorithm that allows the explicit modeling of partial occlusions. Our approach extends an earlier sys...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2006.18.6.1441

    authors: Eckes C,Triesch J,von der Malsburg C

    Update date: 2006-06-01 00:00:00

  • Learning only when necessary: better memories of correlated patterns in networks with bounded synapses.

    abstract::Learning in a neuronal network is often thought of as a linear superposition of synaptic modifications induced by individual stimuli. However, since biological synapses are naturally bounded, a linear superposition would cause fast forgetting of previously acquired memories. Here we show that this forgetting can be av...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/0899766054615644

    authors: Senn W,Fusi S

    Update date: 2005-10-01 00:00:00

  • Optimality of Upper-Arm Reaching Trajectories Based on the Expected Value of the Metabolic Energy Cost.

    abstract::When we move our body to perform a movement task, our central nervous system selects a movement trajectory from an infinite number of possible trajectories under constraints that have been acquired through evolution and learning. Minimization of the energy cost has been suggested as a potential candidate for a constra...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00757

    authors: Taniai Y,Nishii J

    Update date: 2015-08-01 00:00:00

  • Constraint on the number of synaptic inputs to a visual cortical neuron controls receptive field formation.

    abstract::To date, Hebbian learning combined with some form of constraint on synaptic inputs has been demonstrated to describe well the development of neural networks. The previous models revealed mathematically the importance of synaptic constraints to reproduce orientation selectivity in the visual cortical neurons, but biolo...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2009.04-08-752

    authors: Tanaka S,Miyashita M

    Update date: 2009-09-01 00:00:00

  • On the performance of voltage stepping for the simulation of adaptive, nonlinear integrate-and-fire neuronal networks.

    abstract::In traditional event-driven strategies, spike timings are analytically given or calculated with arbitrary precision (up to machine precision). Exact computation is possible only for simplified neuron models, mainly the leaky integrate-and-fire model. In a recent paper, Zheng, Tonnelier, and Martinez (2009) introduced ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00112

    authors: Kaabi MG,Tonnelier A,Martinez D

    Update date: 2011-05-01 00:00:00

  • The dynamics of discrete-time computation, with application to recurrent neural networks and finite state machine extraction.

    abstract::Recurrent neural networks (RNNs) can learn to perform finite state computations. It is shown that an RNN performing a finite state computation must organize its state space to mimic the states in the minimal deterministic finite state machine that can perform that computation, and a precise description of the attracto...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.1996.8.6.1135

    authors: Casey M

    Update date: 1996-08-15 00:00:00

  • Estimating spiking irregularities under changing environments.

    abstract::We considered a gamma distribution of interspike intervals as a statistical model for neuronal spike generation. A gamma distribution is a natural extension of the Poisson process taking the effect of a refractory period into account. The model is specified by two parameters: a time-dependent firing rate and a shape p...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2006.18.10.2359

    authors: Miura K,Okada M,Amari S

    Update date: 2006-10-01 00:00:00

  • Information loss in an optimal maximum likelihood decoding.

    abstract::The mutual information between a set of stimuli and the elicited neural responses is compared to the corresponding decoded information. The decoding procedure is presented as an artificial distortion of the joint probabilities between stimuli and responses. The information loss is quantified. Whenever the probabilitie...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976602317318947

    authors: Samengo I

    Update date: 2002-04-01 00:00:00

  • Generalization and multirate models of motor adaptation.

    abstract::When subjects adapt their reaching movements in the setting of a systematic force or visual perturbation, generalization of adaptation can be assessed psychophysically in two ways: by testing untrained locations in the work space at the end of adaptation (slow postadaptation generalization) or by determining the influ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00262

    authors: Tanaka H,Krakauer JW,Sejnowski TJ

    Update date: 2012-04-01 00:00:00

  • Neural associative memory with optimal Bayesian learning.

    abstract::Neural associative memories are perceptron-like single-layer networks with fast synaptic learning typically storing discrete associations between pairs of neural activity patterns. Previous work optimized the memory capacity for various models of synaptic learning: linear Hopfield-type rules, the Willshaw model employ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00127

    authors: Knoblauch A

    Update date: 2011-06-01 00:00:00

  • Nonmonotonic generalization bias of Gaussian mixture models.

    abstract::Theories of learning and generalization hold that the generalization bias, defined as the difference between the training error and the generalization error, increases on average with the number of adaptive parameters. This article, however, shows that this general tendency is violated for a gaussian mixture model. Fo...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976600300015439

    authors: Akaho S,Kappen HJ

    Update date: 2000-06-01 00:00:00

  • Minimal model for intracellular calcium oscillations and electrical bursting in melanotrope cells of Xenopus laevis.

    abstract::A minimal model is presented to explain changes in frequency, shape, and amplitude of Ca2+ oscillations in the neuroendocrine melanotrope cell of Xenopus Laevis. It describes the cell as a plasma membrane oscillator with influx of extracellular Ca2+ via voltage-gated Ca2+ channels in the plasma membrane. The Ca2+ osci...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976601300014655

    authors: Cornelisse LN,Scheenen WJ,Koopman WJ,Roubos EW,Gielen SC

    Update date: 2001-01-01 00:00:00

  • Mean First Passage Memory Lifetimes by Reducing Complex Synapses to Simple Synapses.

    abstract::Memory models that store new memories by forgetting old ones have memory lifetimes that are rather short and grow only logarithmically in the number of synapses. Attempts to overcome these deficits include "complex" models of synaptic plasticity in which synapses possess internal states governing the expression of syn...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00956

    authors: Elliott T

    Update date: 2017-06-01 00:00:00

  • Transmission of population-coded information.

    abstract::As neural activity is transmitted through the nervous system, neuronal noise degrades the encoded information and limits performance. It is therefore important to know how information loss can be prevented. We study this question in the context of neural population codes. Using Fisher information, we show how informat...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00227

    authors: Renart A,van Rossum MC

    Update date: 2012-02-01 00:00:00

  • Modeling slowly bursting neurons via calcium store and voltage-independent calcium current.

    abstract::Recent experiments indicate that the calcium store (e.g., endoplasmic reticulum) is involved in electrical bursting and [Ca2+]i oscillation in bursting neuronal cells. In this paper, we formulate a mathematical model for bursting neurons, which includes Ca2+ in the intracellular Ca2+ stores and a voltage-independent c...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.1996.8.5.951

    authors: Chay TR

    Update date: 1996-07-01 00:00:00