abstract::We develop theoretical foundations of resonator networks, a new type of recurrent neural network introduced in Frady, Kent, Olshausen, and Sommer (2020), a companion article in this issue, to solve a high-dimensional vector factorization problem arising in Vector Symbolic Architectures. Given a composite vector formed by the Hadamard product between a discrete set of high-dimensional vectors, a resonator network can efficiently decompose the composite into these factors. We compare the performance of resonator networks against optimization-based methods, including Alternating Least Squares and several gradient-based algorithms, showing that resonator networks are superior in several important ways. This advantage is achieved by leveraging a combination of nonlinear dynamics and searching in superposition, by which estimates of the correct solution are formed from a weighted superposition of all possible solutions. While the alternative methods also search in superposition, the dynamics of resonator networks allow them to strike a more effective balance between exploring the solution space and exploiting local information to drive the network toward probable solutions. Resonator networks are not guaranteed to converge, but within a particular regime they almost always do. In exchange for relaxing the guarantee of global convergence, resonator networks are dramatically more effective at finding factorizations than all alternative approaches considered.
journal_name:Neural Comput
journal_title:Neural computation
authors:Kent SJ, Frady EP, Sommer FT, Olshausen BA
doi:10.1162/neco_a_01329
subject:Has Abstract
pub_date:2020-12-01 00:00:00
pages:2332-2388
issue:12
issn:0899-7667
eissn:1530-888X
journal_volume:32
pub_type: Journal Article
abstract::We present a reduction of a Hodgkin-Huxley (HH)-style bursting model to a hybridized integrate-and-fire (IF) formalism based on a thorough bifurcation analysis of the neuron's dynamics. The model incorporates HH-style equations to evolve the subthreshold currents and includes IF mechanisms to characterize spike even...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976603322518768
Update date:2003-12-01 00:00:00
abstract::In learning theory, the training and test sets are assumed to be drawn from the same probability distribution. This assumption is also followed in practical situations, where matching the training and test distributions is considered desirable. Contrary to conventional wisdom, we show that mismatched training and test...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00697
Update date:2015-02-01 00:00:00
abstract::A mathematical theory of interacting hypercolumns in primary visual cortex (V1) is presented that incorporates details concerning the anisotropic nature of long-range lateral connections. Each hypercolumn is modeled as a ring of interacting excitatory and inhibitory neural populations with orientation preferences over...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976602317250870
Update date:2002-03-01 00:00:00
abstract::We study the effect of competition between short-term synaptic depression and facilitation on the dynamic properties of attractor neural networks, using Monte Carlo simulation and a mean-field analysis. Depending on the balance of depression, facilitation, and the underlying noise, the network displays different behav...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2007.19.10.2739
Update date:2007-10-01 00:00:00
abstract::This article presents a reinforcement learning framework for continuous-time dynamical systems without a priori discretization of time, state, and action. Based on the Hamilton-Jacobi-Bellman (HJB) equation for infinite-horizon, discounted reward problems, we derive algorithms for estimating value functions and improv...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976600300015961
Update date:2000-01-01 00:00:00
abstract::Physiological signals such as neural spikes and heartbeats are discrete events in time, driven by continuous underlying systems. A recently introduced data-driven model to analyze such a system is a state-space model with point process observations, parameters of which and the underlying state sequence are simultaneou...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2010.07-09-1047
Update date:2010-08-01 00:00:00
abstract::The perceptron (also referred to as McCulloch-Pitts neuron, or linear threshold gate) is commonly used as a simplified model for the discrimination and learning capability of a biological neuron. Criteria that tell us when a perceptron can implement (or learn to implement) all possible dichotomies over a given set of ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2008.20.1.288
Update date:2008-01-01 00:00:00
abstract::In this work, we propose a two-layered descriptive model for motion processing from retina to the cortex, with an event-based input from the asynchronous time-based image sensor (ATIS) camera. Spatial and spatiotemporal filtering of visual scenes by motion energy detectors has been implemented in two steps in a simple...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco_a_01191
Update date:2019-06-01 00:00:00
abstract::Decision making is a complex task, and its underlying mechanisms that regulate behavior, such as the implementation of the coupling between physiological states and neural networks, are hard to decipher. To gain more insight into neural computations underlying ongoing binary decision-making tasks, we consider a neural...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco_a_01185
Update date:2019-05-01 00:00:00
abstract::In a pioneering classic, Warren McCulloch and Walter Pitts proposed a model of the central nervous system. Motivated by EEG recordings of normal brain activity, Chvátal and Goldsmith asked whether these dynamical systems can be engineered to produce trajectories that are irregular, disorderly, and apparently unpredict...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00841
Update date:2016-06-01 00:00:00
abstract::Firing rates and synchronous firing are often simultaneously relevant signals, and they independently or cooperatively represent external sensory inputs, cognitive events, and environmental situations such as body position. However, how rates and synchrony comodulate and which aspects of inputs are effectively encoded...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976606774841521
Update date:2006-01-01 00:00:00
abstract::We present a graphical model framework for decoding in the visual ERP-based speller system. The proposed framework allows researchers to build generative models from which the decoding rules are obtained in a straightforward manner. We suggest two models for generating brain signals conditioned on the stimulus events....
journal_title:Neural computation
pub_type: Letter
doi:10.1162/NECO_a_00066
Update date:2011-01-01 00:00:00
abstract::Theories of learning and generalization hold that the generalization bias, defined as the difference between the training error and the generalization error, increases on average with the number of adaptive parameters. This article, however, shows that this general tendency is violated for a gaussian mixture model. Fo...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976600300015439
Update date:2000-06-01 00:00:00
abstract::Reservoir computing is a biologically inspired class of learning algorithms in which the intrinsic dynamics of a recurrent neural network are mined to produce target time series. Most existing reservoir computing algorithms rely on fully supervised learning rules, which require access to an exact copy of the target re...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco_a_01198
Update date:2019-07-01 00:00:00
abstract::The successor representation was introduced into reinforcement learning by Dayan (1993) as a means of facilitating generalization between states with similar successors. Although reinforcement learning in general has been used extensively as a model of psychological and neural processes, the psychological validity o...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00282
Update date:2012-06-01 00:00:00
abstract::In this review, we compare methods for temporal sequence learning (TSL) across the disciplines machine-control, classical conditioning, neuronal models for TSL as well as spike-timing-dependent plasticity (STDP). This review introduces the most influential models and focuses on two questions: To what degree are reward...
journal_title:Neural computation
pub_type: Journal Article, Review
doi:10.1162/0899766053011555
Update date:2005-02-01 00:00:00
abstract::The need to reason about uncertainty in large, complex, and multimodal data sets has become increasingly common across modern scientific environments. The ability to transform samples from one distribution...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco_a_01172
Update date:2019-04-01 00:00:00
abstract::Pairwise correlations among spike trains recorded in vivo have been frequently reported. It has been argued that correlated activity could play an important role in the brain, because it efficiently modulates the response of a postsynaptic neuron. We show here that a neuron's output firing rate critically depends on t...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976603321043702
Update date:2003-01-01 00:00:00
abstract::We present a new supervised learning procedure for ensemble machines, in which outputs of predictors, trained on different distributions, are combined by a dynamic classifier combination model. This procedure may be viewed as either a version of mixture of experts (Jacobs, Jordan, Nowlan, & Hinton, 1991), applied to ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976699300016737
Update date:1999-02-15 00:00:00
abstract::We describe a model of short-term synaptic depression that is derived from a circuit implementation. The dynamics of this circuit model is similar to the dynamics of some theoretical models of short-term depression except that the recovery dynamics of the variable describing the depression is nonlinear and it also dep...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976603762552942
Update date:2003-02-01 00:00:00
abstract::This article studies a general theory of estimating functions of independent component analysis when the independent source signals are temporarily correlated. Estimating functions are used for deriving both batch and on-line learning algorithms, and they are applicable to blind cases where spatial and temporal probab...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976600300015079
Update date:2000-09-01 00:00:00
abstract::Characterizing neural spiking activity as a function of intrinsic and extrinsic factors is important in neuroscience. Point process models are valuable for capturing such information; however, the process of fully applying these models is not always obvious. A complete model application has four broad steps: specifica...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00198
Update date:2011-11-01 00:00:00
abstract::Recently there has been great interest in sparse representations of signals under the assumption that signals (data sets) can be well approximated by a linear combination of few elements of a known basis (dictionary). Many algorithms have been developed to find such representations for one-dimensional signals (vectors...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00385
Update date:2013-01-01 00:00:00
abstract::This letter deals with neural networks as dynamical systems governed by finite difference equations. It shows that the introduction of...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco_a_01165
Update date:2019-03-01 00:00:00
abstract::The hypothesis of invariant maximization of interaction (IMI) is formulated within the setting of random fields. According to this hypothesis, learning processes maximize the stochastic interaction of the neurons subject to constraints. We consider the extrinsic constraint in terms of a fixed input distribution on the...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976602760805368
Update date:2002-12-01 00:00:00
abstract::We study active learning (AL) based on gaussian processes (GPs) for efficiently enumerating all of the local minimum solutions of a black-box function. This problem is challenging because local solutions are characterized by their zero gradient and positive-definite Hessian properties, but those derivatives cannot be ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco_a_01307
Update date:2020-10-01 00:00:00
abstract::Neuroscience is progressing vigorously, and knowledge at different levels of description is rapidly accumulating. To establish relationships between results found at these different levels is one of the central challenges. In this simulation study, we demonstrate how microscopic cellular properties, taking the example...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976699300016377
Update date:1999-07-01 00:00:00
abstract::We address the problem of detecting the presence of a recurring stimulus by monitoring the voltage on a multiunit electrode located in a brain region densely populated by stimulus reactive neurons. Published experimental results suggest that under these conditions, when a stimulus is present, the measurements are gaus...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00257
Update date:2012-04-01 00:00:00
abstract::The problem of designing input signals for optimal generalization is called active learning. In this article, we give a two-stage sampling scheme for reducing both the bias and variance, and based on this scheme, we propose two active learning methods. One is the multipoint search method applicable to arbitrary models...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976600300014773
Update date:2000-12-01 00:00:00
abstract::To exhibit social intelligence, animals have to recognize whom they are communicating with. One way to make this inference is to select among internal generative models of each conspecific who may be encountered. However, these models also have to be learned via some form of Bayesian belief updating. This induces an i...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco_a_01239
Update date:2019-12-01 00:00:00