abstract::We develop a group-theoretical analysis of slow feature analysis for the case where the input data are generated by applying a set of continuous transformations to static templates. As an application of the theory, we analytically derive nonlinear visual receptive fields and show that their optimal stimuli, as well as the orientation and frequency tuning, are in good agreement with previous simulations of complex cells in primary visual cortex (Berkes and Wiskott, 2005). The theory suggests that side and end stopping can be interpreted as a weak breaking of translation invariance. Direction selectivity is also discussed.
journal_name:Neural Comput
journal_title:Neural computation
authors:Sprekeler H, Wiskott L
doi:10.1162/NECO_a_00072
subject:Has Abstract
pub_date:2011-02-01 00:00:00
pages:303-35
issue:2
eissn:0899-7667
issn:1530-888X
journal_volume:23
pub_type: Journal article
abstract::The prevalence of coherent oscillations in various frequency ranges in the central nervous system raises the question of the mechanisms that synchronize large populations of neurons. We study synchronization in models of large networks of spiking neurons with random sparse connectivity. Synchrony occurs only when the ...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/089976600300015529
Update date:2000-05-01 00:00:00
abstract::We study active learning (AL) based on gaussian processes (GPs) for efficiently enumerating all of the local minimum solutions of a black-box function. This problem is challenging because local solutions are characterized by their zero gradient and positive-definite Hessian properties, but those derivatives cannot be ...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/neco_a_01307
Update date:2020-10-01 00:00:00
abstract::Primary visual cortical complex cells are thought to serve as invariant feature detectors and to provide input to higher cortical areas. We propose a single model for learning the connectivity required by complex cells that integrates two factors that have been hypothesized to play a role in the development of invaria...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/NECO_a_00743
Update date:2015-07-01 00:00:00
abstract::Recent work suggests that synchronization of neuronal activity could serve to define functionally relevant relationships between spatially distributed cortical neurons. At present, it is not known to what extent this hypothesis is compatible with the widely supported notion of coarse coding, which assumes that feature...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/neco.1995.7.3.469
Update date:1995-05-01 00:00:00
abstract::In this letter, we investigate the impact of choosing different loss functions from the viewpoint of statistical learning theory. We introduce a convexity assumption, which is met by all loss functions commonly used in the literature, and study how the bound on the estimation error changes with the loss. We also deriv...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/089976604773135104
Update date:2004-05-01 00:00:00
abstract::We propose a scalable semiparametric Bayesian model to capture dependencies among multiple neurons by detecting their cofiring (possibly with some lag time) patterns over time. After discretizing time so there is at most one spike at each interval, the resulting sequence of 1s (spike) and 0s (silence) for each neuron ...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/NECO_a_00631
Update date:2014-09-01 00:00:00
abstract::For the paradigmatic case of bimanual coordination, we review levels of organization of behavioral dynamics and present a description in terms of modes of behavior. We briefly review a recently developed model of spatiotemporal brain activity that is based on short- and long-range connectivity of neural ensembles. Thi...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/089976698300016954
Update date:1998-11-15 00:00:00
abstract::Firing rates and synchronous firing are often simultaneously relevant signals, and they independently or cooperatively represent external sensory inputs, cognitive events, and environmental situations such as body position. However, how rates and synchrony comodulate and which aspects of inputs are effectively encoded...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/089976606774841521
Update date:2006-01-01 00:00:00
abstract::Inner-product operators, often referred to as kernels in statistical learning, define a mapping from some input space into a feature space. The focus of this letter is the construction of biologically motivated kernels for cortical activities. The kernels we derive, termed Spikernels, map spike count sequences into an...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/0899766053019944
Update date:2005-03-01 00:00:00
abstract::The role of correlations between neuronal responses is crucial to understanding the neural code. A framework used to study this role comprises a breakdown of the mutual information between stimuli and responses into terms that aim to account for different coding modalities and the distinction between different notions...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/NECO_a_00588
Update date:2014-06-01 00:00:00
abstract::Most conventional policy gradient reinforcement learning (PGRL) algorithms neglect (or do not explicitly make use of) a term in the average reward gradient with respect to the policy parameter. That term involves the derivative of the stationary state distribution that corresponds to the sensitivity of its distributio...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/neco.2009.12-08-922
Update date:2010-02-01 00:00:00
abstract::It has been suggested that reactivation of previously acquired experiences or stored information in declarative memories in the hippocampus and neocortex contributes to memory consolidation and learning. Understanding memory consolidation depends crucially on the development of robust statistical methods for assessing...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/neco_a_01090
Update date:2018-08-01 00:00:00
abstract::We argue that when faced with big data sets, learning and inference algorithms should compute updates using only subsets of data items. We introduce algorithms that use sequential hypothesis tests to adaptively select such a subset of data points. The statistical properties of this subsampling process can be used to c...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/NECO_a_00796
Update date:2016-01-01 00:00:00
abstract::For any memoryless communication channel with a binary-valued input and a one-dimensional real-valued output, we introduce a probabilistic lower bound on the mutual information given empirical observations on the channel. The bound is built on the Dvoretzky-Kiefer-Wolfowitz inequality and is distribution free. A quadr...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/NECO_a_00144
Update date:2011-07-01 00:00:00
abstract::Humans learn categories of complex objects quickly and from a few examples. Random projection has been suggested as a means to learn and categorize efficiently. We investigate how random projection affects categorization by humans and by very simple neural networks on the same stimuli and categorization tasks, and how...
journal_title:Neural computation
pub_type: Letter
doi:10.1162/NECO_a_00769
Update date:2015-10-01 00:00:00
abstract::The goal of sufficient dimension reduction in supervised learning is to find the low-dimensional subspace of input features that contains all of the information about the output values that the input features possess. In this letter, we propose a novel sufficient dimension-reduction method using a squared-loss variant...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/NECO_a_00407
Update date:2013-03-01 00:00:00
abstract::When subjects adapt their reaching movements in the setting of a systematic force or visual perturbation, generalization of adaptation can be assessed psychophysically in two ways: by testing untrained locations in the work space at the end of adaptation (slow postadaptation generalization) or by determining the influ...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/NECO_a_00262
Update date:2012-04-01 00:00:00
abstract::Decision making is a complex task, and its underlying mechanisms that regulate behavior, such as the implementation of the coupling between physiological states and neural networks, are hard to decipher. To gain more insight into neural computations underlying ongoing binary decision-making tasks, we consider a neural...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/neco_a_01185
Update date:2019-05-01 00:00:00
abstract::The pyloric network of the stomatogastric ganglion in crustacea is a central pattern generator that can produce the same basic rhythm over a wide frequency range. Three electrically coupled neurons, the anterior burster (AB) neuron and two pyloric dilator (PD) neurons, act as a pacemaker unit for the pyloric network. ...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/neco.1991.3.4.487
Update date:1991-01-01 00:00:00
abstract::Memory models that store new memories by forgetting old ones have memory lifetimes that are rather short and grow only logarithmically in the number of synapses. Attempts to overcome these deficits include "complex" models of synaptic plasticity in which synapses possess internal states governing the expression of syn...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/NECO_a_00956
Update date:2017-06-01 00:00:00
abstract::In a recent paper, Poggio and Girosi (1990) proposed a class of neural networks obtained from the theory of regularization. Regularized networks are capable of approximating arbitrarily well any continuous function on a compactum. In this paper we consider in detail the learning problem for the one-dimensional case. W...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/neco.1995.7.6.1225
Update date:1995-11-01 00:00:00
abstract::Inspired by recent studies regarding dendritic computation, we constructed a recurrent neural network model incorporating dendritic lateral inhibition. Our model consists of an input layer and a neuron layer that includes excitatory cells and an inhibitory cell; this inhibitory cell is activated by the pooled activiti...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/neco.2007.19.7.1798
Update date:2007-07-01 00:00:00
abstract::Although the commonly used quadratic Hebbian-anti-Hebbian rules lead to successful models of plasticity and learning, they are inconsistent with neurophysiology. Other rules, more physiologically plausible, fail to specify the biological mechanism of bidirectionality and the biological mechanism that prevents synapses...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/089976698300017629
Update date:1998-04-01 00:00:00
abstract::We propose a modular reinforcement learning architecture for nonlinear, nonstationary control tasks, which we call multiple model-based reinforcement learning (MMRL). The basic idea is to decompose a complex task into multiple domains in space and time based on the predictability of the environmental dynamics. The sys...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/089976602753712972
Update date:2002-06-01 00:00:00
abstract::This article studies a general theory of estimating functions of independent component analysis when the independent source signals are temporally correlated. Estimating functions are used for deriving both batch and on-line learning algorithms, and they are applicable to blind cases where spatial and temporal probab...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/089976600300015079
Update date:2000-09-01 00:00:00
abstract::We consider learning a causal ordering of variables in a linear nongaussian acyclic model called LiNGAM. Several methods have been shown to consistently estimate a causal ordering assuming that all the model assumptions are correct. But the estimation results could be distorted if some assumptions are violated. In thi...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/NECO_a_00533
Update date:2014-01-01 00:00:00
abstract::We propose a new principle for replicating receptive field properties of neurons in the primary visual cortex. We derive a learning rule for a feedforward network, which maintains a low firing rate for the output neurons (resulting in temporal sparseness) and allows only a small subset of the neurons in the network to...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/NECO_a_00341
Update date:2012-10-01 00:00:00
abstract::The statistical dependencies that independent component analysis (ICA) cannot remove often provide rich information beyond the linear independent components. It would thus be very useful to estimate the dependency structure from data. While such models have been proposed, they have usually concentrated on higher-order...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/neco_a_01006
Update date:2017-11-01 00:00:00
abstract::We study both analytically and numerically the effect of presynaptic noise on the transmission of information in attractor neural networks. The noise occurs on a very short timescale compared to that for the neuron dynamics, and it produces short-time synaptic depression. This is inspired by recent neurobiological find...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/089976606775623342
Update date:2006-03-01 00:00:00
abstract::A single-layered Hough transform network is proposed that accepts image coordinates of each object pixel as input and produces a set of outputs that indicate the belongingness of the pixel to a particular structure (e.g., a straight line). The network is able to learn adaptively the parametric forms of the linear segm...
journal_title:Neural computation
pub_type: Journal article
doi:10.1162/089976601300014501
Update date:2001-03-01 00:00:00