abstract::We study a model of the cortical macrocolumn consisting of a collection of inhibitorily coupled minicolumns. The proposed system overcomes several severe deficits of systems based on single neurons as cerebral functional units, notably limited robustness to damage and unrealistically large computation time. Motivated by neuroanatomical and neurophysiological findings, the utilized dynamics is based on a simple model of a spiking neuron with refractory period, fixed random excitatory interconnection within minicolumns, and instantaneous inhibition within one macrocolumn. A stability analysis of the system's dynamical equations shows that minicolumns can act as monolithic functional units for purposes of critical, fast decisions and learning. Oscillating inhibition (in the gamma frequency range) leads to a phase-coupled population rate code and high sensitivity to small imbalances in minicolumn inputs. Minicolumns are shown to be able to organize their collective inputs without supervision by Hebbian plasticity into selective receptive field shapes, thereby becoming classifiers for input patterns. Using the bars test, we critically compare our system's performance with that of others and demonstrate its ability for distributed neural coding.
journal_name:Neural Comput
journal_title:Neural computation
authors:Lücke J,von der Malsburg C
doi:10.1162/089976604772744893
subject:Has Abstract
pub_date:2004-03-01 00:00:00
pages:501-33
issue:3
issn:0899-7667
eissn:1530-888X
journal_volume:16
pub_type: Journal Article
abstract::We describe a model of short-term synaptic depression that is derived from a circuit implementation. The dynamics of this circuit model is similar to the dynamics of some theoretical models of short-term depression except that the recovery dynamics of the variable describing the depression is nonlinear and it also dep...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976603762552942
update_date:2003-02-01 00:00:00
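The abstract contrasts a circuit-derived depression model with standard theoretical models of short-term depression. A minimal sketch of such a standard model (a released-resource variable with *linear* recovery, in the style of Tsodyks-Markram; all constants here are illustrative, not the paper's) is:

```python
# Standard short-term depression: a resource fraction x is depleted by a
# factor U at each presynaptic spike and recovers linearly toward 1.
# (The circuit model in the abstract differs in having nonlinear recovery.)

def simulate_depression(spike_times, tau_rec=0.5, U=0.4, dt=0.001, t_end=2.0):
    """Euler integration of dx/dt = (1 - x)/tau_rec, with x -> x - U*x
    at each presynaptic spike. Returns the released fraction per spike."""
    x = 1.0                            # fraction of available resources
    releases = []
    spikes = sorted(spike_times)
    i = 0
    t = 0.0
    while t < t_end:
        if i < len(spikes) and t >= spikes[i]:
            releases.append(U * x)     # transmitter released by this spike
            x -= U * x                 # resources depleted
            i += 1
        x += dt * (1.0 - x) / tau_rec  # linear recovery toward 1
        t += dt
    return releases

# A regular 20 Hz spike train: successive releases shrink (depression)
rel = simulate_depression([0.05 * k for k in range(1, 11)])
```

With a regular train the released fraction decays monotonically toward a steady state, which is the classic depression signature.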
abstract::In a recent paper, Poggio and Girosi (1990) proposed a class of neural networks obtained from the theory of regularization. Regularized networks are capable of approximating arbitrarily well any continuous function on a compactum. In this paper we consider in detail the learning problem for the one-dimensional case. W...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.1995.7.6.1225
update_date:1995-11-01 00:00:00
abstract::A fast and accurate computational scheme for simulating nonlinear dynamic systems is presented. The scheme assumes that the system can be represented by a combination of components of only two different types: first-order low-pass filters and static nonlinearities. The parameters of these filters and nonlinearities ma...
journal_title:Neural computation
pub_type: Letter
doi:10.1162/neco.2008.04-07-506
update_date:2008-07-01 00:00:00
abstract::In this letter, we perform a complete and in-depth analysis of Lorentzian noises, such as those arising from [Formula: see text] and [Formula: see text] channel kinetics, in order to identify the source of [Formula: see text]-type noise in neurological membranes. We prove that the autocovariance of Lorentzian noise de...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_01067
update_date:2018-07-01 00:00:00
abstract::Slightly modified versions of an early Hebbian/anti-Hebbian neural network are shown to be capable of extracting the sparse, independent linear components of a prefiltered natural image set. An explanation for this capability in terms of a coupling between two hypothetical networks is presented. The simple networks pr...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976606775093891
update_date:2006-02-01 00:00:00
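The Hebbian building block of such networks can be sketched with Oja's normalized Hebbian rule, under which a single linear unit converges to the principal direction of its inputs (a stand-in for the learning rule only, not the paper's full Hebbian/anti-Hebbian architecture; data and constants are illustrative):

```python
import random

# Oja's rule: dw = eta * y * (x - y*w), a self-normalizing Hebbian update.
# Driven by inputs correlated along the diagonal, the weight vector should
# converge to a unit vector near (1,1)/sqrt(2).

random.seed(0)
data = []
for _ in range(2000):
    s = random.gauss(0.0, 1.0)                    # strong shared component
    n1, n2 = random.gauss(0, 0.1), random.gauss(0, 0.1)
    data.append((s + n1, s + n2))

w = [1.0, 0.0]
eta = 0.01
for x in data:
    y = w[0] * x[0] + w[1] * x[1]                 # unit's linear response
    w[0] += eta * y * (x[0] - y * w[0])           # Hebbian + normalization
    w[1] += eta * y * (x[1] - y * w[1])

norm = (w[0] ** 2 + w[1] ** 2) ** 0.5             # should approach 1
```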
abstract::The past decade has seen a rise of interest in Laplacian eigenmaps (LEMs) for nonlinear dimensionality reduction. LEMs have been used in spectral clustering, in semisupervised learning, and for providing efficient state representations for reinforcement learning. Here, we show that LEMs are closely related to slow fea...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00214
update_date:2011-12-01 00:00:00
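The object underlying an LEM embedding is the graph Laplacian L = D - W built from pairwise affinities; the embedding coordinates are then the eigenvectors of L with the smallest nonzero eigenvalues. A minimal sketch of the construction step (the eigensolver itself is omitted):

```python
# Build the unnormalized graph Laplacian L = D - W. Its row sums are zero,
# so the constant vector is the trivial eigenvector (eigenvalue 0) that is
# discarded before reading off embedding coordinates.

def graph_laplacian(W):
    n = len(W)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        degree = sum(W[i])
        for j in range(n):
            L[i][j] = (degree if i == j else 0.0) - W[i][j]
    return L

# Tiny 4-node path graph with unit edge weights
W = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
L = graph_laplacian(W)

row_sums = [sum(row) for row in L]   # all zero: L annihilates constants
```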
abstract::We show that Langevin Markov chain Monte Carlo inference in an energy-based model with latent variables has the property that the early steps of inference, starting from a stationary point, correspond to propagating error gradients into internal layers, similar to backpropagation. The backpropagated error is with resp...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00934
update_date:2017-03-01 00:00:00
abstract::Most conventional policy gradient reinforcement learning (PGRL) algorithms neglect (or do not explicitly make use of) a term in the average reward gradient with respect to the policy parameter. That term involves the derivative of the stationary state distribution that corresponds to the sensitivity of its distributio...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2009.12-08-922
update_date:2010-02-01 00:00:00
abstract::A variant of spiking neural P systems with positive or negative weights on synapses is introduced, where the rules of a neuron fire when the potential of that neuron equals a given value. The involved values-weights, firing thresholds, potential consumed by each rule-can be real (computable) numbers, rational numbers,...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00022
update_date:2010-10-01 00:00:00
abstract::A hierarchical dynamical map is proposed as the basic framework for sensory cortical mapping. To show how the hierarchical dynamical map works in cognitive processes, we applied it to a typical cognitive task known as priming, in which cognitive performance is facilitated as a consequence of prior experience. Prior to...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/08997660152469341
update_date:2001-08-01 00:00:00
abstract::The Nyström method is a well-known sampling-based technique for approximating the eigensystem of large kernel matrices. However, the chosen samples in the Nyström method are all assumed to be of equal importance, which deviates from the integral equation that defines the kernel eigenfunctions. Motivated by this observ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2008.11-07-651
update_date:2009-01-01 00:00:00
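The baseline Nyström approximation the abstract starts from can be sketched as follows: an eigenvector of a small sampled block K_mm is extended to all points through the cross-block K_nm. The toy RBF kernel, the dominant-eigenpair restriction, and the uniformly chosen landmarks (the "equal importance" sampling the abstract criticizes) are all illustrative choices here:

```python
import math

def rbf(a, b, gamma=0.5):
    return math.exp(-gamma * (a - b) ** 2)

def power_iteration(M, iters=200):
    """Dominant eigenpair of a small symmetric matrix."""
    n = len(M)
    v = [1.0 / math.sqrt(n)] * n
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        nrm = math.sqrt(sum(x * x for x in w))
        v = [x / nrm for x in w]
    lam = sum(v[i] * sum(M[i][j] * v[j] for j in range(n)) for i in range(n))
    return lam, v

pts = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
n, m = len(pts), 3
landmarks = [0, 2, 4]                       # uniformly chosen sample
K = [[rbf(a, b) for b in pts] for a in pts]
Kmm = [[K[i][j] for j in landmarks] for i in landmarks]
Knm = [[K[i][j] for j in landmarks] for i in range(n)]

lam_m, v = power_iteration(Kmm)
# Nystrom extension: approximate top eigenvector/eigenvalue of the full K
u = [math.sqrt(m / n) / lam_m * sum(Knm[i][j] * v[j] for j in range(m))
     for i in range(n)]
lam_approx = (n / m) * lam_m

lam_exact, u_exact = power_iteration(K)     # reference on the full matrix
cos = abs(sum(a * b for a, b in zip(u, u_exact))) / math.sqrt(
    sum(a * a for a in u) * sum(b * b for b in u_exact))
```

On this smooth kernel the extension is nearly exact; the point of weighted variants is to keep that accuracy when uniform sampling does not.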
abstract::Regression aims at estimating the conditional mean of output given input. However, regression is not informative enough if the conditional density is multimodal, heteroskedastic, and asymmetric. In such a case, estimating the conditional density itself is preferable, but conditional density estimation (CDE) is challen...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00683
update_date:2015-01-01 00:00:00
abstract::Firing rates and synchronous firing are often simultaneously relevant signals, and they independently or cooperatively represent external sensory inputs, cognitive events, and environmental situations such as body position. However, how rates and synchrony comodulate and which aspects of inputs are effectively encoded...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976606774841521
update_date:2006-01-01 00:00:00
abstract::In this letter, we investigate the impact of choosing different loss functions from the viewpoint of statistical learning theory. We introduce a convexity assumption, which is met by all loss functions commonly used in the literature, and study how the bound on the estimation error changes with the loss. We also deriv...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976604773135104
update_date:2004-05-01 00:00:00
abstract::We propose that replication (with mutation) of patterns of neuronal activity can occur within the brain using known neurophysiological processes. Thereby evolutionary algorithms implemented by neuronal circuits can play a role in cognition. Replication of structured neuronal representations is assumed in several cog...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00031
update_date:2010-11-01 00:00:00
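The bare ingredients of the evolutionary dynamics proposed here, replication with mutation plus selection, can be sketched abstractly (this illustrates the algorithmic skeleton only; the neuronal implementation the paper argues for is not modeled, and all parameters are illustrative):

```python
import random

# Copy-with-mutation plus truncation selection on bitstrings.
# Fitness is simply the number of 1s; survivors are copied with
# per-bit mutation, so the best fitness ratchets upward.

random.seed(1)

def mutate(bits, p=0.05):
    return [b ^ (random.random() < p) for b in bits]

def evolve(n_pop=30, n_bits=20, generations=100):
    pop = [[0] * n_bits for _ in range(n_pop)]
    for _ in range(generations):
        pop.sort(key=sum, reverse=True)                   # select the fittest
        survivors = pop[: n_pop // 2]
        pop = survivors + [mutate(s) for s in survivors]  # replicate + mutate
    return max(sum(p) for p in pop)

best = evolve()   # fitness climbs well above the all-zero start
```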
abstract::A mathematical theory of interacting hypercolumns in primary visual cortex (V1) is presented that incorporates details concerning the anisotropic nature of long-range lateral connections. Each hypercolumn is modeled as a ring of interacting excitatory and inhibitory neural populations with orientation preferences over...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976602317250870
update_date:2002-03-01 00:00:00
abstract::How do multiple feature maps that coexist in the same region of cerebral cortex align with each other? We hypothesize that such alignment is governed by temporal correlations: features in one map that are temporally correlated with those in another come to occupy the same spatial locations in cortex over time. To exam...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.1996.8.4.731
update_date:1996-05-15 00:00:00
abstract::The Kalman filter provides a simple and efficient algorithm to compute the posterior distribution for state-space models where both the latent state and measurement models are linear and gaussian. Extensions to the Kalman filter, including the extended and unscented Kalman filters, incorporate linearizations for model...
journal_title:Neural computation
pub_type: Letter
doi:10.1162/neco_a_01275
update_date:2020-05-01 00:00:00
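The linear-gaussian baseline this letter extends can be sketched in one dimension: a random-walk state observed in noise, with the familiar predict/update recursion (variable names and noise levels are illustrative):

```python
# One-dimensional Kalman filter for a random-walk state x_t observed as
# z_t = x_t + noise. q and r are process and measurement noise variances.

def kalman_1d(zs, q=0.01, r=1.0, x0=0.0, p0=10.0):
    x, p = x0, p0
    means, variances = [], []
    for z in zs:
        p = p + q                # predict (identity dynamics, random walk)
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update mean toward the measurement
        p = (1.0 - k) * p        # posterior variance shrinks
        means.append(x)
        variances.append(p)
    return means, variances

# Constant measurements: the mean converges to 1, the variance to a
# small steady-state value.
ms, vs = kalman_1d([1.0] * 20)
```

The extended and unscented filters mentioned in the abstract replace the identity dynamics and measurement map with linearizations or sigma-point approximations of nonlinear models.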
abstract::Disparity tuning of visual cells in the brain depends on the structure of their binocular receptive fields (RFs). Freeman and coworkers have found that binocular RFs of a typical simple cell can be quantitatively described by two Gabor functions with the same gaussian envelope but different phase parameters in the sin...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.1996.8.8.1611
update_date:1996-11-15 00:00:00
abstract::Field models provide an elegant mathematical framework to analyze large-scale patterns of neural activity. On the microscopic level, these models are usually based on either a firing-rate picture or integrate-and-fire dynamics. This article shows that in spite of the large conceptual differences between the two types ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/08997660260028656
update_date:2002-07-01 00:00:00
abstract::Humans learn categories of complex objects quickly and from a few examples. Random projection has been suggested as a means to learn and categorize efficiently. We investigate how random projection affects categorization by humans and by very simple neural networks on the same stimuli and categorization tasks, and how...
journal_title:Neural computation
pub_type: Letter
doi:10.1162/NECO_a_00769
update_date:2015-10-01 00:00:00
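Random projection itself can be sketched in the Johnson-Lindenstrauss style: a k x d matrix of i.i.d. gaussian entries scaled by 1/sqrt(k) approximately preserves squared lengths, and hence pairwise distances. The dimensions below are illustrative, not those used in the study:

```python
import math
import random

# Project 30-dimensional gaussian vectors down to 10 dimensions and check
# that squared norms are preserved on average (ratio near 1).

random.seed(3)
d, k, n = 30, 10, 100
P = [[random.gauss(0, 1) / math.sqrt(k) for _ in range(d)] for _ in range(k)]

def project(x):
    return [sum(P[i][j] * x[j] for j in range(d)) for i in range(k)]

ratios = []
for _ in range(n):
    x = [random.gauss(0, 1) for _ in range(d)]
    before = sum(v * v for v in x)
    after = sum(v * v for v in project(x))
    ratios.append(after / before)

avg = sum(ratios) / n   # close to 1: lengths survive the projection
```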
abstract::A mathematical model of general character for the dynamic description of coupled neural oscillators is presented. The population approach that is employed applies equally to coupled cells as to populations of such coupled cells. The formulation includes stochasticity and preserves details of precisely firing neurons....
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2007.03-07-482
update_date:2008-05-01 00:00:00
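The textbook example of coupled-oscillator dynamics, the Kuramoto model, serves as a minimal stand-in here (the letter's own population-density formulation is more detailed; parameters below are illustrative):

```python
import math
import random

# Kuramoto model: N phase oscillators, each pulled toward the population
# mean phase with strength K. Above the critical coupling the order
# parameter r = |mean(e^{i*theta})| rises toward 1 (synchrony).

random.seed(2)
N, K, dt, steps = 50, 2.0, 0.01, 2000
omega = [random.gauss(1.0, 0.1) for _ in range(N)]    # natural frequencies
theta = [random.uniform(0, 2 * math.pi) for _ in range(N)]

def order_parameter(th):
    c = sum(math.cos(t) for t in th) / len(th)
    s = sum(math.sin(t) for t in th) / len(th)
    return math.hypot(c, s)

r0 = order_parameter(theta)                           # incoherent start
for _ in range(steps):
    c = sum(math.cos(t) for t in theta) / N
    s = sum(math.sin(t) for t in theta) / N
    r, psi = math.hypot(c, s), math.atan2(s, c)
    theta = [t + dt * (w + K * r * math.sin(psi - t))
             for t, w in zip(theta, omega)]
r1 = order_parameter(theta)                           # near-synchronous end
```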
abstract::Theories of learning and generalization hold that the generalization bias, defined as the difference between the training error and the generalization error, increases on average with the number of adaptive parameters. This article, however, shows that this general tendency is violated for a gaussian mixture model. Fo...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976600300015439
update_date:2000-06-01 00:00:00
abstract::Humans have the ability to learn novel motor tasks while manipulating the environment. Several models of motor learning have been proposed in the literature, but few of them address the problem of retention and interference of motor memory. The modular selection and identification for control (MOSAIC) model, originall...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2009.03-08-721
update_date:2009-07-01 00:00:00
abstract::Generalized discriminant analysis (GDA) is an extension of the classical linear discriminant analysis (LDA) from linear domain to a nonlinear domain via the kernel trick. However, in the previous algorithm of GDA, the solutions may suffer from the degenerate eigenvalue problem (i.e., several eigenvectors with the same...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976604773717612
update_date:2004-06-01 00:00:00
abstract::Observable operator models (OOMs) are a class of models for stochastic processes that properly subsumes the class that can be modeled by finite-dimensional hidden Markov models (HMMs). One of the main advantages of OOMs over HMMs is that they admit asymptotically correct learning algorithms. A series of learning algor...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2009.10-08-878
update_date:2009-12-01 00:00:00
abstract::Neuromorphic engineering combines the architectural and computational principles of systems neuroscience with semiconductor electronics, with the aim of building efficient and compact devices that mimic the synaptic and neural machinery of the brain. The energy consumptions promised by neuromorphic engineering are ext...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00882
update_date:2016-10-01 00:00:00
abstract::Event-driven simulation strategies were proposed recently to simulate integrate-and-fire (IF) type neuronal models. These strategies can lead to computationally efficient algorithms for simulating large-scale networks of neurons; most important, such approaches are more precise than traditional clock-driven numerical ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2006.18.9.2146
update_date:2006-09-01 00:00:00
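The clock-driven baseline that event-driven schemes improve on can be sketched with a fixed-step leaky integrate-and-fire simulation; for constant input the interspike interval is known in closed form, which makes the grid-discretization error visible (parameters are illustrative):

```python
import math

# Fixed-step (clock-driven) leaky integrate-and-fire neuron. Event-driven
# schemes instead solve for the threshold-crossing times exactly, avoiding
# the O(dt) error introduced by the time grid.

def lif(i_ext=1.5, tau=0.02, v_th=1.0, v_reset=0.0, dt=1e-4, t_end=1.0):
    v, t = 0.0, 0.0
    spike_times = []
    while t < t_end:
        v += dt * (i_ext - v) / tau       # Euler step of dv/dt = (I - v)/tau
        if v >= v_th:
            spike_times.append(t)
            v = v_reset                   # reset after each spike
        t += dt
    return spike_times

spikes = lif()
# Exact interspike interval for constant input: T = tau * ln(I / (I - v_th))
T_exact = 0.02 * math.log(1.5 / 0.5)
isi = spikes[1] - spikes[0]               # simulated ISI, close to T_exact
```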
abstract::The motion of an object (such as a wheel rotating) is seen as consistent independent of its position and size on the retina. Neurons in higher cortical visual areas respond to these global motion stimuli invariantly, but neurons in early cortical areas with small receptive fields cannot represent this motion, not only...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2007.19.1.139
update_date:2007-01-01 00:00:00
abstract::One standard interpretation of networks of cortical neurons is that they form dynamical attractors. Computations such as stimulus estimation are performed by mapping inputs to points on the networks' attractive manifolds. These points represent population codes for the stimulus values. However, this standard interpret...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00051
update_date:2010-12-01 00:00:00