An amplitude equation approach to contextual effects in visual cortex.

Abstract:

A mathematical theory of interacting hypercolumns in primary visual cortex (V1) is presented that incorporates details concerning the anisotropic nature of long-range lateral connections. Each hypercolumn is modeled as a ring of interacting excitatory and inhibitory neural populations with orientation preferences over the range 0 to 180 degrees. Analytical methods from bifurcation theory are used to derive nonlinear equations for the amplitude and phase of the population tuning curves in which the effective lateral interactions are linear in the amplitudes. These amplitude equations describe how mutual interactions between hypercolumns via lateral connections modify the response of each hypercolumn to modulated inputs from the lateral geniculate nucleus; such interactions form the basis of contextual effects. The coupled ring model is shown to reproduce a number of orientation-dependent and contrast-dependent features observed in center-surround experiments. A major prediction of the model is that the anisotropy in lateral connections results in a nonuniform modulatory effect of the surround that is correlated with the orientation of the center.

journal_name: Neural Comput

journal_title: Neural computation

authors: Bressloff PC,Cowan JD

doi: 10.1162/089976602317250870

subject: Has Abstract

pub_date: 2002-03-01 00:00:00

pages: 493-525

issue: 3

issn: 0899-7667

eissn: 1530-888X

journal_volume: 14

pub_type: Journal Article

Related articles:
  • Hidden Quantum Processes, Quantum Ion Channels, and 1/fθ-Type Noise.

    abstract::In this letter, we perform a complete and in-depth analysis of Lorentzian noises, such as those arising from [Formula: see text] and [Formula: see text] channel kinetics, in order to identify the source of [Formula: see text]-type noise in neurological membranes. We prove that the autocovariance of Lorentzian noise de...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_01067

    authors: Paris A,Vosoughi A,Berman SA,Atia G

    update_date:2018-07-01 00:00:00

  • Mirror symmetric topographic maps can arise from activity-dependent synaptic changes.

    abstract::Multiple adjacent, roughly mirror-image topographic maps are commonly observed in the sensory neocortex of many species. The cortical regions occupied by these maps are generally believed to be determined initially by genetically controlled chemical markers during development, with thalamocortical afferent activity su...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/0899766053491904

    authors: Schulz R,Reggia JA

    update_date:2005-05-01 00:00:00

  • Sequential triangle strip generator based on Hopfield networks.

    abstract::The important task of generating the minimum number of sequential triangle strips (tristrips) for a given triangulated surface model is motivated by applications in computer graphics. This hard combinatorial optimization problem is reduced to the minimum energy problem in Hopfield nets by a linear-size construction. I...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2008.10-07-623

    authors: Síma J,Lnĕnicka R

    update_date:2009-02-01 00:00:00

  • Are loss functions all the same?

    abstract::In this letter, we investigate the impact of choosing different loss functions from the viewpoint of statistical learning theory. We introduce a convexity assumption, which is met by all loss functions commonly used in the literature, and study how the bound on the estimation error changes with the loss. We also deriv...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976604773135104

    authors: Rosasco L,De Vito E,Caponnetto A,Piana M,Verri A

    update_date:2004-05-01 00:00:00

  • ISO learning approximates a solution to the inverse-controller problem in an unsupervised behavioral paradigm.

    abstract::In "Isotropic Sequence Order Learning" (pp. 831-864 in this issue), we introduced a novel algorithm for temporal sequence learning (ISO learning). Here, we embed this algorithm into a formal nonevaluating (teacher free) environment, which establishes a sensor-motor feedback. The system is initially guided by a fixed r...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/08997660360581930

    authors: Porr B,von Ferber C,Wörgötter F

    update_date:2003-04-01 00:00:00

  • An oscillatory Hebbian network model of short-term memory.

    abstract::Recurrent neural architectures having oscillatory dynamics use rhythmic network activity to represent patterns stored in short-term memory. Multiple stored patterns can be retained in memory over the same neural substrate because the network's state persistently switches between them. Here we present a simple oscillat...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2008.02-08-715

    authors: Winder RK,Reggia JA,Weems SA,Bunting MF

    update_date:2009-03-01 00:00:00

  • Conditional density estimation with dimensionality reduction via squared-loss conditional entropy minimization.

    abstract::Regression aims at estimating the conditional mean of output given input. However, regression is not informative enough if the conditional density is multimodal, heteroskedastic, and asymmetric. In such a case, estimating the conditional density itself is preferable, but conditional density estimation (CDE) is challen...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00683

    authors: Tangkaratt V,Xie N,Sugiyama M

    update_date:2015-01-01 00:00:00

  • Variations on the Theme of Synaptic Filtering: A Comparison of Integrate-and-Express Models of Synaptic Plasticity for Memory Lifetimes.

    abstract::Integrate-and-express models of synaptic plasticity propose that synapses integrate plasticity induction signals before expressing synaptic plasticity. By discerning trends in their induction signals, synapses can control destabilizing fluctuations in synaptic strength. In a feedforward perceptron framework with binar...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00889

    authors: Elliott T

    update_date:2016-11-01 00:00:00

  • Analytical integrate-and-fire neuron models with conductance-based dynamics for event-driven simulation strategies.

    abstract::Event-driven simulation strategies were proposed recently to simulate integrate-and-fire (IF) type neuronal models. These strategies can lead to computationally efficient algorithms for simulating large-scale networks of neurons; most important, such approaches are more precise than traditional clock-driven numerical ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2006.18.9.2146

    authors: Rudolph M,Destexhe A

    update_date:2006-09-01 00:00:00

  • A neurocomputational model for cocaine addiction.

    abstract::Based on the dopamine hypotheses of cocaine addiction and the assumption of decrement of brain reward system sensitivity after long-term drug exposure, we propose a computational model for cocaine addiction. Utilizing average reward temporal difference reinforcement learning, we incorporate the elevation of basal rewa...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2009.10-08-882

    authors: Dezfouli A,Piray P,Keramati MM,Ekhtiari H,Lucas C,Mokri A

    update_date:2009-10-01 00:00:00

  • Analyzing and Accelerating the Bottlenecks of Training Deep SNNs With Backpropagation.

    abstract::Spiking neural networks (SNNs) with the event-driven manner of transmitting spikes consume ultra-low power on neuromorphic chips. However, training deep SNNs is still challenging compared to convolutional neural networks (CNNs). The SNN training algorithms have not achieved the same performance as CNNs. In this letter...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01319

    authors: Chen R,Li L

    update_date:2020-12-01 00:00:00

  • Boosted mixture of experts: an ensemble learning scheme.

    abstract::We present a new supervised learning procedure for ensemble machines, in which outputs of predictors, trained on different distributions, are combined by a dynamic classifier combination model. This procedure may be viewed as either a version of mixture of experts (Jacobs, Jordan, Nowlan, & Hinton, 1991), applied to ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976699300016737

    authors: Avnimelech R,Intrator N

    update_date:1999-02-15 00:00:00

  • Synchrony in heterogeneous networks of spiking neurons.

    abstract::The emergence of synchrony in the activity of large, heterogeneous networks of spiking neurons is investigated. We define the robustness of synchrony by the critical disorder at which the asynchronous state becomes linearly unstable. We show that at low firing rates, synchrony is more robust in excitatory networks tha...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976600300015286

    authors: Neltner L,Hansel D,Mato G,Meunier C

    update_date:2000-07-01 00:00:00

  • Dynamics of learning near singularities in layered networks.

    abstract::We explicitly analyze the trajectories of learning near singularities in hierarchical networks, such as multilayer perceptrons and radial basis function networks, which include permutation symmetry of hidden nodes, and show their general properties. Such symmetry induces singularities in their parameter space, where t...

    journal_title:Neural computation

    pub_type: Letter

    doi:10.1162/neco.2007.12-06-414

    authors: Wei H,Zhang J,Cousseau F,Ozeki T,Amari S

    update_date:2008-03-01 00:00:00

  • Neural integrator: a sandpile model.

    abstract::We investigated a model for the neural integrator based on hysteretic units connected by positive feedback. Hysteresis is assumed to emerge from the intrinsic properties of the cells. We consider the recurrent networks containing either bistable or multistable neurons. We apply our analysis to the oculomotor velocity-...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2008.12-06-416

    authors: Nikitchenko M,Koulakov A

    update_date:2008-10-01 00:00:00

  • On the use of analytical expressions for the voltage distribution to analyze intracellular recordings.

    abstract::Different analytical expressions for the membrane potential distribution of membranes subject to synaptic noise have been proposed and can be very helpful in analyzing experimental data. However, all of these expressions are either approximations or limit cases, and it is not clear how they compare and which expressio...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2006.18.12.2917

    authors: Rudolph M,Destexhe A

    update_date:2006-12-01 00:00:00

  • Distributed control of uncertain systems using superpositions of linear operators.

    abstract::Control in the natural environment is difficult in part because of uncertainty in the effect of actions. Uncertainty can be due to added motor or sensory noise, unmodeled dynamics, or quantization of sensory feedback. Biological systems are faced with further difficulties, since control must be performed by networks o...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00151

    authors: Sanger TD

    update_date:2011-08-01 00:00:00

  • Hybrid integrate-and-fire model of a bursting neuron.

    abstract::We present a reduction of a Hodgkin-Huxley (HH)--style bursting model to a hybridized integrate-and-fire (IF) formalism based on a thorough bifurcation analysis of the neuron's dynamics. The model incorporates HH--style equations to evolve the subthreshold currents and includes IF mechanisms to characterize spike even...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976603322518768

    authors: Breen BJ,Gerken WC,Butera RJ Jr

    update_date:2003-12-01 00:00:00

  • Neural associative memory with optimal Bayesian learning.

    abstract::Neural associative memories are perceptron-like single-layer networks with fast synaptic learning typically storing discrete associations between pairs of neural activity patterns. Previous work optimized the memory capacity for various models of synaptic learning: linear Hopfield-type rules, the Willshaw model employ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00127

    authors: Knoblauch A

    update_date:2011-06-01 00:00:00

  • Bayesian framework for least-squares support vector machine classifiers, gaussian processes, and kernel Fisher discriminant analysis.

    abstract::The Bayesian evidence framework has been successfully applied to the design of multilayer perceptrons (MLPs) in the work of MacKay. Nevertheless, the training of MLPs suffers from drawbacks like the nonconvex optimization problem and the choice of the number of hidden units. In support vector machines (SVMs) for class...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976602753633411

    authors: Van Gestel T,Suykens JA,Lanckriet G,Lambrechts A,De Moor B,Vandewalle J

    update_date:2002-05-01 00:00:00

  • A Distributed Framework for the Construction of Transport Maps.

    abstract::The need to reason about uncertainty in large, complex, and multimodal data sets has become increasingly common across modern scientific environments. The ability to transform samples from one distribution P to another distribution

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01172

    authors: Mesa DA,Tantiongloc J,Mendoza M,Kim S,P Coleman T

    update_date:2019-04-01 00:00:00

  • Information loss in an optimal maximum likelihood decoding.

    abstract::The mutual information between a set of stimuli and the elicited neural responses is compared to the corresponding decoded information. The decoding procedure is presented as an artificial distortion of the joint probabilities between stimuli and responses. The information loss is quantified. Whenever the probabilitie...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976602317318947

    authors: Samengo I

    update_date:2002-04-01 00:00:00

  • Evaluating auditory performance limits: II. One-parameter discrimination with random-level variation.

    abstract::Previous studies have combined analytical models of stochastic neural responses with signal detection theory (SDT) to predict psychophysical performance limits; however, these studies have typically been limited to simple models and simple psychophysical tasks. A companion article in this issue ("Evaluating Auditory P...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976601750541813

    authors: Heinz MG,Colburn HS,Carney LH

    update_date:2001-10-01 00:00:00

  • Making the error-controlling algorithm of observable operator models constructive.

    abstract::Observable operator models (OOMs) are a class of models for stochastic processes that properly subsumes the class that can be modeled by finite-dimensional hidden Markov models (HMMs). One of the main advantages of OOMs over HMMs is that they admit asymptotically correct learning algorithms. A series of learning algor...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2009.10-08-878

    authors: Zhao MJ,Jaeger H,Thon M

    update_date:2009-12-01 00:00:00

  • General Poisson exact breakdown of the mutual information to study the role of correlations in populations of neurons.

    abstract::We present an integrative formalism of mutual information expansion, the general Poisson exact breakdown, which explicitly evaluates the informational contribution of correlations in the spike counts both between and within neurons. The formalism was validated on simulated data and applied to real neurons recorded fro...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2010.04-09-989

    authors: Scaglione A,Moxon KA,Foffani G

    update_date:2010-06-01 00:00:00

  • Mismatched training and test distributions can outperform matched ones.

    abstract::In learning theory, the training and test sets are assumed to be drawn from the same probability distribution. This assumption is also followed in practical situations, where matching the training and test distributions is considered desirable. Contrary to conventional wisdom, we show that mismatched training and test...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00697

    authors: González CR,Abu-Mostafa YS

    update_date:2015-02-01 00:00:00

  • Computing sparse representations of multidimensional signals using Kronecker bases.

    abstract::Recently there has been great interest in sparse representations of signals under the assumption that signals (data sets) can be well approximated by a linear combination of few elements of a known basis (dictionary). Many algorithms have been developed to find such representations for one-dimensional signals (vectors...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00385

    authors: Caiafa CF,Cichocki A

    update_date:2013-01-01 00:00:00

  • The successor representation and temporal context.

    abstract::The successor representation was introduced into reinforcement learning by Dayan ( 1993 ) as a means of facilitating generalization between states with similar successors. Although reinforcement learning in general has been used extensively as a model of psychological and neural processes, the psychological validity o...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00282

    authors: Gershman SJ,Moore CD,Todd MT,Norman KA,Sederberg PB

    update_date:2012-06-01 00:00:00

  • Information recall using relative spike timing in a spiking neural network.

    abstract::We present a neural network that is capable of completing and correcting a spiking pattern given only a partial, noisy version. It operates in continuous time and represents information using the relative timing of individual spikes. The network is capable of correcting and recalling multiple patterns simultaneously. ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00306

    authors: Sterne P

    update_date:2012-08-01 00:00:00

  • On the relation of slow feature analysis and Laplacian eigenmaps.

    abstract::The past decade has seen a rise of interest in Laplacian eigenmaps (LEMs) for nonlinear dimensionality reduction. LEMs have been used in spectral clustering, in semisupervised learning, and for providing efficient state representations for reinforcement learning. Here, we show that LEMs are closely related to slow fea...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00214

    authors: Sprekeler H

    update_date:2011-12-01 00:00:00