Cortical spatiotemporal dimensionality reduction for visual grouping.

Abstract:

The visual systems of many mammals, including humans, are able to integrate the geometric information of visual stimuli and perform cognitive tasks at the first stages of cortical processing. This is thought to result from a combination of mechanisms, including feature extraction at the single-cell level and geometric processing by means of cell connectivity. We present a geometric model of such connectivities in the space of detected features associated with spatiotemporal visual stimuli and show how they can be used to obtain low-level object segmentation. The main idea is to define a spectral clustering procedure with anisotropic affinities over data sets consisting of embeddings of the visual stimuli into higher-dimensional spaces. Neural plausibility of the proposed arguments is also discussed.
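The procedure described above can be illustrated with a minimal, self-contained sketch: spectral clustering driven by an anisotropic (direction-dependent) affinity. This is not the authors' implementation; the point set, the metric `M`, and all parameters below are illustrative assumptions.

```python
# Sketch of spectral clustering with an anisotropic affinity.
# NOT the paper's implementation: data, metric M, and parameters are assumptions.
import numpy as np

def anisotropic_affinity(X, M, sigma=1.0):
    """A_ij = exp(-d_M(x_i, x_j)^2 / (2 sigma^2)) with a
    Mahalanobis-type metric M, so affinity decays faster in some directions."""
    diff = X[:, None, :] - X[None, :, :]
    d2 = np.einsum('ijp,pq,ijq->ij', diff, M, diff)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kmeans(U, k, iters=100, seed=0):
    """Tiny k-means on the rows of U (sufficient for this demo)."""
    rng = np.random.default_rng(seed)
    C = U[rng.choice(len(U), size=k, replace=False)].copy()
    labels = np.zeros(len(U), dtype=int)
    for _ in range(iters):
        labels = np.argmin(((U[:, None, :] - C[None, :, :]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                C[j] = U[labels == j].mean(axis=0)
    return labels

def spectral_cluster(X, M, k):
    A = anisotropic_affinity(X, M)
    d = A.sum(axis=1)
    L = np.eye(len(X)) - A / np.sqrt(np.outer(d, d))  # normalized Laplacian
    _, V = np.linalg.eigh(L)                          # eigenvalues ascending
    U = V[:, :k]                                      # k smallest eigenvectors
    U = U / np.linalg.norm(U, axis=1, keepdims=True)  # row-normalize (NJW step)
    return kmeans(U, k)

# Two elongated "contours": M penalizes displacement across the contours (y)
# far more than along them (x), so grouping follows the elongation.
rng = np.random.default_rng(1)
g1 = rng.normal(loc=[0.0, 0.0], scale=[1.0, 0.1], size=(30, 2))
g2 = rng.normal(loc=[0.0, 2.0], scale=[1.0, 0.1], size=(30, 2))
X = np.vstack([g1, g2])
M = np.diag([0.2, 5.0])  # anisotropy: weak along x, strong along y
labels = spectral_cluster(X, M, k=2)
```

With an isotropic metric (`M` proportional to the identity) the same pipeline reduces to standard spectral clustering; the anisotropy is what lets the grouping follow the geometry of the stimulus.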

journal_name: Neural Comput
journal_title: Neural computation
authors: Cocci G,Barbieri D,Citti G,Sarti A
doi: 10.1162/NECO_a_00738
subject: Has Abstract
pub_date: 2015-06-01 00:00:00
pages: 1252-93
issue: 6
issn: 0899-7667
eissn: 1530-888X
journal_volume: 27
pub_type: Journal Article

  • On the performance of voltage stepping for the simulation of adaptive, nonlinear integrate-and-fire neuronal networks.

    abstract::In traditional event-driven strategies, spike timings are analytically given or calculated with arbitrary precision (up to machine precision). Exact computation is possible only for simplified neuron models, mainly the leaky integrate-and-fire model. In a recent paper, Zheng, Tonnelier, and Martinez (2009) introduced ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00112

    authors: Kaabi MG,Tonnelier A,Martinez D

    update_date: 2011-05-01 00:00:00

  • Synchronized firings in the networks of class 1 excitable neurons with excitatory and inhibitory connections and their dependences on the forms of interactions.

    abstract::Synchronized firings in the networks of class 1 excitable neurons with excitatory and inhibitory connections are investigated, and their dependences on the forms of interactions are analyzed. As the forms of interactions, we treat the double exponential coupling and the interactions derived from it: pulse coupling, ex...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/0899766053630387

    authors: Kanamaru T,Sekine M

    update_date: 2005-06-01 00:00:00

  • Nonlinear complex-valued extensions of Hebbian learning: an essay.

    abstract::The Hebbian paradigm is perhaps the best-known unsupervised learning theory in connectionism. It has inspired wide research activity in the artificial neural network field because it embodies some interesting properties such as locality and the capability of being applicable to the basic weight-and-sum structure of ne...

    journal_title:Neural computation

    pub_type: Journal Article, Review

    doi:10.1162/0899766053429381

    authors: Fiori S

    update_date: 2005-04-01 00:00:00

  • NMDA Receptor Alterations After Mild Traumatic Brain Injury Induce Deficits in Memory Acquisition and Recall.

    abstract::Mild traumatic brain injury (mTBI) presents a significant health concern with potential persisting deficits that can last decades. Although a growing body of literature improves our understanding of the brain network response and corresponding underlying cellular alterations after injury, the effects of cellular disru...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01343

    authors: Gabrieli D,Schumm SN,Vigilante NF,Meaney DF

    update_date: 2021-01-01 00:00:00

  • Linking Neuromodulated Spike-Timing Dependent Plasticity with the Free-Energy Principle.

    abstract::The free-energy principle is a candidate unified theory for learning and memory in the brain that predicts that neurons, synapses, and neuromodulators work in a manner that minimizes free energy. However, electrophysiological data elucidating the neural and synaptic bases for this theory are lacking. Here, we propose ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00862

    authors: Isomura T,Sakai K,Kotani K,Jimbo Y

    update_date: 2016-09-01 00:00:00

  • Statistical computer model analysis of the reciprocal and recurrent inhibitions of the Ia-EPSP in α-motoneurons.

    abstract::We simulate the inhibition of Ia-glutamatergic excitatory postsynaptic potential (EPSP) by preceding it with glycinergic recurrent (REN) and reciprocal (REC) inhibitory postsynaptic potentials (IPSPs). The inhibition is evaluated in the presence of voltage-dependent conductances of sodium, delayed rectifier potassium,...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00375

    authors: Gradwohl G,Grossman Y

    update_date: 2013-01-01 00:00:00

  • Making the error-controlling algorithm of observable operator models constructive.

    abstract::Observable operator models (OOMs) are a class of models for stochastic processes that properly subsumes the class that can be modeled by finite-dimensional hidden Markov models (HMMs). One of the main advantages of OOMs over HMMs is that they admit asymptotically correct learning algorithms. A series of learning algor...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2009.10-08-878

    authors: Zhao MJ,Jaeger H,Thon M

    update_date: 2009-12-01 00:00:00

  • Bayesian Filtering with Multiple Internal Models: Toward a Theory of Social Intelligence.

    abstract::To exhibit social intelligence, animals have to recognize whom they are communicating with. One way to make this inference is to select among internal generative models of each conspecific who may be encountered. However, these models also have to be learned via some form of Bayesian belief updating. This induces an i...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01239

    authors: Isomura T,Parr T,Friston K

    update_date: 2019-12-01 00:00:00

  • Oscillating Networks: Control of Burst Duration by Electrically Coupled Neurons.

    abstract::The pyloric network of the stomatogastric ganglion in crustacea is a central pattern generator that can produce the same basic rhythm over a wide frequency range. Three electrically coupled neurons, the anterior burster (AB) neuron and two pyloric dilator (PD) neurons, act as a pacemaker unit for the pyloric network. ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.1991.3.4.487

    authors: Abbott LF,Marder E,Hooper SL

    update_date: 1991-01-01 00:00:00

  • Selectivity and stability via dendritic nonlinearity.

    abstract::Inspired by recent studies regarding dendritic computation, we constructed a recurrent neural network model incorporating dendritic lateral inhibition. Our model consists of an input layer and a neuron layer that includes excitatory cells and an inhibitory cell; this inhibitory cell is activated by the pooled activiti...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2007.19.7.1798

    authors: Morita K,Okada M,Aihara K

    update_date: 2007-07-01 00:00:00

  • Are loss functions all the same?

    abstract::In this letter, we investigate the impact of choosing different loss functions from the viewpoint of statistical learning theory. We introduce a convexity assumption, which is met by all loss functions commonly used in the literature, and study how the bound on the estimation error changes with the loss. We also deriv...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976604773135104

    authors: Rosasco L,De Vito E,Caponnetto A,Piana M,Verri A

    update_date: 2004-05-01 00:00:00

  • Scalable Semisupervised Functional Neurocartography Reveals Canonical Neurons in Behavioral Networks.

    abstract::Large-scale data collection efforts to map the brain are underway at multiple spatial and temporal scales, but all face fundamental problems posed by high-dimensional data and intersubject variability. Even seemingly simple problems, such as identifying a neuron/brain region across animals/subjects, become exponential...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00852

    authors: Frady EP,Kapoor A,Horvitz E,Kristan WB Jr

    update_date: 2016-08-01 00:00:00

  • Dynamic Neural Turing Machine with Continuous and Discrete Addressing Schemes.

    abstract::We extend the neural Turing machine (NTM) model into a dynamic neural Turing machine (D-NTM) by introducing trainable address vectors. This addressing scheme maintains for each memory cell two separate vectors, content and address vectors. This allows the D-NTM to learn a wide variety of location-based addressing stra...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01060

    authors: Gulcehre C,Chandar S,Cho K,Bengio Y

    update_date: 2018-04-01 00:00:00

  • Neuronal assembly dynamics in supervised and unsupervised learning scenarios.

    abstract::The dynamic formation of groups of neurons--neuronal assemblies--is believed to mediate cognitive phenomena at many levels, but their detailed operation and mechanisms of interaction are still to be uncovered. One hypothesis suggests that synchronized oscillations underpin their formation and functioning, with a focus...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00502

    authors: Moioli RC,Husbands P

    update_date: 2013-11-01 00:00:00

  • The Discriminative Kalman Filter for Bayesian Filtering with Nonlinear and Nongaussian Observation Models.

    abstract::The Kalman filter provides a simple and efficient algorithm to compute the posterior distribution for state-space models where both the latent state and measurement models are linear and gaussian. Extensions to the Kalman filter, including the extended and unscented Kalman filters, incorporate linearizations for model...

    journal_title:Neural computation

    pub_type: Letter

    doi:10.1162/neco_a_01275

    authors: Burkhart MC,Brandman DM,Franco B,Hochberg LR,Harrison MT

    update_date: 2020-05-01 00:00:00

  • Effects of fast presynaptic noise in attractor neural networks.

    abstract::We study both analytically and numerically the effect of presynaptic noise on the transmission of information in attractor neural networks. The noise occurs on a very short timescale compared to that for the neuron dynamics and it produces short-time synaptic depression. This is inspired in recent neurobiological find...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976606775623342

    authors: Cortes JM,Torres JJ,Marro J,Garrido PL,Kappen HJ

    update_date: 2006-03-01 00:00:00

  • Determining Burst Firing Time Distributions from Multiple Spike Trains.

    abstract::Recent experimental findings have shown the presence of robust and cell-type-specific intraburst firing patterns in bursting neurons. We address the problem of characterizing these patterns under the assumption that the bursts exhibit well-defined firing time distributions. We propose a method for estimating these dis...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2008.07-07-571

    authors: Lago-Fernández LF,Szücs A,Varona P

    update_date: 2009-04-01 00:00:00

  • Computing sparse representations of multidimensional signals using Kronecker bases.

    abstract::Recently there has been great interest in sparse representations of signals under the assumption that signals (data sets) can be well approximated by a linear combination of few elements of a known basis (dictionary). Many algorithms have been developed to find such representations for one-dimensional signals (vectors...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00385

    authors: Caiafa CF,Cichocki A

    update_date: 2013-01-01 00:00:00

  • Hidden Quantum Processes, Quantum Ion Channels, and 1/fθ-Type Noise.

    abstract::In this letter, we perform a complete and in-depth analysis of Lorentzian noises, such as those arising from [Formula: see text] and [Formula: see text] channel kinetics, in order to identify the source of [Formula: see text]-type noise in neurological membranes. We prove that the autocovariance of Lorentzian noise de...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_01067

    authors: Paris A,Vosoughi A,Berman SA,Atia G

    update_date: 2018-07-01 00:00:00

  • Similarity, connectionism, and the problem of representation in vision.

    abstract::A representational scheme under which the ranking between represented similarities is isomorphic to the ranking between the corresponding shape similarities can support perfectly correct shape classification because it preserves the clustering of shapes according to the natural kinds prevailing in the external world. ...

    journal_title:Neural computation

    pub_type: Journal Article, Review

    doi:10.1162/neco.1997.9.4.701

    authors: Edelman S,Duvdevani-Bar S

    update_date: 1997-05-15 00:00:00

  • ParceLiNGAM: a causal ordering method robust against latent confounders.

    abstract::We consider learning a causal ordering of variables in a linear nongaussian acyclic model called LiNGAM. Several methods have been shown to consistently estimate a causal ordering assuming that all the model assumptions are correct. But the estimation results could be distorted if some assumptions are violated. In thi...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00533

    authors: Tashiro T,Shimizu S,Hyvärinen A,Washio T

    update_date: 2014-01-01 00:00:00

  • Temporal coding: assembly formation through constructive interference.

    abstract::Temporal coding is studied for an oscillatory neural network model with synchronization and acceleration. The latter mechanism refers to increasing (decreasing) the phase velocity of each unit for stronger (weaker) or more coherent (decoherent) input from the other units. It has been demonstrated that acceleration gen...

    journal_title:Neural computation

    pub_type: Letter

    doi:10.1162/neco.2008.09-06-342

    authors: Burwick T

    update_date: 2008-07-01 00:00:00

  • Training nu-support vector classifiers: theory and algorithms.

    abstract::The nu-support vector machine (nu-SVM) for classification proposed by Schölkopf, Smola, Williamson, and Bartlett (2000) has the advantage of using a parameter nu on controlling the number of support vectors. In this article, we investigate the relation between nu-SVM and C-SVM in detail. We show that in general they a...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976601750399335

    authors: Chang CC,Lin CJ

    update_date: 2001-09-01 00:00:00

  • Temporal sequence learning, prediction, and control: a review of different models and their relation to biological mechanisms.

    abstract::In this review, we compare methods for temporal sequence learning (TSL) across the disciplines machine-control, classical conditioning, neuronal models for TSL as well as spike-timing-dependent plasticity (STDP). This review introduces the most influential models and focuses on two questions: To what degree are reward...

    journal_title:Neural computation

    pub_type: Journal Article, Review

    doi:10.1162/0899766053011555

    authors: Wörgötter F,Porr B

    update_date: 2005-02-01 00:00:00

  • Populations of tightly coupled neurons: the RGC/LGN system.

    abstract::A mathematical model, of general character for the dynamic description of coupled neural oscillators is presented. The population approach that is employed applies equally to coupled cells as to populations of such coupled cells. The formulation includes stochasticity and preserves details of precisely firing neurons....

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2007.03-07-482

    authors: Sirovich L

    update_date: 2008-05-01 00:00:00

  • The time-organized map algorithm: extending the self-organizing map to spatiotemporal signals.

    abstract::The new time-organized map (TOM) is presented for a better understanding of the self-organization and geometric structure of cortical signal representations. The algorithm extends the common self-organizing map (SOM) from the processing of purely spatial signals to the processing of spatiotemporal signals. The main ad...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976603765202695

    authors: Wiemer JC

    update_date: 2003-05-01 00:00:00

  • Derivatives of logarithmic stationary distributions for policy gradient reinforcement learning.

    abstract::Most conventional policy gradient reinforcement learning (PGRL) algorithms neglect (or do not explicitly make use of) a term in the average reward gradient with respect to the policy parameter. That term involves the derivative of the stationary state distribution that corresponds to the sensitivity of its distributio...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2009.12-08-922

    authors: Morimura T,Uchibe E,Yoshimoto J,Peters J,Doya K

    update_date: 2010-02-01 00:00:00

  • Estimation and marginalization using the Kikuchi approximation methods.

    abstract::In this letter, we examine a general method of approximation, known as the Kikuchi approximation method, for finding the marginals of a product distribution, as well as the corresponding partition function. The Kikuchi approximation method defines a certain constrained optimization problem, called the Kikuchi problem,...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/0899766054026693

    authors: Pakzad P,Anantharam V

    update_date: 2005-08-01 00:00:00

  • On the use of analytical expressions for the voltage distribution to analyze intracellular recordings.

    abstract::Different analytical expressions for the membrane potential distribution of membranes subject to synaptic noise have been proposed and can be very helpful in analyzing experimental data. However, all of these expressions are either approximations or limit cases, and it is not clear how they compare and which expressio...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2006.18.12.2917

    authors: Rudolph M,Destexhe A

    update_date: 2006-12-01 00:00:00

  • A simple Hebbian/anti-Hebbian network learns the sparse, independent components of natural images.

    abstract::Slightly modified versions of an early Hebbian/anti-Hebbian neural network are shown to be capable of extracting the sparse, independent linear components of a prefiltered natural image set. An explanation for this capability in terms of a coupling between two hypothetical networks is presented. The simple networks pr...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976606775093891

    authors: Falconbridge MS,Stamps RL,Badcock DR

    update_date: 2006-02-01 00:00:00