Physiological gain leads to high ISI variability in a simple model of a cortical regular spiking cell.

Abstract:

To understand the interspike interval (ISI) variability displayed by visual cortical neurons (Softky & Koch, 1993), it is critical to examine the dynamics of their neuronal integration, as well as the variability in their synaptic input current. Most previous models have focused on the latter factor. We match a simple integrate-and-fire model to the experimentally measured integrative properties of cortical regular spiking cells (McCormick, Connors, Lighthall, & Prince, 1985). After setting RC parameters, the post-spike voltage reset is set to match experimental measurements of neuronal gain (obtained from in vitro plots of firing frequency versus injected current). Examination of the resulting model leads to an intuitive picture of neuronal integration that unifies the seemingly contradictory 1/√N and random walk pictures that have previously been proposed. When ISIs are dominated by postspike recovery, 1/√N arguments hold and spiking is regular; after the "memory" of the last spike becomes negligible, spike threshold crossing is caused by input variance around a steady state and spiking is Poisson. In integrate-and-fire neurons matched to cortical cell physiology, steady-state behavior is predominant, and ISIs are highly variable at all physiological firing rates and for a wide range of inhibitory and excitatory inputs.

journal_name: Neural Comput

journal_title: Neural computation

authors: Troyer TW,Miller KD

doi: 10.1162/neco.1997.9.5.971

subject: Has Abstract

pub_date: 1997-07-01 00:00:00

pages: 971-83

issue: 5

eissn: 1530-888X

issn: 0899-7667

journal_volume: 9

pub_type: Journal Article
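
The abstract's central claim, that an integrate-and-fire neuron whose post-spike reset is chosen to match measured gain spends most of each interval in a fluctuation-dominated steady state and therefore fires irregularly, can be illustrated with a few lines of simulation. The sketch below is a minimal current-based leaky integrate-and-fire model driven by summed Poisson excitatory and inhibitory inputs; the time constant, threshold, reset, PSP amplitudes, and input rates are illustrative assumptions rather than the paper's fitted regular-spiking-cell values, and the printed ISI coefficient of variation (CV) is only meant to show that highly variable spiking falls out of this regime.

import numpy as np

# Minimal leaky integrate-and-fire sketch with Poisson excitation/inhibition.
# All parameter values are illustrative placeholders, not the fitted values
# from Troyer & Miller (1997).
rng = np.random.default_rng(1)

dt       = 0.1e-3   # integration step (s)
tau_m    = 20e-3    # membrane time constant (s)
v_rest   = -74e-3   # resting potential (V)
v_thresh = -54e-3   # spike threshold (V)
v_reset  = -60e-3   # post-spike reset, placed close to threshold
a_exc    = 0.5e-3   # EPSP jump per input spike (V)
a_inh    = 0.5e-3   # IPSP jump per input spike (V)
rate_exc = 2800.0   # summed excitatory input rate (Hz)
rate_inh = 1000.0   # summed inhibitory input rate (Hz)
T        = 50.0     # simulated time (s)

v = v_rest
spike_times = []
for step in range(int(T / dt)):
    # Poisson counts of excitatory/inhibitory arrivals in this step
    n_e = rng.poisson(rate_exc * dt)
    n_i = rng.poisson(rate_inh * dt)
    # leaky integration plus instantaneous synaptic voltage jumps
    v += dt * (v_rest - v) / tau_m + a_exc * n_e - a_inh * n_i
    if v >= v_thresh:
        spike_times.append(step * dt)
        v = v_reset

isi = np.diff(spike_times)
print(f"rate = {len(spike_times) / T:.1f} Hz, "
      f"ISI CV = {isi.std() / isi.mean():.2f}")

With these assumed parameters the mean membrane potential settles a few millivolts below threshold, so spikes are triggered by input fluctuations rather than by a steady climb from reset, and the ISI CV comes out near 1; raising the excitatory rate until the mean drive crosses threshold shifts the model toward the regular, recovery-dominated regime described in the abstract.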
  • Modeling sensorimotor learning with linear dynamical systems.

    abstract::Recent studies have employed simple linear dynamical systems to model trial-by-trial dynamics in various sensorimotor learning tasks. Here we explore the theoretical and practical considerations that arise when employing the general class of linear dynamical systems (LDS) as a model for sensorimotor learning. In this ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976606775774651

    authors: Cheng S,Sabes PN

    update_date:2006-04-01 00:00:00

  • Hebbian learning of recurrent connections: a geometrical perspective.

    abstract::We show how a Hopfield network with modifiable recurrent connections undergoing slow Hebbian learning can extract the underlying geometry of an input space. First, we use a slow and fast analysis to derive an averaged system whose dynamics derives from an energy function and therefore always converges to equilibrium p...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00322

    authors: Galtier MN,Faugeras OD,Bressloff PC

    update_date:2012-09-01 00:00:00

  • Dynamic Neural Turing Machine with Continuous and Discrete Addressing Schemes.

    abstract::We extend the neural Turing machine (NTM) model into a dynamic neural Turing machine (D-NTM) by introducing trainable address vectors. This addressing scheme maintains for each memory cell two separate vectors, content and address vectors. This allows the D-NTM to learn a wide variety of location-based addressing stra...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01060

    authors: Gulcehre C,Chandar S,Cho K,Bengio Y

    update_date:2018-04-01 00:00:00

  • Characterization of minimum error linear coding with sensory and neural noise.

    abstract::Robust coding has been proposed as a solution to the problem of minimizing decoding error in the presence of neural noise. Many real-world problems, however, have degradation in the input signal, not just in neural representations. This generalized problem is more relevant to biological sensory coding where internal n...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00181

    authors: Doi E,Lewicki MS

    update_date:2011-10-01 00:00:00

  • Spiking neural P systems with astrocytes.

    abstract::In a biological nervous system, astrocytes play an important role in the functioning and interaction of neurons, and astrocytes have excitatory and inhibitory influence on synapses. In this work, with this biological inspiration, a class of computation devices that consist of neurons and astrocytes is introduced, call...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00238

    authors: Pan L,Wang J,Hoogeboom HJ

    update_date:2012-03-01 00:00:00

  • Optimality of Upper-Arm Reaching Trajectories Based on the Expected Value of the Metabolic Energy Cost.

    abstract::When we move our body to perform a movement task, our central nervous system selects a movement trajectory from an infinite number of possible trajectories under constraints that have been acquired through evolution and learning. Minimization of the energy cost has been suggested as a potential candidate for a constra...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00757

    authors: Taniai Y,Nishii J

    update_date:2015-08-01 00:00:00

  • Convergence of the IRWLS Procedure to the Support Vector Machine Solution.

    abstract::An iterative reweighted least squares (IRWLS) procedure recently proposed is shown to converge to the support vector machine solution. The convergence to a stationary point is ensured by modifying the original IRWLS procedure. ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/0899766052530875

    authors: Pérez-Cruz F,Bousoño-Calzón C,Artés-Rodríguez A

    update_date:2005-01-01 00:00:00

  • A Mean-Field Description of Bursting Dynamics in Spiking Neural Networks with Short-Term Adaptation.

    abstract::Bursting plays an important role in neural communication. At the population level, macroscopic bursting has been identified in populations of neurons that do not express intrinsic bursting mechanisms. For the analysis of phase transitions between bursting and non-bursting states, mean-field descriptions of macroscopic...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01300

    authors: Gast R,Schmidt H,Knösche TR

    update_date:2020-09-01 00:00:00

  • Parameter learning for alpha integration.

    abstract::In pattern recognition, data integration is an important issue, and when properly done, it can lead to improved performance. Also, data integration can be used to help model and understand multimodal processing in the brain. Amari proposed α-integration as a principled way of blending multiple positive measures (e.g.,...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00445

    authors: Choi H,Choi S,Choe Y

    update_date:2013-06-01 00:00:00

  • Computation in a single neuron: Hodgkin and Huxley revisited.

    abstract::A spiking neuron "computes" by transforming a complex dynamical input into a train of action potentials, or spikes. The computation performed by the neuron can be formulated as dimensional reduction, or feature detection, followed by a nonlinear decision function over the low-dimensional space. Generalizations of the ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/08997660360675017

    authors: Agüera y Arcas B,Fairhall AL,Bialek W

    update_date:2003-08-01 00:00:00

  • Fast recursive filters for simulating nonlinear dynamic systems.

    abstract::A fast and accurate computational scheme for simulating nonlinear dynamic systems is presented. The scheme assumes that the system can be represented by a combination of components of only two different types: first-order low-pass filters and static nonlinearities. The parameters of these filters and nonlinearities ma...

    journal_title:Neural computation

    pub_type: Letter

    doi:10.1162/neco.2008.04-07-506

    authors: van Hateren JH

    update_date:2008-07-01 00:00:00

  • A Novel Reconstruction Framework for Time-Encoded Signals with Integrate-and-Fire Neurons.

    abstract::Integrate-and-fire neurons are time encoding machines that convert the amplitude of an analog signal into a nonuniform, strictly increasing sequence of spike times. Under certain conditions, the encoded signals can be reconstructed from the nonuniform spike time sequences using a time decoding machine. Time encoding a...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00764

    authors: Florescu D,Coca D

    update_date:2015-09-01 00:00:00

  • Range-based ICA using a nonsmooth quasi-newton optimizer for electroencephalographic source localization in focal epilepsy.

    abstract::Independent component analysis (ICA) aims at separating a multivariate signal into independent nongaussian signals by optimizing a contrast function with no knowledge on the mixing mechanism. Despite the availability of a constellation of contrast functions, a Hartley-entropy-based ICA contrast endowed with the discri...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00700

    authors: Selvan SE,George ST,Balakrishnan R

    update_date:2015-03-01 00:00:00

  • Reinforcement learning in continuous time and space.

    abstract::This article presents a reinforcement learning framework for continuous-time dynamical systems without a priori discretization of time, state, and action. Based on the Hamilton-Jacobi-Bellman (HJB) equation for infinite-horizon, discounted reward problems, we derive algorithms for estimating value functions and improv...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976600300015961

    authors: Doya K

    update_date:2000-01-01 00:00:00

  • Delay Differential Analysis of Seizures in Multichannel Electrocorticography Data.

    abstract::High-density electrocorticogram (ECoG) electrodes are capable of recording neurophysiological data with high temporal resolution with wide spatial coverage. These recordings are a window to understanding how the human brain processes information and subsequently behaves in healthy and pathologic states. Here, we descr...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01009

    authors: Lainscsek C,Weyhenmeyer J,Cash SS,Sejnowski TJ

    update_date:2017-12-01 00:00:00

  • Analysis of cluttered scenes using an elastic matching approach for stereo images.

    abstract::We present a system for the automatic interpretation of cluttered scenes containing multiple partly occluded objects in front of unknown, complex backgrounds. The system is based on an extended elastic graph matching algorithm that allows the explicit modeling of partial occlusions. Our approach extends an earlier sys...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2006.18.6.1441

    authors: Eckes C,Triesch J,von der Malsburg C

    update_date:2006-06-01 00:00:00

  • On the Convergence of the LMS Algorithm with Adaptive Learning Rate for Linear Feedforward Networks.

    abstract::We consider the problem of training a linear feedforward neural network by using a gradient descent-like LMS learning algorithm. The objective is to find a weight matrix for the network, by repeatedly presenting to it a finite set of examples, so that the sum of the squares of the errors is minimized. Kohonen showed t...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.1991.3.2.226

    authors: Luo ZQ

    update_date:1991-07-01 00:00:00

  • Random embedding machines for pattern recognition.

    abstract::Real classification problems involve structured data that can be essentially grouped into a relatively small number of clusters. It is shown that, under a local clustering condition, a set of points of a given class, embedded in binary space by a set of randomly parameterized surfaces, is linearly separable from other...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976601753196012

    authors: Baram Y

    update_date:2001-11-01 00:00:00

  • Bias/Variance Decompositions for Likelihood-Based Estimators.

    abstract::The bias/variance decomposition of mean-squared error is well understood and relatively straightforward. In this note, a similar simple decomposition is derived, valid for any kind of error measure that, when using the appropriate probability model, can be derived from a Kullback-Leibler divergence or log-likelihood. ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976698300017232

    authors: Heskes T

    update_date:1998-07-28 00:00:00

  • Gaussian process approach to spiking neurons for inhomogeneous Poisson inputs.

    abstract::This article presents a new theoretical framework to consider the dynamics of a stochastic spiking neuron model with general membrane response to input spike. We assume that the input spikes obey an inhomogeneous Poisson process. The stochastic process of the membrane potential then becomes a gaussian process. When a ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976601317098529

    authors: Amemori KI,Ishii S

    update_date:2001-12-01 00:00:00

  • On the problem in model selection of neural network regression in overrealizable scenario.

    abstract::In considering a statistical model selection of neural networks and radial basis functions under an overrealizable case, the problem of unidentifiability emerges. Because the model selection criterion is an unbiased estimator of the generalization error based on the training error, this article analyzes the expected t...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976602760128090

    authors: Hagiwara K

    update_date:2002-08-01 00:00:00

  • Fast population coding.

    abstract::Uncertainty coming from the noise in its neurons and the ill-posed nature of many tasks plagues neural computations. Maybe surprisingly, many studies show that the brain manipulates these forms of uncertainty in a probabilistically consistent and normative manner, and there is now a rich theoretical literature on the ...

    journal_title:Neural computation

    pub_type: Letter

    doi:10.1162/neco.2007.19.2.404

    authors: Huys QJ,Zemel RS,Natarajan R,Dayan P

    update_date:2007-02-01 00:00:00

  • Invariant global motion recognition in the dorsal visual system: a unifying theory.

    abstract::The motion of an object (such as a wheel rotating) is seen as consistent independent of its position and size on the retina. Neurons in higher cortical visual areas respond to these global motion stimuli invariantly, but neurons in early cortical areas with small receptive fields cannot represent this motion, not only...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2007.19.1.139

    authors: Rolls ET,Stringer SM

    update_date:2007-01-01 00:00:00

  • A hierarchical dynamical map as a basic frame for cortical mapping and its application to priming.

    abstract::A hierarchical dynamical map is proposed as the basic framework for sensory cortical mapping. To show how the hierarchical dynamical map works in cognitive processes, we applied it to a typical cognitive task known as priming, in which cognitive performance is facilitated as a consequence of prior experience. Prior to...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/08997660152469341

    authors: Hoshino O,Inoue S,Kashimori Y,Kambara T

    update_date:2001-08-01 00:00:00

  • Scalable hybrid computation with spikes.

    abstract::We outline a hybrid analog-digital scheme for computing with three important features that enable it to scale to systems of large complexity: First, like digital computation, which uses several one-bit precise logical units to collectively compute a precise answer to a computation, the hybrid scheme uses several moder...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976602320263971

    authors: Sarpeshkar R,O'Halloran M

    update_date:2002-09-01 00:00:00

  • Learning Hough transform: a neural network model.

    abstract::A single-layered Hough transform network is proposed that accepts image coordinates of each object pixel as input and produces a set of outputs that indicate the belongingness of the pixel to a particular structure (e.g., a straight line). The network is able to learn adaptively the parametric forms of the linear segm...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976601300014501

    authors: Basak J

    update_date:2001-03-01 00:00:00

  • Hidden Quantum Processes, Quantum Ion Channels, and 1/ fθ-Type Noise.

    abstract::In this letter, we perform a complete and in-depth analysis of Lorentzian noises, such as those arising from [Formula: see text] and [Formula: see text] channel kinetics, in order to identify the source of [Formula: see text]-type noise in neurological membranes. We prove that the autocovariance of Lorentzian noise de...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_01067

    authors: Paris A,Vosoughi A,Berman SA,Atia G

    update_date:2018-07-01 00:00:00

  • When Not to Classify: Anomaly Detection of Attacks (ADA) on DNN Classifiers at Test Time.

    abstract::A significant threat to the recent, wide deployment of machine learning-based systems, including deep neural networks (DNNs), is adversarial learning attacks. The main focus here is on evasion attacks against DNN-based classifiers at test time. While much work has focused on devising attacks that make small perturbati...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01209

    authors: Miller D,Wang Y,Kesidis G

    update_date:2019-08-01 00:00:00

  • Synchrony of neuronal oscillations controlled by GABAergic reversal potentials.

    abstract::GABAergic synapse reversal potential is controlled by the concentration of chloride. This concentration can change significantly during development and as a function of neuronal activity. Thus, GABA inhibition can be hyperpolarizing, shunting, or partially depolarizing. Previous results pinpointed the conditions under...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2007.19.3.706

    authors: Jeong HY,Gutkin B

    update_date:2007-03-01 00:00:00

  • Robustness of connectionist swimming controllers against random variation in neural connections.

    abstract::The ability to achieve high swimming speed and efficiency is very important to both the real lamprey and its robotic implementation. In previous studies, we used evolutionary algorithms to evolve biologically plausible connectionist swimming controllers for a simulated lamprey. This letter investigates the robustness ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2007.19.6.1568

    authors: Or J

    update_date:2007-06-01 00:00:00