Fast recursive filters for simulating nonlinear dynamic systems.

Abstract:

A fast and accurate computational scheme for simulating nonlinear dynamic systems is presented. The scheme assumes that the system can be represented by a combination of components of only two different types: first-order low-pass filters and static nonlinearities. The parameters of these filters and nonlinearities may depend on system variables, and the topology of the system may be complex, including feedback. Several examples taken from neuroscience are given: phototransduction, photopigment bleaching, and spike generation according to the Hodgkin-Huxley equations. The scheme uses two slightly different forms of autoregressive filters, with an implicit delay of zero for feedforward control and an implicit delay of half a sample distance for feedback control. On a fairly complex model of the macaque retinal horizontal cell, it computes, for a given level of accuracy, one to two orders of magnitude faster than the fourth-order Runge-Kutta method. The computational scheme has minimal memory requirements and is also suited for computation on a stream processor, such as a graphics processing unit.
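As a rough illustration of the two component types the scheme is built from, the Python sketch below cascades a first-order low-pass filter in recursive (autoregressive) form with a static nonlinearity. It is a minimal example under assumed update rules (coefficient a = exp(-dt/tau) and a tanh nonlinearity are illustrative choices), not the paper's reference implementation, and it omits the feedforward/feedback half-sample delay distinction discussed in the abstract.

```python
import numpy as np

def lowpass_step(y_prev, x_n, tau, dt):
    """One recursive update of a first-order low-pass filter with time
    constant tau, sampled at interval dt: y[n] = a*y[n-1] + (1-a)*x[n]."""
    a = np.exp(-dt / tau)
    return a * y_prev + (1.0 - a) * x_n

def simulate(x, tau, dt, nonlinearity=np.tanh):
    """Cascade a low-pass filter with a static nonlinearity over an input
    signal x; in the general scheme tau may itself depend on system variables."""
    y = np.empty(len(x), dtype=float)
    y_prev = 0.0
    for n, x_n in enumerate(x):
        y_prev = lowpass_step(y_prev, x_n, tau, dt)
        y[n] = nonlinearity(y_prev)
    return y

if __name__ == "__main__":
    dt, tau = 1e-3, 10e-3                               # 1 ms sample distance, 10 ms time constant
    x = np.concatenate([np.zeros(100), np.ones(200)])   # step input
    print(simulate(x, tau, dt)[-3:])
```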

journal_name: Neural Comput
journal_title: Neural computation
authors: van Hateren JH
doi: 10.1162/neco.2008.04-07-506
subject: Has Abstract
pub_date: 2008-07-01 00:00:00
pages: 1821-46
issue: 7
eissn: 1530-888X
issn: 0899-7667
journal_volume: 20
pub_type: Letter
  • The effects of input rate and synchrony on a coincidence detector: analytical solution.

    abstract::We derive analytically the solution for the output rate of the ideal coincidence detector. The solution is for an arbitrary number of input spike trains with identical binomial count distributions (which includes Poisson statistics as a special case) and identical arbitrary pairwise cross-correlations, from zero corre...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976603321192068

    authors: Mikula S,Niebur E

    Update date: 2003-03-01 00:00:00

  • Dependence of neuronal correlations on filter characteristics and marginal spike train statistics.

    abstract::Correlated neural activity has been observed at various signal levels (e.g., spike count, membrane potential, local field potential, EEG, fMRI BOLD). Most of these signals can be considered as superpositions of spike trains filtered by components of the neural system (synapses, membranes) and the measurement process. ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2008.05-07-525

    authors: Tetzlaff T,Rotter S,Stark E,Abeles M,Aertsen A,Diesmann M

    Update date: 2008-09-01 00:00:00

  • Whence the Expected Free Energy?

    abstract::The expected free energy (EFE) is a central quantity in the theory of active inference. It is the quantity that all active inference agents are mandated to minimize through action, and its decomposition into extrinsic and intrinsic value terms is key to the balance of exploration and exploitation that active inference...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01354

    authors: Millidge B,Tschantz A,Buckley CL

    Update date: 2021-01-05 00:00:00

  • Long-term reward prediction in TD models of the dopamine system.

    abstract::This article addresses the relationship between long-term reward predictions and slow-timescale neural activity in temporal difference (TD) models of the dopamine system. Such models attempt to explain how the activity of dopamine (DA) neurons relates to errors in the prediction of future rewards. Previous models have...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976602760407973

    authors: Daw ND,Touretzky DS

    Update date: 2002-11-01 00:00:00

  • A unifying view of Wiener and Volterra theory and polynomial kernel regression.

    abstract::Volterra and Wiener series are perhaps the best-understood nonlinear system representations in signal processing. Although both approaches have enjoyed a certain popularity in the past, their application has been limited to rather low-dimensional and weakly nonlinear systems due to the exponential growth of the number...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2006.18.12.3097

    authors: Franz MO,Schölkopf B

    Update date: 2006-12-01 00:00:00

  • Visual Categorization with Random Projection.

    abstract::Humans learn categories of complex objects quickly and from a few examples. Random projection has been suggested as a means to learn and categorize efficiently. We investigate how random projection affects categorization by humans and by very simple neural networks on the same stimuli and categorization tasks, and how...

    journal_title:Neural computation

    pub_type: Letter

    doi:10.1162/NECO_a_00769

    authors: Arriaga RI,Rutter D,Cakmak M,Vempala SS

    Update date: 2015-10-01 00:00:00

  • The time-organized map algorithm: extending the self-organizing map to spatiotemporal signals.

    abstract::The new time-organized map (TOM) is presented for a better understanding of the self-organization and geometric structure of cortical signal representations. The algorithm extends the common self-organizing map (SOM) from the processing of purely spatial signals to the processing of spatiotemporal signals. The main ad...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976603765202695

    authors: Wiemer JC

    Update date: 2003-05-01 00:00:00

  • Nonlinear Time-Series Prediction with Missing and Noisy Data.

    abstract::We derive solutions for the problem of missing and noisy data in nonlinear time-series prediction from a probabilistic point of view. We discuss different approximations to the solutions - in particular, approximations that require either stochastic simulation or the substitution of a single estimate for t...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976698300017728

    authors: Tresp V,Hofmann R

    Update date: 1998-03-23 00:00:00

  • A modified algorithm for generalized discriminant analysis.

    abstract::Generalized discriminant analysis (GDA) is an extension of the classical linear discriminant analysis (LDA) from linear domain to a nonlinear domain via the kernel trick. However, in the previous algorithm of GDA, the solutions may suffer from the degenerate eigenvalue problem (i.e., several eigenvectors with the same...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976604773717612

    authors: Zheng W,Zhao L,Zou C

    Update date: 2004-06-01 00:00:00

  • Convergence of the IRWLS Procedure to the Support Vector Machine Solution.

    abstract::An iterative reweighted least squares (IRWLS) procedure recently proposed is shown to converge to the support vector machine solution. The convergence to a stationary point is ensured by modifying the original IRWLS procedure. ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/0899766052530875

    authors: Pérez-Cruz F,Bousoño-Calzón C,Artés-Rodríguez A

    Update date: 2005-01-01 00:00:00

  • Training nu-support vector classifiers: theory and algorithms.

    abstract::The nu-support vector machine (nu-SVM) for classification proposed by Schölkopf, Smola, Williamson, and Bartlett (2000) has the advantage of using a parameter nu on controlling the number of support vectors. In this article, we investigate the relation between nu-SVM and C-SVM in detail. We show that in general they a...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976601750399335

    authors: Chang CC,Lin CJ

    Update date: 2001-09-01 00:00:00

  • Similarity, connectionism, and the problem of representation in vision.

    abstract::A representational scheme under which the ranking between represented similarities is isomorphic to the ranking between the corresponding shape similarities can support perfectly correct shape classification because it preserves the clustering of shapes according to the natural kinds prevailing in the external world. ...

    journal_title:Neural computation

    pub_type: Journal Article, Review

    doi:10.1162/neco.1997.9.4.701

    authors: Edelman S,Duvdevani-Bar S

    Update date: 1997-05-15 00:00:00

  • Online adaptive decision trees.

    abstract::Decision trees and neural networks are widely used tools for pattern classification. Decision trees provide highly localized representation, whereas neural networks provide a distributed but compact representation of the decision space. Decision trees cannot be induced in the online mode, and they are not adaptive to ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/0899766041336396

    authors: Basak J

    Update date: 2004-09-01 00:00:00

  • Normalization enables robust validation of disparity estimates from neural populations.

    abstract::Binocular fusion takes place over a limited region smaller than one degree of visual angle (Panum's fusional area), which is on the order of the range of preferred disparities measured in populations of disparity-tuned neurons in the visual cortex. However, the actual range of binocular disparities encountered in natu...

    journal_title:Neural computation

    pub_type: Letter

    doi:10.1162/neco.2008.05-07-532

    authors: Tsang EK,Shi BE

    Update date: 2008-10-01 00:00:00

  • Statistical computer model analysis of the reciprocal and recurrent inhibitions of the Ia-EPSP in α-motoneurons.

    abstract::We simulate the inhibition of Ia-glutamatergic excitatory postsynaptic potential (EPSP) by preceding it with glycinergic recurrent (REN) and reciprocal (REC) inhibitory postsynaptic potentials (IPSPs). The inhibition is evaluated in the presence of voltage-dependent conductances of sodium, delayed rectifier potassium,...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00375

    authors: Gradwohl G,Grossman Y

    Update date: 2013-01-01 00:00:00

  • Solving stereo transparency with an extended coarse-to-fine disparity energy model.

    abstract::Modeling stereo transparency with physiologically plausible mechanisms is challenging because in such frameworks, large receptive fields mix up overlapping disparities, whereas small receptive fields can reliably compute only small disparities. It seems necessary to combine information across scales. A coarse-to-fine ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00722

    authors: Li Z,Qian N

    Update date: 2015-05-01 00:00:00

  • Generalization and selection of examples in feedforward neural networks.

    abstract::In this work, we study how the selection of examples affects the learning procedure in a boolean neural network and its relationship with the complexity of the function under study and its architecture. We analyze the generalization capacity for different target functions with particular architectures through an analy...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976600300014999

    authors: Franco L,Cannas SA

    Update date: 2000-10-01 00:00:00

  • Extraction of Synaptic Input Properties in Vivo.

    abstract::Knowledge of synaptic input is crucial for understanding synaptic integration and ultimately neural function. However, in vivo, the rates at which synaptic inputs arrive are high, so that it is typically impossible to detect single events. We show here that it is nevertheless possible to extract the properties of the ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00975

    authors: Puggioni P,Jelitai M,Duguid I,van Rossum MCW

    Update date: 2017-07-01 00:00:00

  • STDP-Compatible Approximation of Backpropagation in an Energy-Based Model.

    abstract::We show that Langevin Markov chain Monte Carlo inference in an energy-based model with latent variables has the property that the early steps of inference, starting from a stationary point, correspond to propagating error gradients into internal layers, similar to backpropagation. The backpropagated error is with resp...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00934

    authors: Bengio Y,Mesnard T,Fischer A,Zhang S,Wu Y

    Update date: 2017-03-01 00:00:00

  • A neural-network-based approach to the double traveling salesman problem.

    abstract::The double traveling salesman problem is a variation of the basic traveling salesman problem where targets can be reached by two salespersons operating in parallel. The real problem addressed by this work concerns the optimization of the harvest sequence for the two independent arms of a fruit-harvesting robot. This a...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/08997660252741194

    authors: Plebe A,Anile AM

    Update date: 2002-02-01 00:00:00

  • Online Reinforcement Learning Using a Probability Density Estimation.

    abstract::Function approximation in online, incremental, reinforcement learning needs to deal with two fundamental problems: biased sampling and nonstationarity. In this kind of task, biased sampling occurs because samples are obtained from specific trajectories dictated by the dynamics of the environment and are usually concen...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00906

    authors: Agostini A,Celaya E

    Update date: 2017-01-01 00:00:00

  • Including long-range dependence in integrate-and-fire models of the high interspike-interval variability of cortical neurons.

    abstract::Many different types of integrate-and-fire models have been designed in order to explain how it is possible for a cortical neuron to integrate over many independent inputs while still producing highly variable spike trains. Within this context, the variability of spike trains has been almost exclusively measured using...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/0899766041732413

    authors: Jackson BS

    Update date: 2004-10-01 00:00:00

  • McCulloch-Pitts Brains and Pseudorandom Functions.

    abstract::In a pioneering classic, Warren McCulloch and Walter Pitts proposed a model of the central nervous system. Motivated by EEG recordings of normal brain activity, Chvátal and Goldsmith asked whether these dynamical systems can be engineered to produce trajectories that are irregular, disorderly, and apparently unpredict...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00841

    authors: Chvátal V,Goldsmith M,Yang N

    Update date: 2016-06-01 00:00:00

  • Robust Closed-Loop Control of a Cursor in a Person with Tetraplegia using Gaussian Process Regression.

    abstract::Intracortical brain computer interfaces can enable individuals with paralysis to control external devices through voluntarily modulated brain activity. Decoding quality has been previously shown to degrade with signal nonstationarities-specifically, the changes in the statistics of the data between training and testin...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01129

    authors: Brandman DM,Burkhart MC,Kelemen J,Franco B,Harrison MT,Hochberg LR

    Update date: 2018-11-01 00:00:00

  • Making the error-controlling algorithm of observable operator models constructive.

    abstract::Observable operator models (OOMs) are a class of models for stochastic processes that properly subsumes the class that can be modeled by finite-dimensional hidden Markov models (HMMs). One of the main advantages of OOMs over HMMs is that they admit asymptotically correct learning algorithms. A series of learning algor...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2009.10-08-878

    authors: Zhao MJ,Jaeger H,Thon M

    Update date: 2009-12-01 00:00:00

  • Local and global gating of synaptic plasticity.

    abstract::Mechanisms influencing learning in neural networks are usually investigated on either a local or a global scale. The former relates to synaptic processes, the latter to unspecific modulatory systems. Here we study the interaction of a local learning rule that evaluates coincidences of pre- and postsynaptic action pote...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976600300015682

    authors: Sánchez-Montañés MA,Verschure PF,König P

    Update date: 2000-03-01 00:00:00

  • A Mean-Field Description of Bursting Dynamics in Spiking Neural Networks with Short-Term Adaptation.

    abstract::Bursting plays an important role in neural communication. At the population level, macroscopic bursting has been identified in populations of neurons that do not express intrinsic bursting mechanisms. For the analysis of phase transitions between bursting and non-bursting states, mean-field descriptions of macroscopic...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01300

    authors: Gast R,Schmidt H,Knösche TR

    Update date: 2020-09-01 00:00:00

  • Feature selection for ordinal text classification.

    abstract::Ordinal classification (also known as ordinal regression) is a supervised learning task that consists of estimating the rating of a data item on a fixed, discrete rating scale. This problem is receiving increased attention from the sentiment analysis and opinion mining community due to the importance of automatically ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00558

    authors: Baccianella S,Esuli A,Sebastiani F

    Update date: 2014-03-01 00:00:00

  • Bias/Variance Decompositions for Likelihood-Based Estimators.

    abstract::The bias/variance decomposition of mean-squared error is well understood and relatively straightforward. In this note, a similar simple decomposition is derived, valid for any kind of error measure that, when using the appropriate probability model, can be derived from a Kullback-Leibler divergence or log-likelihood. ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976698300017232

    authors: Heskes T

    Update date: 1998-07-28 00:00:00

  • Selectivity and stability via dendritic nonlinearity.

    abstract::Inspired by recent studies regarding dendritic computation, we constructed a recurrent neural network model incorporating dendritic lateral inhibition. Our model consists of an input layer and a neuron layer that includes excitatory cells and an inhibitory cell; this inhibitory cell is activated by the pooled activiti...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2007.19.7.1798

    authors: Morita K,Okada M,Aihara K

    Update date: 2007-07-01 00:00:00