Estimating spiking irregularities under changing environments.


We considered a gamma distribution of interspike intervals as a statistical model for neuronal spike generation. The gamma distribution is a natural extension of the Poisson process that takes the effect of a refractory period into account. The model is specified by two parameters: a time-dependent firing rate and a shape parameter that characterizes the spiking irregularity of individual neurons. Because the environment changes over time, the observed data are generated from a model whose time-dependent firing rate is an unknown function. A statistical model containing an unknown function is called a semiparametric model and is, in general, very difficult to solve. We used a novel method of estimating functions in information geometry to estimate the shape parameter without estimating the unknown firing-rate function. We obtained the optimal estimating function for the shape parameter analytically, independent of the functional form of the firing rate. The resulting estimator is efficient, incurring no loss of Fisher information, and outperforms maximum likelihood estimation. We suggest a measure of spiking irregularity based on the estimating function, which may be useful for characterizing individual neurons in changing environments.
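The setup in the abstract can be illustrated with a small simulation: gamma-distributed interspike intervals whose mean changes over time, plus a rate-invariant estimate of the shape parameter. This is a minimal sketch, not the paper's optimal estimating function; it uses the elementary fact that, when the rate is locally constant, the ratio of an interval to the sum of two consecutive intervals follows a Beta(kappa, kappa) distribution, whose variance determines kappa independently of the firing rate.

```python
import numpy as np

rng = np.random.default_rng(0)
kappa = 3.0  # true shape parameter (degree of spiking regularity)

# Time-dependent firing rate: the mean interspike interval (ISI)
# changes across three segments, mimicking a changing environment.
mean_isi = np.repeat([0.02, 0.10, 0.05], 300)  # seconds
isis = rng.gamma(shape=kappa, scale=mean_isi / kappa)

# Rate-invariant shape estimate (illustrative only, not the paper's
# optimal estimating function): for locally constant rate, the ratio
# r = T_i / (T_i + T_{i+1}) follows a Beta(kappa, kappa) distribution
# with variance 1 / (4 * (2 * kappa + 1)), so kappa can be recovered
# from the sample variance of r regardless of the firing rate.
r = isis[:-1] / (isis[:-1] + isis[1:])
kappa_hat = (1.0 / (4.0 * r.var()) - 1.0) / 2.0
print(kappa_hat)  # should land near 3.0 despite the rate changes
```

Because the ratio statistic is scale-free, the estimate is insensitive to the unknown, time-varying firing rate as long as the rate varies slowly relative to single intervals; this is the same invariance property that motivates the semiparametric treatment in the article.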


Neural computation


Miura K, Okada M, Amari S




2006-10-01












  • Including long-range dependence in integrate-and-fire models of the high interspike-interval variability of cortical neurons.

    abstract::Many different types of integrate-and-fire models have been designed in order to explain how it is possible for a cortical neuron to integrate over many independent inputs while still producing highly variable spike trains. Within this context, the variability of spike trains has been almost exclusively measured using...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Jackson BS

    updated: 2004-10-01

  • Synchrony of neuronal oscillations controlled by GABAergic reversal potentials.

    abstract::GABAergic synapse reversal potential is controlled by the concentration of chloride. This concentration can change significantly during development and as a function of neuronal activity. Thus, GABA inhibition can be hyperpolarizing, shunting, or partially depolarizing. Previous results pinpointed the conditions under...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Jeong HY, Gutkin B

    updated: 2007-03-01

  • ISO learning approximates a solution to the inverse-controller problem in an unsupervised behavioral paradigm.

    abstract::In "Isotropic Sequence Order Learning" (pp. 831-864 in this issue), we introduced a novel algorithm for temporal sequence learning (ISO learning). Here, we embed this algorithm into a formal nonevaluating (teacher free) environment, which establishes a sensor-motor feedback. The system is initially guided by a fixed r...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Porr B, von Ferber C, Wörgötter F

    updated: 2003-04-01

  • Distributed control of uncertain systems using superpositions of linear operators.

    abstract::Control in the natural environment is difficult in part because of uncertainty in the effect of actions. Uncertainty can be due to added motor or sensory noise, unmodeled dynamics, or quantization of sensory feedback. Biological systems are faced with further difficulties, since control must be performed by networks o...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Sanger TD

    updated: 2011-08-01

  • Attractive periodic sets in discrete-time recurrent networks (with emphasis on fixed-point stability and bifurcations in two-neuron networks).

    abstract::We perform a detailed fixed-point analysis of two-unit recurrent neural networks with sigmoid-shaped transfer functions. Using geometrical arguments in the space of transfer function derivatives, we partition the network state-space into distinct regions corresponding to stability types of the fixed points. Unlike in ...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Tino P, Horne BG, Giles CL

    updated: 2001-06-01

  • Bayesian active learning of neural firing rate maps with transformed gaussian process priors.

    abstract::A firing rate map, also known as a tuning curve, describes the nonlinear relationship between a neuron's spike rate and a low-dimensional stimulus (e.g., orientation, head direction, contrast, color). Here we investigate Bayesian active learning methods for estimating firing rate maps in closed-loop neurophysiology ex...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Park M, Weller JP, Horwitz GD, Pillow JW

    updated: 2014-08-01

  • Why Does Large Batch Training Result in Poor Generalization? A Comprehensive Explanation and a Better Strategy from the Viewpoint of Stochastic Optimization.

    abstract::We present a comprehensive framework of search methods, such as simulated annealing and batch training, for solving nonconvex optimization problems. These methods search a wider range by gradually decreasing the randomness added to the standard gradient descent method. The formulation that we define on the basis of th...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Takase T, Oyama S, Kurihara M

    updated: 2018-07-01

  • Synaptic runaway in associative networks and the pathogenesis of schizophrenia.

    abstract::Synaptic runaway denotes the formation of erroneous synapses and premature functional decline accompanying activity-dependent learning in neural networks. This work studies synaptic runaway both analytically and numerically in binary-firing associative memory networks. It turns out that synaptic runaway is of fairly m...

    journal_title:Neural computation

    pub_type: Journal Article, Review

    authors: Greenstein-Messica A, Ruppin E

    updated: 1998-02-15

  • Parameter learning for alpha integration.

    abstract::In pattern recognition, data integration is an important issue, and when properly done, it can lead to improved performance. Also, data integration can be used to help model and understand multimodal processing in the brain. Amari proposed α-integration as a principled way of blending multiple positive measures (e.g.,...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Choi H, Choi S, Choe Y

    updated: 2013-06-01

  • Fast population coding.

    abstract::Uncertainty coming from the noise in its neurons and the ill-posed nature of many tasks plagues neural computations. Maybe surprisingly, many studies show that the brain manipulates these forms of uncertainty in a probabilistically consistent and normative manner, and there is now a rich theoretical literature on the ...

    journal_title:Neural computation

    pub_type: Letter

    authors: Huys QJ, Zemel RS, Natarajan R, Dayan P

    updated: 2007-02-01

  • Independent component analysis: A flexible nonlinearity and decorrelating manifold approach.

    abstract::Independent component analysis (ICA) finds a linear transformation to variables that are maximally statistically independent. We examine ICA and algorithms for finding the best transformation from the point of view of maximizing the likelihood of the data. In particular, we discuss the way in which scaling of the unmi...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Everson R, Roberts S

    updated: 1999-11-15

  • A modified algorithm for generalized discriminant analysis.

    abstract::Generalized discriminant analysis (GDA) is an extension of the classical linear discriminant analysis (LDA) from linear domain to a nonlinear domain via the kernel trick. However, in the previous algorithm of GDA, the solutions may suffer from the degenerate eigenvalue problem (i.e., several eigenvectors with the same...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Zheng W, Zhao L, Zou C

    updated: 2004-06-01

  • Propagating distributions up directed acyclic graphs.

    abstract::In a previous article, we considered game trees as graphical models. Adopting an evaluation function that returned a probability distribution over values likely to be taken at a given position, we described how to build a model of uncertainty and use it for utility-directed growth of the search tree and for deciding o...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Baum EB, Smith WD

    updated: 1999-01-01

  • Information recall using relative spike timing in a spiking neural network.

    abstract::We present a neural network that is capable of completing and correcting a spiking pattern given only a partial, noisy version. It operates in continuous time and represents information using the relative timing of individual spikes. The network is capable of correcting and recalling multiple patterns simultaneously. ...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Sterne P

    updated: 2012-08-01

  • Dissociable forms of repetition priming: a computational model.

    abstract::Nondeclarative memory and novelty processing in the brain is an actively studied field of neuroscience, and reducing neural activity with repetition of a stimulus (repetition suppression) is a commonly observed phenomenon. Recent findings of an opposite trend-specifically, rising activity for unfamiliar stimuli-questi...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Makukhin K, Bolland S

    updated: 2014-04-01

  • Adaptive Learning Algorithm Convergence in Passive and Reactive Environments.

    abstract::Although the number of artificial neural network and machine learning architectures is growing at an exponential pace, more attention needs to be paid to theoretical guarantees of asymptotic convergence for novel, nonlinear, high-dimensional adaptive learning algorithms. When properly understood, such guarantees can g...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Golden RM

    updated: 2018-10-01

  • Robustness of connectionist swimming controllers against random variation in neural connections.

    abstract::The ability to achieve high swimming speed and efficiency is very important to both the real lamprey and its robotic implementation. In previous studies, we used evolutionary algorithms to evolve biologically plausible connectionist swimming controllers for a simulated lamprey. This letter investigates the robustness ...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Or J

    updated: 2007-06-01

  • Invariant global motion recognition in the dorsal visual system: a unifying theory.

    abstract::The motion of an object (such as a wheel rotating) is seen as consistent independent of its position and size on the retina. Neurons in higher cortical visual areas respond to these global motion stimuli invariantly, but neurons in early cortical areas with small receptive fields cannot represent this motion, not only...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Rolls ET, Stringer SM

    updated: 2007-01-01

  • A causal perspective on the analysis of signal and noise correlations and their role in population coding.

    abstract::The role of correlations between neuronal responses is crucial to understanding the neural code. A framework used to study this role comprises a breakdown of the mutual information between stimuli and responses into terms that aim to account for different coding modalities and the distinction between different notions...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Chicharro D

    updated: 2014-06-01

  • The time-organized map algorithm: extending the self-organizing map to spatiotemporal signals.

    abstract::The new time-organized map (TOM) is presented for a better understanding of the self-organization and geometric structure of cortical signal representations. The algorithm extends the common self-organizing map (SOM) from the processing of purely spatial signals to the processing of spatiotemporal signals. The main ad...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Wiemer JC

    updated: 2003-05-01

  • Feature selection for ordinal text classification.

    abstract::Ordinal classification (also known as ordinal regression) is a supervised learning task that consists of estimating the rating of a data item on a fixed, discrete rating scale. This problem is receiving increased attention from the sentiment analysis and opinion mining community due to the importance of automatically ...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Baccianella S, Esuli A, Sebastiani F

    updated: 2014-03-01

  • Visual Categorization with Random Projection.

    abstract::Humans learn categories of complex objects quickly and from a few examples. Random projection has been suggested as a means to learn and categorize efficiently. We investigate how random projection affects categorization by humans and by very simple neural networks on the same stimuli and categorization tasks, and how...

    journal_title:Neural computation

    pub_type: Letter

    authors: Arriaga RI, Rutter D, Cakmak M, Vempala SS

    updated: 2015-10-01

  • Information loss in an optimal maximum likelihood decoding.

    abstract::The mutual information between a set of stimuli and the elicited neural responses is compared to the corresponding decoded information. The decoding procedure is presented as an artificial distortion of the joint probabilities between stimuli and responses. The information loss is quantified. Whenever the probabilitie...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Samengo I

    updated: 2002-04-01

  • Linking Neuromodulated Spike-Timing Dependent Plasticity with the Free-Energy Principle.

    abstract::The free-energy principle is a candidate unified theory for learning and memory in the brain that predicts that neurons, synapses, and neuromodulators work in a manner that minimizes free energy. However, electrophysiological data elucidating the neural and synaptic bases for this theory are lacking. Here, we propose ...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Isomura T, Sakai K, Kotani K, Jimbo Y

    updated: 2016-09-01

  • Nonlinear complex-valued extensions of Hebbian learning: an essay.

    abstract::The Hebbian paradigm is perhaps the best-known unsupervised learning theory in connectionism. It has inspired wide research activity in the artificial neural network field because it embodies some interesting properties such as locality and the capability of being applicable to the basic weight-and-sum structure of ne...

    journal_title:Neural computation

    pub_type: Journal Article, Review

    authors: Fiori S

    updated: 2005-04-01

  • On the problem in model selection of neural network regression in overrealizable scenario.

    abstract::In considering a statistical model selection of neural networks and radial basis functions under an overrealizable case, the problem of unidentifiability emerges. Because the model selection criterion is an unbiased estimator of the generalization error based on the training error, this article analyzes the expected t...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Hagiwara K

    updated: 2002-08-01

  • Positive Neural Networks in Discrete Time Implement Monotone-Regular Behaviors.

    abstract::We study the expressive power of positive neural networks. The model uses positive connection weights and multiple input neurons. Different behaviors can be expressed by varying the connection weights. We show that in discrete time and in the absence of noise, the class of positive neural networks captures the so-call...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Ameloot TJ, Van den Bussche J

    updated: 2015-12-01

  • Long-term reward prediction in TD models of the dopamine system.

    abstract::This article addresses the relationship between long-term reward predictions and slow-timescale neural activity in temporal difference (TD) models of the dopamine system. Such models attempt to explain how the activity of dopamine (DA) neurons relates to errors in the prediction of future rewards. Previous models have...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Daw ND, Touretzky DS

    updated: 2002-11-01

  • Random embedding machines for pattern recognition.

    abstract::Real classification problems involve structured data that can be essentially grouped into a relatively small number of clusters. It is shown that, under a local clustering condition, a set of points of a given class, embedded in binary space by a set of randomly parameterized surfaces, is linearly separable from other...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Baram Y

    updated: 2001-11-01

  • Constraint on the number of synaptic inputs to a visual cortical neuron controls receptive field formation.

    abstract::To date, Hebbian learning combined with some form of constraint on synaptic inputs has been demonstrated to describe well the development of neural networks. The previous models revealed mathematically the importance of synaptic constraints to reproduce orientation selectivity in the visual cortical neurons, but biolo...

    journal_title:Neural computation

    pub_type: Journal Article

    authors: Tanaka S, Miyashita M

    updated: 2009-09-01