abstract::We consider the effect of the effective timing of a delayed feedback on the excitatory neuron in a recurrent inhibitory loop, when the biological realities of firing and the absolute refractory period are incorporated into a phenomenological spiking linear or quadratic integrate-and-fire neuron model. We show that such models can generate a large number of asymptotically stable periodic solutions with predictable patterns of oscillation. We observe that the number of fixed points of the so-called phase resetting map coincides with the number of distinct periods of all stable periodic solutions rather than with the number of stable patterns. We demonstrate how configurational information corresponding to these distinct periods can be used to calculate and predict the number of stable patterns.
journal_name: Neural Comput
journal_title: Neural computation
authors: Ma J, Wu J
pub_type: Journal Article
doi:10.1162/neco.2007.19.8.2124
pub_date: 2007-08-01 00:00:00
pages: 2124-48
issue: 8
journal_volume: 19
issn: 0899-7667
eissn: 1530-888X
abstract::We simulate the inhibition of Ia-glutamatergic excitatory postsynaptic potential (EPSP) by preceding it with glycinergic recurrent (REN) and reciprocal (REC) inhibitory postsynaptic potentials (IPSPs). The inhibition is evaluated in the presence of voltage-dependent conductances of sodium, delayed rectifier potassium,...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00375
Updated: 2013-01-01 00:00:00
abstract::Neuromorphic engineering combines the architectural and computational principles of systems neuroscience with semiconductor electronics, with the aim of building efficient and compact devices that mimic the synaptic and neural machinery of the brain. The energy consumptions promised by neuromorphic engineering are ext...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00882
Updated: 2016-10-01 00:00:00
abstract::The perceptron (also referred to as McCulloch-Pitts neuron, or linear threshold gate) is commonly used as a simplified model for the discrimination and learning capability of a biological neuron. Criteria that tell us when a perceptron can implement (or learn to implement) all possible dichotomies over a given set of ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2008.20.1.288
Updated: 2008-01-01 00:00:00
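The perceptron abstract above treats the neuron as a linear threshold gate. As a minimal illustrative sketch (the weights, bias, and the AND dichotomy below are assumptions for illustration, not the article's construction):

```python
# Linear threshold gate (perceptron): outputs 1 iff the weighted input
# sum plus bias is positive. Weights below are illustrative only.

def threshold_gate(weights, bias, x):
    """Return 1 if sum_i w_i * x_i + bias > 0, else 0."""
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s > 0 else 0

# One dichotomy a two-input gate can implement: logical AND.
def and_gate(x):
    return threshold_gate([1.0, 1.0], -1.5, x)
```

A dichotomy over a point set is implementable by such a gate exactly when the two classes are linearly separable, which is the kind of criterion the abstract refers to.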
abstract::In "Isotropic Sequence Order Learning" (pp. 831-864 in this issue), we introduced a novel algorithm for temporal sequence learning (ISO learning). Here, we embed this algorithm into a formal nonevaluating (teacher free) environment, which establishes a sensor-motor feedback. The system is initially guided by a fixed r...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/08997660360581930
Updated: 2003-04-01 00:00:00
abstract::In a pioneering classic, Warren McCulloch and Walter Pitts proposed a model of the central nervous system. Motivated by EEG recordings of normal brain activity, Chvátal and Goldsmith asked whether these dynamical systems can be engineered to produce trajectories that are irregular, disorderly, and apparently unpredict...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00841
Updated: 2016-06-01 00:00:00
abstract::The statistical dependencies that independent component analysis (ICA) cannot remove often provide rich information beyond the linear independent components. It would thus be very useful to estimate the dependency structure from data. While such models have been proposed, they have usually concentrated on higher-order...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco_a_01006
Updated: 2017-11-01 00:00:00
abstract::The motion of an object (such as a wheel rotating) is seen as consistent independent of its position and size on the retina. Neurons in higher cortical visual areas respond to these global motion stimuli invariantly, but neurons in early cortical areas with small receptive fields cannot represent this motion, not only...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2007.19.1.139
Updated: 2007-01-01 00:00:00
abstract::We show how a Hopfield network with modifiable recurrent connections undergoing slow Hebbian learning can extract the underlying geometry of an input space. First, we use a slow and fast analysis to derive an averaged system whose dynamics derives from an energy function and therefore always converges to equilibrium p...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00322
Updated: 2012-09-01 00:00:00
abstract::Correlated neural activity has been observed at various signal levels (e.g., spike count, membrane potential, local field potential, EEG, fMRI BOLD). Most of these signals can be considered as superpositions of spike trains filtered by components of the neural system (synapses, membranes) and the measurement process. ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2008.05-07-525
Updated: 2008-09-01 00:00:00
abstract::The Hebbian paradigm is perhaps the best-known unsupervised learning theory in connectionism. It has inspired wide research activity in the artificial neural network field because it embodies some interesting properties such as locality and the capability of being applicable to the basic weight-and-sum structure of ne...
journal_title:Neural computation
pub_type: Journal Article, Review
doi:10.1162/0899766053429381
Updated: 2005-04-01 00:00:00
abstract::The new time-organized map (TOM) is presented for a better understanding of the self-organization and geometric structure of cortical signal representations. The algorithm extends the common self-organizing map (SOM) from the processing of purely spatial signals to the processing of spatiotemporal signals. The main ad...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976603765202695
Updated: 2003-05-01 00:00:00
abstract::Humans learn categories of complex objects quickly and from a few examples. Random projection has been suggested as a means to learn and categorize efficiently. We investigate how random projection affects categorization by humans and by very simple neural networks on the same stimuli and categorization tasks, and how...
journal_title:Neural computation
pub_type: Letter
doi:10.1162/NECO_a_00769
Updated: 2015-10-01 00:00:00
abstract::The expected free energy (EFE) is a central quantity in the theory of active inference. It is the quantity that all active inference agents are mandated to minimize through action, and its decomposition into extrinsic and intrinsic value terms is key to the balance of exploration and exploitation that active inference...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco_a_01354
Updated: 2021-01-05 00:00:00
abstract::We outline a hybrid analog-digital scheme for computing with three important features that enable it to scale to systems of large complexity: First, like digital computation, which uses several one-bit precise logical units to collectively compute a precise answer to a computation, the hybrid scheme uses several moder...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976602320263971
Updated: 2002-09-01 00:00:00
abstract::Activities of sensory-specific cortices are known to be suppressed when presented with a different sensory modality stimulus. This is referred to as cross-modal inhibition, for which the conventional synaptic mechanism is unlikely to work. Interestingly, the cross-modal inhibition could be eliminated when presented wi...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00356
Updated: 2012-11-01 00:00:00
abstract::Modeling stereo transparency with physiologically plausible mechanisms is challenging because in such frameworks, large receptive fields mix up overlapping disparities, whereas small receptive fields can reliably compute only small disparities. It seems necessary to combine information across scales. A coarse-to-fine ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00722
Updated: 2015-05-01 00:00:00
abstract::We present an integrative formalism of mutual information expansion, the general Poisson exact breakdown, which explicitly evaluates the informational contribution of correlations in the spike counts both between and within neurons. The formalism was validated on simulated data and applied to real neurons recorded fro...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2010.04-09-989
Updated: 2010-06-01 00:00:00
abstract::This article presents a reinforcement learning framework for continuous-time dynamical systems without a priori discretization of time, state, and action. Based on the Hamilton-Jacobi-Bellman (HJB) equation for infinite-horizon, discounted reward problems, we derive algorithms for estimating value functions and improv...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976600300015961
Updated: 2000-01-01 00:00:00
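For reference, the Hamilton-Jacobi-Bellman equation for the infinite-horizon discounted problem named in the abstract above is commonly written as follows (a standard textbook form with discount time constant \(\tau\), system dynamics \(\dot{x} = f(x, u)\), and reward \(r(x, u)\); the notation is generic, not necessarily the article's own):

```latex
\frac{1}{\tau}\, V^{*}(x) \;=\; \max_{u}\left[\, r(x, u) \;+\; \frac{\partial V^{*}}{\partial x}\, f(x, u) \,\right]
```

The value-function estimation and policy-improvement algorithms the abstract mentions follow from enforcing this self-consistency condition on an approximated value function \(V\).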
abstract::A fast and accurate computational scheme for simulating nonlinear dynamic systems is presented. The scheme assumes that the system can be represented by a combination of components of only two different types: first-order low-pass filters and static nonlinearities. The parameters of these filters and nonlinearities ma...
journal_title:Neural computation
pub_type: Letter
doi:10.1162/neco.2008.04-07-506
Updated: 2008-07-01 00:00:00
abstract::We investigate a recently proposed model for decision learning in a population of spiking neurons where synaptic plasticity is modulated by a population signal in addition to reward feedback. For the basic model, binary population decision making based on spike/no-spike coding, a detailed computational analysis is giv...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2010.05-09-1010
Updated: 2010-07-01 00:00:00
abstract::Temporal coding is studied for an oscillatory neural network model with synchronization and acceleration. The latter mechanism refers to increasing (decreasing) the phase velocity of each unit for stronger (weaker) or more coherent (decoherent) input from the other units. It has been demonstrated that acceleration gen...
journal_title:Neural computation
pub_type: Letter
doi:10.1162/neco.2008.09-06-342
Updated: 2008-07-01 00:00:00
abstract::We discuss robustness against mislabeling in multiclass labels for classification problems and propose two algorithms of boosting, the normalized Eta-Boost.M and Eta-Boost.M, based on the Eta-divergence. Those two boosting algorithms are closely related to models of mislabeling in which the label is erroneously exchan...
journal_title:Neural computation
pub_type: Letter
doi:10.1162/neco.2007.11-06-400
Updated: 2008-06-01 00:00:00
abstract::Neural networks are often employed as tools in classification tasks. The use of large networks increases the likelihood of the task's being learned, although it may also lead to increased complexity. Pruning is an effective way of reducing the complexity of large networks. We present discriminant components pruning (D...
journal_title:Neural computation
pub_type: Journal Article, Review
doi:10.1162/089976699300016665
Updated: 1999-04-01 00:00:00
abstract::Synchronized firings in the networks of class 1 excitable neurons with excitatory and inhibitory connections are investigated, and their dependences on the forms of interactions are analyzed. As the forms of interactions, we treat the double exponential coupling and the interactions derived from it: pulse coupling, ex...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/0899766053630387
Updated: 2005-06-01 00:00:00
abstract::This article reviews statistical techniques for combining multiple probability distributions. The framework is that of a decision maker who consults several experts regarding some events. The experts express their opinions in the form of probability distributions. The decision maker must aggregate the experts' distrib...
journal_title:Neural computation
pub_type: Journal Article, Review
doi:10.1162/neco.1995.7.5.867
Updated: 1995-09-01 00:00:00
abstract::In learning theory, the training and test sets are assumed to be drawn from the same probability distribution. This assumption is also followed in practical situations, where matching the training and test distributions is considered desirable. Contrary to conventional wisdom, we show that mismatched training and test...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00697
Updated: 2015-02-01 00:00:00
abstract::The bias/variance decomposition of mean-squared error is well understood and relatively straightforward. In this note, a similar simple decomposition is derived, valid for any kind of error measure that, when using the appropriate probability model, can be derived from a Kullback-Leibler divergence or log-likelihood. ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976698300017232
Updated: 1998-07-28 00:00:00
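As background for the abstract above, the well-understood mean-squared-error decomposition it starts from is, in standard textbook notation (generic symbols, not necessarily the note's own):

```latex
\mathbb{E}\!\left[(\hat{f}(x) - y)^{2}\right]
= \underbrace{\left(\mathbb{E}[\hat{f}(x)] - \bar{y}(x)\right)^{2}}_{\text{bias}^{2}}
+ \underbrace{\mathbb{E}\!\left[\left(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\right)^{2}\right]}_{\text{variance}}
+ \underbrace{\mathbb{E}\!\left[(y - \bar{y}(x))^{2}\right]}_{\text{noise}}
```

where \(\bar{y}(x) = \mathbb{E}[y \mid x]\) and the expectation over \(\hat{f}\) is taken over training sets; the cited note generalizes this form to error measures derived from a Kullback-Leibler divergence or log-likelihood.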
abstract::In this letter, we propose a noisy nonlinear version of independent component analysis (ICA). Assuming that the probability density function (p. d. f.) of sources is known, a learning rule is derived based on maximum likelihood estimation (MLE). Our model involves some algorithms of noisy linear ICA (e. g., Bermond & ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/0899766052530866
Updated: 2005-01-01 00:00:00
abstract::We analyze convergence of the expectation maximization (EM) and variational Bayes EM (VBEM) schemes for parameter estimation in noisy linear models. The analysis shows that both schemes are inefficient in the low-noise limit. The linear model with additive noise includes as special cases independent component analysis...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/0899766054322991
Updated: 2005-09-01 00:00:00
abstract::This article presents new procedures for multisite spatiotemporal neuronal data analysis. A new statistical model - the diffusion model - is considered, whose parameters can be estimated from experimental data thanks to mean-field approximations. This work has been applied to optical recording of the guinea pig's audi...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976600300015150
Updated: 2000-08-01 00:00:00