Abstract:
In this letter, we perform a complete and in-depth analysis of Lorentzian noises, such as those arising from Na+ and K+ channel kinetics, in order to identify the source of 1/f^θ-type noise in neurological membranes. We prove that the autocovariance of Lorentzian noise depends solely on the eigenvalues (time constants) of the kinetic matrix but that the Lorentzian weighting coefficients depend entirely on the eigenvectors of this matrix. We then show that there are rotations of the kinetic eigenvectors that send any initial weights to any target weights without altering the time constants. In particular, we show there are target weights for which the resulting Lorentzian noise has an approximately 1/f^θ-type spectrum. We justify these kinetic rotations by introducing a quantum mechanical formulation of membrane stochastics, called hidden quantum activated-measurement models, and prove that these quantum models are probabilistically indistinguishable from the classical hidden Markov models typically used for ion channel stochastics. The quantum dividend obtained by replacing classical with quantum membranes is that rotations of the Lorentzian weights become simple readjustments of the quantum state without any change to the laboratory-determined kinetic and conductance parameters. Moreover, the quantum formalism allows us to model the activation energy of a membrane, and we show that maximizing entropy under constrained activation energy yields the previous 1/f^θ-type Lorentzian weights, in which the spectral exponent θ is a Lagrange multiplier for the energy constraint.
Thus, we provide a plausible neurophysical mechanism by which channel and membrane kinetics can give rise to 1/f^θ-type noise (something that has occasionally been denied in the literature), as well as a realistic and experimentally testable explanation for the numerical values of the spectral exponents. We also discuss applications of quantum membranes beyond 1/f^θ-type noise, including applications to animal models and possible impact on quantum foundations.
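The abstract's central claim, that a suitable choice of Lorentzian weights produces an approximately 1/f^θ-type spectrum, can be illustrated numerically. The sketch below is not the paper's eigenvector-rotation construction; it uses the standard textbook setup in which time constants τ_k are geometrically spaced and the weights w_k ∝ τ_k^(θ−1) are an illustrative assumption:

```python
import numpy as np

theta = 1.0                              # target spectral exponent
taus = np.logspace(-4, 1, 40)            # geometrically spaced time constants
w = taus ** (theta - 1)                  # assumed weights w_k ∝ τ_k^(θ − 1)
w /= w.sum()

# Weighted sum of Lorentzian spectra S(f) = Σ_k w_k τ_k / (1 + (2π f τ_k)²),
# evaluated at frequencies well inside the band set by the τ_k range.
f = np.logspace(-1, 2, 200)
S = sum(wk * tk / (1 + (2 * np.pi * f * tk) ** 2) for wk, tk in zip(w, taus))

# Empirical log-log slope; for this construction it sits close to -theta.
slope = np.polyfit(np.log10(f), np.log10(S), 1)[0]
print(round(slope, 2))
```

With θ = 1 the weights are uniform and the summed spectrum follows 1/f over the mid-band; varying θ in (0, 2) tilts the slope accordingly.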
journal_name:Neural Comput
journal_title:Neural computation
authors:Paris A,Vosoughi A,Berman SA,Atia G
doi:10.1162/NECO_a_01067
subject:Has Abstract
pub_date:2018-07-01 00:00:00
pages:1830-1929
issue:7
eissn:0899-7667
issn:1530-888X
journal_volume:30
pub_type: Journal Article
abstract::As neural activity is transmitted through the nervous system, neuronal noise degrades the encoded information and limits performance. It is therefore important to know how information loss can be prevented. We study this question in the context of neural population codes. Using Fisher information, we show how informat...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00227
update_date:2012-02-01 00:00:00
abstract::This article presents a reinforcement learning framework for continuous-time dynamical systems without a priori discretization of time, state, and action. Based on the Hamilton-Jacobi-Bellman (HJB) equation for infinite-horizon, discounted reward problems, we derive algorithms for estimating value functions and improv...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976600300015961
update_date:2000-01-01 00:00:00
abstract::For any memoryless communication channel with a binary-valued input and a one-dimensional real-valued output, we introduce a probabilistic lower bound on the mutual information given empirical observations on the channel. The bound is built on the Dvoretzky-Kiefer-Wolfowitz inequality and is distribution free. A quadr...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00144
update_date:2011-07-01 00:00:00
abstract::This article addresses the relationship between long-term reward predictions and slow-timescale neural activity in temporal difference (TD) models of the dopamine system. Such models attempt to explain how the activity of dopamine (DA) neurons relates to errors in the prediction of future rewards. Previous models have...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976602760407973
update_date:2002-11-01 00:00:00
abstract::We explicitly analyze the trajectories of learning near singularities in hierarchical networks, such as multilayer perceptrons and radial basis function networks, which include permutation symmetry of hidden nodes, and show their general properties. Such symmetry induces singularities in their parameter space, where t...
journal_title:Neural computation
pub_type: Letter
doi:10.1162/neco.2007.12-06-414
update_date:2008-03-01 00:00:00
abstract::In considering a statistical model selection of neural networks and radial basis functions under an overrealizable case, the problem of unidentifiability emerges. Because the model selection criterion is an unbiased estimator of the generalization error based on the training error, this article analyzes the expected t...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976602760128090
update_date:2002-08-01 00:00:00
abstract::Intracortical brain computer interfaces can enable individuals with paralysis to control external devices through voluntarily modulated brain activity. Decoding quality has been previously shown to degrade with signal nonstationarities-specifically, the changes in the statistics of the data between training and testin...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco_a_01129
update_date:2018-11-01 00:00:00
abstract::We study the emergence of synchronized burst activity in networks of neurons with spike adaptation. We show that networks of tonically firing adapting excitatory neurons can evolve to a state where the neurons burst in a synchronized manner. The mechanism leading to this burst activity is analyzed in a network of inte...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/08997660151134280
update_date:2001-05-01 00:00:00
abstract::Recent studies have employed simple linear dynamical systems to model trial-by-trial dynamics in various sensorimotor learning tasks. Here we explore the theoretical and practical considerations that arise when employing the general class of linear dynamical systems (LDS) as a model for sensorimotor learning. In this ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976606775774651
update_date:2006-04-01 00:00:00
abstract::The nu-support vector machine (nu-SVM) for classification proposed by Schölkopf, Smola, Williamson, and Bartlett (2000) has the advantage of using a parameter nu on controlling the number of support vectors. In this article, we investigate the relation between nu-SVM and C-SVM in detail. We show that in general they a...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976601750399335
update_date:2001-09-01 00:00:00
abstract::The mutual information between a set of stimuli and the elicited neural responses is compared to the corresponding decoded information. The decoding procedure is presented as an artificial distortion of the joint probabilities between stimuli and responses. The information loss is quantified. Whenever the probabilitie...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976602317318947
update_date:2002-04-01 00:00:00
abstract::The brain is known to be active even when not performing any overt cognitive tasks, and often it engages in involuntary mind wandering. This resting state has been extensively characterized in terms of fMRI-derived brain networks. However, an alternate method has recently gained popularity: EEG microstate analysis. Pr...
journal_title:Neural computation
pub_type: Letter
doi:10.1162/neco_a_01229
update_date:2019-11-01 00:00:00
abstract::We study the effect of competition between short-term synaptic depression and facilitation on the dynamic properties of attractor neural networks, using Monte Carlo simulation and a mean-field analysis. Depending on the balance of depression, facilitation, and the underlying noise, the network displays different behav...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2007.19.10.2739
update_date:2007-10-01 00:00:00
abstract::Neurons perform computations, and convey the results of those computations through the statistical structure of their output spike trains. Here we present a practical method, grounded in the information-theoretic analysis of prediction, for inferring a minimal representation of that structure and for characterizing it...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2009.12-07-678
update_date:2010-01-01 00:00:00
abstract::Humans learn categories of complex objects quickly and from a few examples. Random projection has been suggested as a means to learn and categorize efficiently. We investigate how random projection affects categorization by humans and by very simple neural networks on the same stimuli and categorization tasks, and how...
journal_title:Neural computation
pub_type: Letter
doi:10.1162/NECO_a_00769
update_date:2015-10-01 00:00:00
abstract::Physiological signals such as neural spikes and heartbeats are discrete events in time, driven by continuous underlying systems. A recently introduced data-driven model to analyze such a system is a state-space model with point process observations, parameters of which and the underlying state sequence are simultaneou...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2010.07-09-1047
update_date:2010-08-01 00:00:00
abstract::When subjects adapt their reaching movements in the setting of a systematic force or visual perturbation, generalization of adaptation can be assessed psychophysically in two ways: by testing untrained locations in the work space at the end of adaptation (slow postadaptation generalization) or by determining the influ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00262
update_date:2012-04-01 00:00:00
abstract::Although the commonly used quadratic Hebbian-anti-Hebbian rules lead to successful models of plasticity and learning, they are inconsistent with neurophysiology. Other rules, more physiologically plausible, fail to specify the biological mechanism of bidirectionality and the biological mechanism that prevents synapses...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976698300017629
update_date:1998-04-01 00:00:00
abstract::We outline a hybrid analog-digital scheme for computing with three important features that enable it to scale to systems of large complexity: First, like digital computation, which uses several one-bit precise logical units to collectively compute a precise answer to a computation, the hybrid scheme uses several moder...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976602320263971
update_date:2002-09-01 00:00:00
abstract::Inspired by recent studies regarding dendritic computation, we constructed a recurrent neural network model incorporating dendritic lateral inhibition. Our model consists of an input layer and a neuron layer that includes excitatory cells and an inhibitory cell; this inhibitory cell is activated by the pooled activiti...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2007.19.7.1798
update_date:2007-07-01 00:00:00
abstract::The Nyström method is a well-known sampling-based technique for approximating the eigensystem of large kernel matrices. However, the chosen samples in the Nyström method are all assumed to be of equal importance, which deviates from the integral equation that defines the kernel eigenfunctions. Motivated by this observ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2008.11-07-651
update_date:2009-01-01 00:00:00
abstract::The dynamic formation of groups of neurons--neuronal assemblies--is believed to mediate cognitive phenomena at many levels, but their detailed operation and mechanisms of interaction are still to be uncovered. One hypothesis suggests that synchronized oscillations underpin their formation and functioning, with a focus...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00502
update_date:2013-11-01 00:00:00
abstract::This article studies a general theory of estimating functions of independent component analysis when the independent source signals are temporarily correlated. Estimating functions are used for deriving both batch and on-line learning algorithms, and they are applicable to blind cases where spatial and temporal probab...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976600300015079
update_date:2000-09-01 00:00:00
abstract::We propose a scalable semiparametric Bayesian model to capture dependencies among multiple neurons by detecting their cofiring (possibly with some lag time) patterns over time. After discretizing time so there is at most one spike at each interval, the resulting sequence of 1s (spike) and 0s (silence) for each neuron ...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00631
update_date:2014-09-01 00:00:00
abstract::Theories of learning and generalization hold that the generalization bias, defined as the difference between the training error and the generalization error, increases on average with the number of adaptive parameters. This article, however, shows that this general tendency is violated for a gaussian mixture model. Fo...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976600300015439
update_date:2000-06-01 00:00:00
abstract::We present a reduction of a Hodgkin-Huxley (HH)--style bursting model to a hybridized integrate-and-fire (IF) formalism based on a thorough bifurcation analysis of the neuron's dynamics. The model incorporates HH--style equations to evolve the subthreshold currents and includes IF mechanisms to characterize spike even...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976603322518768
update_date:2003-12-01 00:00:00
abstract::Much experimental evidence suggests that during decision making, neural circuits accumulate evidence supporting alternative options. A computational model well describing this accumulation for choices between two options assumes that the brain integrates the log ratios of the likelihoods of the sensory inputs given th...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/NECO_a_00917
update_date:2017-02-01 00:00:00
abstract::A single-layered Hough transform network is proposed that accepts image coordinates of each object pixel as input and produces a set of outputs that indicate the belongingness of the pixel to a particular structure (e.g., a straight line). The network is able to learn adaptively the parametric forms of the linear segm...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/089976601300014501
update_date:2001-03-01 00:00:00
abstract::Based on the dopamine hypotheses of cocaine addiction and the assumption of decrement of brain reward system sensitivity after long-term drug exposure, we propose a computational model for cocaine addiction. Utilizing average reward temporal difference reinforcement learning, we incorporate the elevation of basal rewa...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/neco.2009.10-08-882
update_date:2009-10-01 00:00:00
abstract::Inner-product operators, often referred to as kernels in statistical learning, define a mapping from some input space into a feature space. The focus of this letter is the construction of biologically motivated kernels for cortical activities. The kernels we derive, termed Spikernels, map spike count sequences into an...
journal_title:Neural computation
pub_type: Journal Article
doi:10.1162/0899766053019944
update_date:2005-03-01 00:00:00