Scalable Semisupervised Functional Neurocartography Reveals Canonical Neurons in Behavioral Networks.

Abstract:

Large-scale data collection efforts to map the brain are underway at multiple spatial and temporal scales, but all face fundamental problems posed by high-dimensional data and intersubject variability. Even seemingly simple problems, such as identifying a neuron/brain region across animals/subjects, become exponentially more difficult in high dimensions, for example, when recognizing dozens of neurons/brain regions simultaneously. We present a framework and tools for functional neurocartography, the large-scale mapping of neural activity during behavioral states. Using a voltage-sensitive dye (VSD), we imaged the multifunctional responses of hundreds of leech neurons during several behaviors to identify and functionally map homologous neurons. We extracted simple features from each of these behaviors and combined them with anatomical features to create a rich medium-dimensional feature space. This enabled us to use machine learning techniques and visualizations to characterize and account for intersubject variability, piece together a canonical atlas of neural activity, and identify two behavioral networks. We identified 39 neurons (18 pairs, 3 unpaired) as part of a canonical swim network and 17 neurons (8 pairs, 1 unpaired) involved in a partially overlapping preparatory network. All neurons in the preparatory network rapidly depolarized at the onset of each behavior, suggesting that it is part of a dedicated rapid-response network. This network is likely mediated by the S cell, and we referenced VSD recordings to an activity atlas to identify multiple cells of interest simultaneously in real time for further experiments. We targeted and electrophysiologically verified several neurons in the swim network and further showed that the S cell is presynaptic to multiple neurons in the preparatory network.
This study illustrates a basic framework for mapping neural activity in high dimensions with large-scale recordings, and for extracting the rich information necessary to perform analyses in light of intersubject variability.

journal_name

Neural Comput

journal_title

Neural computation

authors

Frady EP,Kapoor A,Horvitz E,Kristan WB Jr

doi

10.1162/NECO_a_00852

subject

Has Abstract

pub_date

2016-08-01 00:00:00

pages

1453-97

issue

8

eissn

1530-888X

issn

0899-7667

journal_volume

28

pub_type

Journal Article
  • Analysis of cluttered scenes using an elastic matching approach for stereo images.

    abstract::We present a system for the automatic interpretation of cluttered scenes containing multiple partly occluded objects in front of unknown, complex backgrounds. The system is based on an extended elastic graph matching algorithm that allows the explicit modeling of partial occlusions. Our approach extends an earlier sys...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2006.18.6.1441

    authors: Eckes C,Triesch J,von der Malsburg C

    update_date: 2006-06-01 00:00:00

  • Kernels for longitudinal data with variable sequence length and sampling intervals.

    abstract::We develop several kernel methods for classification of longitudinal data and apply them to detect cognitive decline in the elderly. We first develop mixed-effects models, a type of hierarchical empirical Bayes generative models, for the time series. After demonstrating their utility in likelihood ratio classifiers (a...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00164

    authors: Lu Z,Leen TK,Kaye J

    update_date: 2011-09-01 00:00:00

  • Active Learning for Enumerating Local Minima Based on Gaussian Process Derivatives.

    abstract::We study active learning (AL) based on gaussian processes (GPs) for efficiently enumerating all of the local minimum solutions of a black-box function. This problem is challenging because local solutions are characterized by their zero gradient and positive-definite Hessian properties, but those derivatives cannot be ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01307

    authors: Inatsu Y,Sugita D,Toyoura K,Takeuchi I

    update_date: 2020-10-01 00:00:00

  • General Poisson exact breakdown of the mutual information to study the role of correlations in populations of neurons.

    abstract::We present an integrative formalism of mutual information expansion, the general Poisson exact breakdown, which explicitly evaluates the informational contribution of correlations in the spike counts both between and within neurons. The formalism was validated on simulated data and applied to real neurons recorded fro...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2010.04-09-989

    authors: Scaglione A,Moxon KA,Foffani G

    update_date: 2010-06-01 00:00:00

  • Synchrony in heterogeneous networks of spiking neurons.

    abstract::The emergence of synchrony in the activity of large, heterogeneous networks of spiking neurons is investigated. We define the robustness of synchrony by the critical disorder at which the asynchronous state becomes linearly unstable. We show that at low firing rates, synchrony is more robust in excitatory networks tha...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976600300015286

    authors: Neltner L,Hansel D,Mato G,Meunier C

    update_date: 2000-07-01 00:00:00

  • Changes in GABAB modulation during a theta cycle may be analogous to the fall of temperature during annealing.

    abstract::Changes in GABA modulation may underlie experimentally observed changes in the strength of synaptic transmission at different phases of the theta rhythm (Wyble, Linster, & Hasselmo, 1997). Analysis demonstrates that these changes improve sequence disambiguation by a neural network model of CA3. We show that in the fra...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976698300017539

    authors: Sohal VS,Hasselmo ME

    update_date: 1998-05-15 00:00:00

  • Machine Learning: Deepest Learning as Statistical Data Assimilation Problems.

    abstract::We formulate an equivalence between machine learning and the formulation of statistical data assimilation as used widely in physical and biological sciences. The correspondence is that layer number in a feedforward artificial network setting is the analog of time in the data assimilation setting. This connection has b...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01094

    authors: Abarbanel HDI,Rozdeba PJ,Shirman S

    update_date: 2018-08-01 00:00:00

  • Sparse coding on the spot: spontaneous retinal waves suffice for orientation selectivity.

    abstract::Ohshiro, Hussain, and Weliky (2011) recently showed that ferrets reared with exposure to flickering spot stimuli, in the absence of oriented visual experience, develop oriented receptive fields. They interpreted this as refutation of efficient coding models, which require oriented input in order to develop oriented re...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00333

    authors: Hunt JJ,Ibbotson M,Goodhill GJ

    update_date: 2012-09-01 00:00:00

  • Alignment of coexisting cortical maps in a motor control model.

    abstract::How do multiple feature maps that coexist in the same region of cerebral cortex align with each other? We hypothesize that such alignment is governed by temporal correlations: features in one map that are temporally correlated with those in another come to occupy the same spatial locations in cortex over time. To exam...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.1996.8.4.731

    authors: Chen Y,Reggia JA

    update_date: 1996-05-15 00:00:00

  • Synchrony of neuronal oscillations controlled by GABAergic reversal potentials.

    abstract::GABAergic synapse reversal potential is controlled by the concentration of chloride. This concentration can change significantly during development and as a function of neuronal activity. Thus, GABA inhibition can be hyperpolarizing, shunting, or partially depolarizing. Previous results pinpointed the conditions under...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2007.19.3.706

    authors: Jeong HY,Gutkin B

    update_date: 2007-03-01 00:00:00

  • Toward a biophysically plausible bidirectional Hebbian rule.

    abstract::Although the commonly used quadratic Hebbian-anti-Hebbian rules lead to successful models of plasticity and learning, they are inconsistent with neurophysiology. Other rules, more physiologically plausible, fail to specify the biological mechanism of bidirectionality and the biological mechanism that prevents synapses...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976698300017629

    authors: Grzywacz NM,Burgi PY

    update_date: 1998-04-01 00:00:00

  • A simple Hebbian/anti-Hebbian network learns the sparse, independent components of natural images.

    abstract::Slightly modified versions of an early Hebbian/anti-Hebbian neural network are shown to be capable of extracting the sparse, independent linear components of a prefiltered natural image set. An explanation for this capability in terms of a coupling between two hypothetical networks is presented. The simple networks pr...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976606775093891

    authors: Falconbridge MS,Stamps RL,Badcock DR

    update_date: 2006-02-01 00:00:00

  • Selectivity and stability via dendritic nonlinearity.

    abstract::Inspired by recent studies regarding dendritic computation, we constructed a recurrent neural network model incorporating dendritic lateral inhibition. Our model consists of an input layer and a neuron layer that includes excitatory cells and an inhibitory cell; this inhibitory cell is activated by the pooled activiti...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2007.19.7.1798

    authors: Morita K,Okada M,Aihara K

    update_date: 2007-07-01 00:00:00

  • Nonlinear complex-valued extensions of Hebbian learning: an essay.

    abstract::The Hebbian paradigm is perhaps the best-known unsupervised learning theory in connectionism. It has inspired wide research activity in the artificial neural network field because it embodies some interesting properties such as locality and the capability of being applicable to the basic weight-and-sum structure of ne...

    journal_title:Neural computation

    pub_type: Journal Article, Review

    doi:10.1162/0899766053429381

    authors: Fiori S

    update_date: 2005-04-01 00:00:00

  • Online Reinforcement Learning Using a Probability Density Estimation.

    abstract::Function approximation in online, incremental, reinforcement learning needs to deal with two fundamental problems: biased sampling and nonstationarity. In this kind of task, biased sampling occurs because samples are obtained from specific trajectories dictated by the dynamics of the environment and are usually concen...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00906

    authors: Agostini A,Celaya E

    update_date: 2017-01-01 00:00:00

  • A Resource-Allocating Network for Function Interpolation.

    abstract::We have created a network that allocates a new computational unit whenever an unusual pattern is presented to the network. This network forms compact representations, yet learns easily and rapidly. The network can be used at any time in the learning process and the learning patterns do not have to be repeated. The uni...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.1991.3.2.213

    authors: Platt J

    update_date: 1991-07-01 00:00:00

  • Range-based ICA using a nonsmooth quasi-newton optimizer for electroencephalographic source localization in focal epilepsy.

    abstract::Independent component analysis (ICA) aims at separating a multivariate signal into independent nongaussian signals by optimizing a contrast function with no knowledge on the mixing mechanism. Despite the availability of a constellation of contrast functions, a Hartley-entropy-based ICA contrast endowed with the discri...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00700

    authors: Selvan SE,George ST,Balakrishnan R

    update_date: 2015-03-01 00:00:00

  • Constraint on the number of synaptic inputs to a visual cortical neuron controls receptive field formation.

    abstract::To date, Hebbian learning combined with some form of constraint on synaptic inputs has been demonstrated to describe well the development of neural networks. The previous models revealed mathematically the importance of synaptic constraints to reproduce orientation selectivity in the visual cortical neurons, but biolo...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2009.04-08-752

    authors: Tanaka S,Miyashita M

    update_date: 2009-09-01 00:00:00

  • Mean First Passage Memory Lifetimes by Reducing Complex Synapses to Simple Synapses.

    abstract::Memory models that store new memories by forgetting old ones have memory lifetimes that are rather short and grow only logarithmically in the number of synapses. Attempts to overcome these deficits include "complex" models of synaptic plasticity in which synapses possess internal states governing the expression of syn...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00956

    authors: Elliott T

    update_date: 2017-06-01 00:00:00

  • Analyzing and Accelerating the Bottlenecks of Training Deep SNNs With Backpropagation.

    abstract::Spiking neural networks (SNNs) with the event-driven manner of transmitting spikes consume ultra-low power on neuromorphic chips. However, training deep SNNs is still challenging compared to convolutional neural networks (CNNs). The SNN training algorithms have not achieved the same performance as CNNs. In this letter...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01319

    authors: Chen R,Li L

    update_date: 2020-12-01 00:00:00

  • Supervised learning in a recurrent network of rate-model neurons exhibiting frequency adaptation.

    abstract::For gradient descent learning to yield connectivity consistent with real biological networks, the simulated neurons would have to include more realistic intrinsic properties such as frequency adaptation. However, gradient descent learning cannot be used straightforwardly with adapting rate-model neurons because the de...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/0899766054323017

    authors: Fortier PA,Guigon E,Burnod Y

    update_date: 2005-09-01 00:00:00

  • Approximation by fully complex multilayer perceptrons.

    abstract::We investigate the approximation ability of a multilayer perceptron (MLP) network when it is extended to the complex domain. The main challenge for processing complex data with neural networks has been the lack of bounded and analytic complex nonlinear activation functions in the complex domain, as stated by Liouville...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976603321891846

    authors: Kim T,Adali T

    update_date: 2003-07-01 00:00:00

  • Robust Closed-Loop Control of a Cursor in a Person with Tetraplegia using Gaussian Process Regression.

    abstract::Intracortical brain computer interfaces can enable individuals with paralysis to control external devices through voluntarily modulated brain activity. Decoding quality has been previously shown to degrade with signal nonstationarities-specifically, the changes in the statistics of the data between training and testin...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01129

    authors: Brandman DM,Burkhart MC,Kelemen J,Franco B,Harrison MT,Hochberg LR

    update_date: 2018-11-01 00:00:00

  • Fast population coding.

    abstract::Uncertainty coming from the noise in its neurons and the ill-posed nature of many tasks plagues neural computations. Maybe surprisingly, many studies show that the brain manipulates these forms of uncertainty in a probabilistically consistent and normative manner, and there is now a rich theoretical literature on the ...

    journal_title:Neural computation

    pub_type: Letter

    doi:10.1162/neco.2007.19.2.404

    authors: Huys QJ,Zemel RS,Natarajan R,Dayan P

    update_date: 2007-02-01 00:00:00

  • The Discriminative Kalman Filter for Bayesian Filtering with Nonlinear and Nongaussian Observation Models.

    abstract::The Kalman filter provides a simple and efficient algorithm to compute the posterior distribution for state-space models where both the latent state and measurement models are linear and gaussian. Extensions to the Kalman filter, including the extended and unscented Kalman filters, incorporate linearizations for model...

    journal_title:Neural computation

    pub_type: Letter

    doi:10.1162/neco_a_01275

    authors: Burkhart MC,Brandman DM,Franco B,Hochberg LR,Harrison MT

    update_date: 2020-05-01 00:00:00

  • Learning Hough transform: a neural network model.

    abstract::A single-layered Hough transform network is proposed that accepts image coordinates of each object pixel as input and produces a set of outputs that indicate the belongingness of the pixel to a particular structure (e.g., a straight line). The network is able to learn adaptively the parametric forms of the linear segm...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976601300014501

    authors: Basak J

    update_date: 2001-03-01 00:00:00

  • Whence the Expected Free Energy?

    abstract::The expected free energy (EFE) is a central quantity in the theory of active inference. It is the quantity that all active inference agents are mandated to minimize through action, and its decomposition into extrinsic and intrinsic value terms is key to the balance of exploration and exploitation that active inference...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01354

    authors: Millidge B,Tschantz A,Buckley CL

    update_date: 2021-01-05 00:00:00

  • On the slow convergence of EM and VBEM in low-noise linear models.

    abstract::We analyze convergence of the expectation maximization (EM) and variational Bayes EM (VBEM) schemes for parameter estimation in noisy linear models. The analysis shows that both schemes are inefficient in the low-noise limit. The linear model with additive noise includes as special cases independent component analysis...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/0899766054322991

    authors: Petersen KB,Winther O,Hansen LK

    update_date: 2005-09-01 00:00:00

  • Bayesian active learning of neural firing rate maps with transformed gaussian process priors.

    abstract::A firing rate map, also known as a tuning curve, describes the nonlinear relationship between a neuron's spike rate and a low-dimensional stimulus (e.g., orientation, head direction, contrast, color). Here we investigate Bayesian active learning methods for estimating firing rate maps in closed-loop neurophysiology ex...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00615

    authors: Park M,Weller JP,Horwitz GD,Pillow JW

    update_date: 2014-08-01 00:00:00

  • A first-order nonhomogeneous Markov model for the response of spiking neurons stimulated by small phase-continuous signals.

    abstract::We present a first-order nonhomogeneous Markov model for the interspike-interval density of a continuously stimulated spiking neuron. The model allows the conditional interspike-interval density and the stationary interspike-interval density to be expressed as products of two separate functions, one of which describes...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2009.06-07-548

    authors: Tapson J,Jin C,van Schaik A,Etienne-Cummings R

    update_date: 2009-06-01 00:00:00