Correlational Neural Networks.

Abstract:

Common representation learning (CRL), wherein different descriptions (or views) of the data are embedded in a common subspace, has been receiving a lot of attention recently. Two popular paradigms here are canonical correlation analysis (CCA)-based approaches and autoencoder (AE)-based approaches. CCA-based approaches learn a joint representation by maximizing correlation of the views when projected to the common subspace. AE-based methods learn a common representation by minimizing the error of reconstructing the two views. Each of these approaches has its own advantages and disadvantages. For example, while CCA-based approaches outperform AE-based approaches for the task of transfer learning, they are not as scalable as the latter. In this work, we propose an AE-based approach, correlational neural network (CorrNet), that explicitly maximizes correlation among the views when projected to the common subspace. Through a series of experiments, we demonstrate that the proposed CorrNet is better than AE and CCA with respect to its ability to learn correlated common representations. We employ CorrNet for several cross-language tasks and show that the representations learned using it perform better than the ones learned using other state-of-the-art approaches.
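
As a concrete illustration of the objective described above, the sketch below evaluates a CorrNet-style loss for two views: each view (and their combination) is projected to a shared space, both views are reconstructed from each projection, and the correlation between the two projected views is maximized. All shapes, weights, and the trade-off weight lam are illustrative assumptions, not values from the paper.

    # Minimal NumPy sketch of a CorrNet-style objective for two views X and Y.
    # Encoder/decoder weights, dimensions, and lam are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    n, dx, dy, k = 200, 40, 30, 10             # samples, view dims, shared dim
    X = rng.normal(size=(n, dx))
    Y = rng.normal(size=(n, dy))

    Wx = rng.normal(scale=0.1, size=(dx, k))   # encoder for view 1
    Wy = rng.normal(scale=0.1, size=(dy, k))   # encoder for view 2
    Vx = rng.normal(scale=0.1, size=(k, dx))   # decoder back to view 1
    Vy = rng.normal(scale=0.1, size=(k, dy))   # decoder back to view 2
    lam = 1.0                                  # weight on the correlation term

    def corrnet_loss(X, Y):
        Hx, Hy = np.tanh(X @ Wx), np.tanh(Y @ Wy)   # per-view projections
        H = np.tanh(X @ Wx + Y @ Wy)                # joint projection of both views
        # Reconstruct both views from the joint and from each single-view projection.
        recon = sum(np.mean((A @ Vx - X) ** 2) + np.mean((A @ Vy - Y) ** 2)
                    for A in (H, Hx, Hy))
        # Correlation between the two projected views, summed over shared dimensions.
        Hx_c, Hy_c = Hx - Hx.mean(0), Hy - Hy.mean(0)
        corr = np.sum((Hx_c * Hy_c).sum(0) /
                      (np.sqrt((Hx_c ** 2).sum(0) * (Hy_c ** 2).sum(0)) + 1e-8))
        return recon - lam * corr   # minimize reconstruction error, maximize correlation

    print(corrnet_loss(X, Y))

In an actual AE-based model the encoders and decoders would be trained jointly by gradient descent on an objective of this kind; the sketch only evaluates it at random weights.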

journal_name

Neural Comput

journal_title

Neural computation

authors

Chandar S,Khapra MM,Larochelle H,Ravindran B

doi

10.1162/NECO_a_00801

pub_date

2016-02-01

pages

257-285

issue

2

eissn

1530-888X

issn

0899-7667

journal_volume

28

pub_type

Journal Article

Related articles in Neural computation:
  • Statistical computer model analysis of the reciprocal and recurrent inhibitions of the Ia-EPSP in α-motoneurons.

    abstract::We simulate the inhibition of Ia-glutamatergic excitatory postsynaptic potential (EPSP) by preceding it with glycinergic recurrent (REN) and reciprocal (REC) inhibitory postsynaptic potentials (IPSPs). The inhibition is evaluated in the presence of voltage-dependent conductances of sodium, delayed rectifier potassium,...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00375

    authors: Gradwohl G,Grossman Y

    update_date: 2013-01-01

  • Computation in a single neuron: Hodgkin and Huxley revisited.

    abstract::A spiking neuron "computes" by transforming a complex dynamical input into a train of action potentials, or spikes. The computation performed by the neuron can be formulated as dimensional reduction, or feature detection, followed by a nonlinear decision function over the low-dimensional space. Generalizations of the ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/08997660360675017

    authors: Agüera y Arcas B,Fairhall AL,Bialek W

    update_date: 2003-08-01

  • Characterization of minimum error linear coding with sensory and neural noise.

    abstract::Robust coding has been proposed as a solution to the problem of minimizing decoding error in the presence of neural noise. Many real-world problems, however, have degradation in the input signal, not just in neural representations. This generalized problem is more relevant to biological sensory coding where internal n...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00181

    authors: Doi E,Lewicki MS

    update_date: 2011-10-01

  • Clustering based on gaussian processes.

    abstract::In this letter, we develop a gaussian process model for clustering. The variances of predictive values in gaussian processes learned from a training data are shown to comprise an estimate of the support of a probability density function. The constructed variance function is then applied to construct a set of contours ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2007.19.11.3088

    authors: Kim HC,Lee J

    update_date: 2007-11-01

  • Neural coding: higher-order temporal patterns in the neurostatistics of cell assemblies.

    abstract::Recent advances in the technology of multiunit recordings make it possible to test Hebb's hypothesis that neurons do not function in isolation but are organized in assemblies. This has created the need for statistical approaches to detecting the presence of spatiotemporal patterns of more than two neurons in neuron sp...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976600300014872

    authors: Martignon L,Deco G,Laskey K,Diamond M,Freiwald W,Vaadia E

    update_date: 2000-11-01

  • Neutral stability, rate propagation, and critical branching in feedforward networks.

    abstract::Recent experimental and computational evidence suggests that several dynamical properties may characterize the operating point of functioning neural networks: critical branching, neutral stability, and production of a wide range of firing patterns. We seek the simplest setting in which these properties emerge, clarify...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00461

    authors: Cayco-Gajic NA,Shea-Brown E

    update_date: 2013-07-01

  • A unifying view of wiener and volterra theory and polynomial kernel regression.

    abstract::Volterra and Wiener series are perhaps the best-understood nonlinear system representations in signal processing. Although both approaches have enjoyed a certain popularity in the past, their application has been limited to rather low-dimensional and weakly nonlinear systems due to the exponential growth of the number...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2006.18.12.3097

    authors: Franz MO,Schölkopf B

    update_date: 2006-12-01

  • Estimating a state-space model from point process observations: a note on convergence.

    abstract::Physiological signals such as neural spikes and heartbeats are discrete events in time, driven by continuous underlying systems. A recently introduced data-driven model to analyze such a system is a state-space model with point process observations, parameters of which and the underlying state sequence are simultaneou...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2010.07-09-1047

    authors: Yuan K,Niranjan M

    update_date: 2010-08-01

  • Temporal coding: assembly formation through constructive interference.

    abstract::Temporal coding is studied for an oscillatory neural network model with synchronization and acceleration. The latter mechanism refers to increasing (decreasing) the phase velocity of each unit for stronger (weaker) or more coherent (decoherent) input from the other units. It has been demonstrated that acceleration gen...

    journal_title:Neural computation

    pub_type: Letter

    doi:10.1162/neco.2008.09-06-342

    authors: Burwick T

    update_date: 2008-07-01

  • Determining Burst Firing Time Distributions from Multiple Spike Trains.

    abstract::Recent experimental findings have shown the presence of robust and cell-type-specific intraburst firing patterns in bursting neurons. We address the problem of characterizing these patterns under the assumption that the bursts exhibit well-defined firing time distributions. We propose a method for estimating these dis...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2008.07-07-571

    authors: Lago-Fernández LF,Szücs A,Varona P

    update_date: 2009-04-01

  • Traveling waves of excitation in neural field models: equivalence of rate descriptions and integrate-and-fire dynamics.

    abstract::Field models provide an elegant mathematical framework to analyze large-scale patterns of neural activity. On the microscopic level, these models are usually based on either a firing-rate picture or integrate-and-fire dynamics. This article shows that in spite of the large conceptual differences between the two types ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/08997660260028656

    authors: Cremers D,Herz AV

    update_date: 2002-07-01

  • A Gaussian attractor network for memory and recognition with experience-dependent learning.

    abstract::Attractor networks are widely believed to underlie the memory systems of animals across different species. Existing models have succeeded in qualitatively modeling properties of attractor dynamics, but their computational abilities often suffer from poor representations for realistic complex patterns, spurious attract...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2010.02-09-957

    authors: Hu X,Zhang B

    update_date: 2010-05-01

  • Density-weighted Nyström method for computing large kernel eigensystems.

    abstract::The Nyström method is a well-known sampling-based technique for approximating the eigensystem of large kernel matrices. However, the chosen samples in the Nyström method are all assumed to be of equal importance, which deviates from the integral equation that defines the kernel eigenfunctions. Motivated by this observ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2008.11-07-651

    authors: Zhang K,Kwok JT

    update_date: 2009-01-01
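
    As background for the entry above, the sketch below shows the standard (equal-weight) Nyström approximation that the density-weighted variant revisits: eigenpairs of a small landmark kernel matrix are extended to approximate the leading eigensystem of the full kernel matrix. The kernel, sample size, and data are illustrative assumptions.

        # Standard (unweighted) Nystrom approximation of a kernel eigensystem.
        # Data, kernel, and landmark count are illustrative.
        import numpy as np

        def rbf(A, B, gamma=0.5):
            # Gaussian (RBF) kernel matrix between row sets A and B.
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 5))                    # full data set
        idx = rng.choice(len(X), size=50, replace=False)  # equally weighted landmarks
        Knm = rbf(X, X[idx])                              # n x m cross-kernel block
        Kmm = rbf(X[idx], X[idx])                         # m x m landmark kernel

        # Eigendecompose the small landmark matrix, keep the top r components,
        # and extend them to approximate the leading eigenpairs of the full kernel.
        evals, evecs = np.linalg.eigh(Kmm)                # ascending order
        evals, evecs = evals[::-1], evecs[:, ::-1]
        r, m, n = 10, len(idx), len(X)
        approx_evals = (n / m) * evals[:r]
        approx_evecs = np.sqrt(m / n) * (Knm @ evecs[:, :r]) / evals[:r]
        print(approx_evals)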

  • Optimal sequential detection of stimuli from multiunit recordings taken in densely populated brain regions.

    abstract::We address the problem of detecting the presence of a recurring stimulus by monitoring the voltage on a multiunit electrode located in a brain region densely populated by stimulus reactive neurons. Published experimental results suggest that under these conditions, when a stimulus is present, the measurements are gaus...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00257

    authors: Nossenson N,Messer H

    update_date: 2012-04-01

  • Employing the zeta-transform to optimize the calculation of the synaptic conductance of NMDA and other synaptic channels in network simulations.

    abstract::Calculation of the total conductance change induced by multiple synapses at a given membrane compartment remains one of the most time-consuming processes in biophysically realistic neural network simulations. Here we show that this calculation can be achieved in a highly efficient way even for multiply converging syna...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976698300017061

    authors: Köhn J,Wörgötter F

    update_date: 1998-10-01

  • NMDA Receptor Alterations After Mild Traumatic Brain Injury Induce Deficits in Memory Acquisition and Recall.

    abstract::Mild traumatic brain injury (mTBI) presents a significant health concern with potential persisting deficits that can last decades. Although a growing body of literature improves our understanding of the brain network response and corresponding underlying cellular alterations after injury, the effects of cellular disru...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01343

    authors: Gabrieli D,Schumm SN,Vigilante NF,Meaney DF

    update_date: 2021-01-01

  • When Not to Classify: Anomaly Detection of Attacks (ADA) on DNN Classifiers at Test Time.

    abstract::A significant threat to the recent, wide deployment of machine learning-based systems, including deep neural networks (DNNs), is adversarial learning attacks. The main focus here is on evasion attacks against DNN-based classifiers at test time. While much work has focused on devising attacks that make small perturbati...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01209

    authors: Miller D,Wang Y,Kesidis G

    update_date: 2019-08-01

  • State-Space Representations of Deep Neural Networks.

    abstract::This letter deals with neural networks as dynamical systems governed by finite difference equations. It shows that the introduction of k -many skip connections into network architectures, such as residual networks and additive dense n...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01165

    authors: Hauser M,Gunn S,Saab S Jr,Ray A

    update_date: 2019-03-01
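
    The state-space reading in the entry above can be made concrete with a toy residual stack: each skip connection turns a layer into the first-order difference equation x_{k+1} = x_k + f_k(x_k), with depth playing the role of time. The widths and weights below are illustrative, not the paper's architecture.

        # Toy illustration: a stack of residual blocks read as a finite difference
        # equation (forward-Euler-like update). Widths and weights are illustrative.
        import numpy as np

        rng = np.random.default_rng(0)
        width, depth = 16, 8
        Ws = [rng.normal(scale=0.1, size=(width, width)) for _ in range(depth)]

        def residual_forward(x0):
            x = x0
            for W in Ws:                  # layer index k plays the role of "time"
                x = x + np.tanh(x @ W)    # skip connection = identity + update f_k(x_k)
                # equivalently: x_{k+1} - x_k = f_k(x_k), a first-order difference equation
            return x

        x0 = rng.normal(size=(1, width))
        print(residual_forward(x0).shape)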

  • Evaluating auditory performance limits: II. One-parameter discrimination with random-level variation.

    abstract::Previous studies have combined analytical models of stochastic neural responses with signal detection theory (SDT) to predict psychophysical performance limits; however, these studies have typically been limited to simple models and simple psychophysical tasks. A companion article in this issue ("Evaluating Auditory P...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976601750541813

    authors: Heinz MG,Colburn HS,Carney LH

    update_date: 2001-10-01

  • Training nu-support vector classifiers: theory and algorithms.

    abstract::The nu-support vector machine (nu-SVM) for classification proposed by Schölkopf, Smola, Williamson, and Bartlett (2000) has the advantage of using a parameter nu on controlling the number of support vectors. In this article, we investigate the relation between nu-SVM and C-SVM in detail. We show that in general they a...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976601750399335

    authors: Chang CC,Lin CJ

    update_date: 2001-09-01
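
    The nu-SVM/C-SVM relation studied in the entry above can be probed empirically with scikit-learn, whose NuSVC and SVC classes implement the two formulations; the data set and parameter values below are illustrative, and the C values are simply swept rather than derived from nu.

        # Empirical illustration of the nu-SVM / C-SVM relation using scikit-learn.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.svm import SVC, NuSVC

        X, y = make_classification(n_samples=300, n_features=5, random_state=0)

        nu_clf = NuSVC(nu=0.3, kernel="rbf", gamma="scale").fit(X, y)
        frac_sv = nu_clf.n_support_.sum() / len(X)
        # nu lower-bounds the fraction of support vectors (and upper-bounds margin errors).
        print(f"nu = 0.3, fraction of support vectors = {frac_sv:.2f}")

        # A C-SVM with a suitably chosen C yields an equivalent decision function;
        # here C is swept to show qualitatively similar solutions.
        for C in (0.1, 1.0, 10.0):
            c_clf = SVC(C=C, kernel="rbf", gamma="scale").fit(X, y)
            agree = np.mean(c_clf.predict(X) == nu_clf.predict(X))
            print(f"C = {C:>4}: agreement with nu-SVM on training data = {agree:.2f}")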

  • Multispike interactions in a stochastic model of spike-timing-dependent plasticity.

    abstract::Recently we presented a stochastic, ensemble-based model of spike-timing-dependent plasticity. In this model, single synapses do not exhibit plasticity depending on the exact timing of pre- and postsynaptic spikes, but spike-timing-dependent plasticity emerges only at the temporal or synaptic ensemble level. We showed...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco.2007.19.5.1362

    authors: Appleby PA,Elliott T

    update_date: 2007-05-01

  • Variations on the Theme of Synaptic Filtering: A Comparison of Integrate-and-Express Models of Synaptic Plasticity for Memory Lifetimes.

    abstract::Integrate-and-express models of synaptic plasticity propose that synapses integrate plasticity induction signals before expressing synaptic plasticity. By discerning trends in their induction signals, synapses can control destabilizing fluctuations in synaptic strength. In a feedforward perceptron framework with binar...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00889

    authors: Elliott T

    update_date: 2016-11-01

  • A sensorimotor map: modulating lateral interactions for anticipation and planning.

    abstract::Experimental studies of reasoning and planned behavior have provided evidence that nervous systems use internal models to perform predictive motor control, imagery, inference, and planning. Classical (model-free) reinforcement learning approaches omit such a model; standard sensorimotor models account for forward and ...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976606776240995

    authors: Toussaint M

    update_date: 2006-05-01

  • Analyzing and Accelerating the Bottlenecks of Training Deep SNNs With Backpropagation.

    abstract::Spiking neural networks (SNNs) with the event-driven manner of transmitting spikes consume ultra-low power on neuromorphic chips. However, training deep SNNs is still challenging compared to convolutional neural networks (CNNs). The SNN training algorithms have not achieved the same performance as CNNs. In this letter...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01319

    authors: Chen R,Li L

    update_date: 2020-12-01

  • Computing with self-excitatory cliques: A model and an application to hyperacuity-scale computation in visual cortex.

    abstract::We present a model of visual computation based on tightly inter-connected cliques of pyramidal cells. It leads to a formal theory of cell assemblies, a specific relationship between correlated firing patterns and abstract functionality, and a direct calculation relating estimates of cortical cell counts to orientation...

    journal_title:Neural computation

    pub_type: Journal Article, Review

    doi:10.1162/089976699300016782

    authors: Miller DA,Zucker SW

    update_date: 1999-01-01

  • The time-organized map algorithm: extending the self-organizing map to spatiotemporal signals.

    abstract::The new time-organized map (TOM) is presented for a better understanding of the self-organization and geometric structure of cortical signal representations. The algorithm extends the common self-organizing map (SOM) from the processing of purely spatial signals to the processing of spatiotemporal signals. The main ad...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976603765202695

    authors: Wiemer JC

    update_date: 2003-05-01

  • Normalization enables robust validation of disparity estimates from neural populations.

    abstract::Binocular fusion takes place over a limited region smaller than one degree of visual angle (Panum's fusional area), which is on the order of the range of preferred disparities measured in populations of disparity-tuned neurons in the visual cortex. However, the actual range of binocular disparities encountered in natu...

    journal_title:Neural computation

    pub_type: Letter

    doi:10.1162/neco.2008.05-07-532

    authors: Tsang EK,Shi BE

    update_date: 2008-10-01

  • Asynchronous Event-Based Motion Processing: From Visual Events to Probabilistic Sensory Representation.

    abstract::In this work, we propose a two-layered descriptive model for motion processing from retina to the cortex, with an event-based input from the asynchronous time-based image sensor (ATIS) camera. Spatial and spatiotemporal filtering of visual scenes by motion energy detectors has been implemented in two steps in a simple...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/neco_a_01191

    authors: Khoei MA,Ieng SH,Benosman R

    update_date: 2019-06-01

  • Random embedding machines for pattern recognition.

    abstract::Real classification problems involve structured data that can be essentially grouped into a relatively small number of clusters. It is shown that, under a local clustering condition, a set of points of a given class, embedded in binary space by a set of randomly parameterized surfaces, is linearly separable from other...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/089976601753196012

    authors: Baram Y

    update_date: 2001-11-01

  • Parameter learning for alpha integration.

    abstract::In pattern recognition, data integration is an important issue, and when properly done, it can lead to improved performance. Also, data integration can be used to help model and understand multimodal processing in the brain. Amari proposed α-integration as a principled way of blending multiple positive measures (e.g.,...

    journal_title:Neural computation

    pub_type: Journal Article

    doi:10.1162/NECO_a_00445

    authors: Choi H,Choi S,Choe Y

    update_date: 2013-06-01
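
    For reference, here is a minimal sketch of the alpha-integration (weighted alpha-mean) of positive measures that the entry above builds on, assuming positive inputs and weights summing to one; the paper's contribution, learning alpha and the weights from data, is not shown. Alpha interpolates between familiar means: alpha = -1 gives the arithmetic mean, alpha = 1 the geometric mean, and alpha = 3 the harmonic mean.

        # Sketch of the weighted alpha-mean (alpha-integration) of positive values.
        # Example values and weights are illustrative.
        import numpy as np

        def alpha_mean(p, w, alpha):
            """Weighted alpha-integration of positive values p with weights w (summing to 1)."""
            p, w = np.asarray(p, float), np.asarray(w, float)
            if np.isclose(alpha, 1.0):
                return np.exp(np.sum(w * np.log(p)))          # geometric mean
            f = p ** ((1.0 - alpha) / 2.0)                    # alpha-representation
            return np.sum(w * f) ** (2.0 / (1.0 - alpha))     # inverse transform

        p, w = [0.2, 0.5, 0.8], [0.3, 0.3, 0.4]
        for a in (-1.0, 1.0, 3.0):
            print(f"alpha = {a:+.0f}: {alpha_mean(p, w, a):.4f}")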