An Information Theoretic Approach to Reveal the Formation of Shared Representations.

Front Comput Neurosci

Basic Research Group, Araya Inc., Tokyo, Japan.

Published: January 2020

Modality-invariant categorical representations, i.e., shared representations, are thought to play a key role in learning to categorize multi-modal information. We investigated how a bimodal autoencoder can form a shared representation in an unsupervised manner from multi-modal data. We explored whether altering the depth of the network and mixing the multi-modal inputs at the input layer affect the development of shared representations. Based on their activation, we classified the units in the hidden layers into four types: visual cells, auditory cells, inconsistent visual-and-auditory cells, and consistent visual-and-auditory cells. Our results show that the number and quality of the last type (i.e., the shared representation) differ significantly depending on the depth of the network and are enhanced when the network receives mixed inputs rather than separate inputs for each modality, as in typical two-stage frameworks. In the present work, we describe a way to use information theory to understand the abstract representations formed in the hidden layers of the network. We believe that such an information-theoretic approach could provide insight into more efficient and cost-effective ways to train neural networks, using qualitative measures of the representations that cannot be captured by analyzing only the final outputs of the networks.
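As a rough illustration of the unit classification described in the abstract, the Python sketch below estimates the mutual information (MI) between each hidden unit's binned activation and the category labels presented in each modality, then labels the unit as visual, auditory, consistent (shared), or inconsistent. This is not the authors' published code; the binning scheme, the MI threshold, the preferred-category consistency rule, and the synthetic data are all illustrative assumptions.

```python
# Hypothetical sketch, not the authors' published code: classify hidden units of a
# bimodal autoencoder by the mutual information (MI) between each unit's activation
# and the category label presented in each modality. The binning, the MI threshold,
# and the "preferred category" consistency rule are illustrative assumptions.
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-sample category labels and hidden-unit activations.
n_samples, n_units, n_categories = 1000, 4, 5
visual_labels = rng.integers(0, n_categories, n_samples)
auditory_labels = rng.integers(0, n_categories, n_samples)
activations = rng.normal(size=(n_samples, n_units))
# Make the demo non-trivial: unit 0 tracks the visual label, unit 1 tracks both.
activations[:, 0] += visual_labels
activations[:, 1] += visual_labels + auditory_labels

def discretize(x, n_bins=8):
    """Quantile-bin continuous activations so MI can be estimated from counts."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(x, edges)

MI_THRESHOLD = 0.1  # assumed cut-off (in nats) for "informative about this modality"

for unit in range(n_units):
    binned = discretize(activations[:, unit])
    mi_vis = mutual_info_score(visual_labels, binned)
    mi_aud = mutual_info_score(auditory_labels, binned)

    if mi_vis > MI_THRESHOLD and mi_aud > MI_THRESHOLD:
        # Informative about both modalities: "consistent" (shared) if the unit
        # prefers the same category in both, otherwise "inconsistent".
        pref_v = max(range(n_categories),
                     key=lambda c: activations[visual_labels == c, unit].mean())
        pref_a = max(range(n_categories),
                     key=lambda c: activations[auditory_labels == c, unit].mean())
        kind = ("consistent visual-and-auditory (shared)"
                if pref_v == pref_a else "inconsistent visual-and-auditory")
    elif mi_vis > MI_THRESHOLD:
        kind = "visual"
    elif mi_aud > MI_THRESHOLD:
        kind = "auditory"
    else:
        kind = "uninformative"
    print(f"unit {unit}: MI_vis={mi_vis:.3f}, MI_aud={mi_aud:.3f} -> {kind}")
```

Quantile binning is only one way to make the MI estimate tractable here; any reasonable discretization, or a continuous MI estimator, would serve the same role.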


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7001587
DOI: http://dx.doi.org/10.3389/fncom.2020.00001
