Words as a window: Using word embeddings to explore the learned representations of Convolutional Neural Networks.

Neural Networks

University of Alberta, Department of Computing Science & Department of Psychology, 116 St. and 85 Ave., Edmonton, Alberta, Canada.

Published: May 2021

As deep neural network architectures minimize loss, they accumulate information in a hierarchy of learned representations that ultimately serve the network's final goal. Different architectures tackle this problem in slightly different ways, but all build intermediate representational spaces that inform their final prediction. Here we show that neural networks trained on two very different tasks develop knowledge representations that display similar underlying patterns. Specifically, we show that the representational spaces of several distributional semantic models bear a remarkable resemblance to those of several Convolutional Neural Network (CNN) architectures trained for image classification. We use this correspondence to explore the behavior of CNNs (1) in pretrained models, (2) during training, and (3) under adversarial attack. We draw on these findings to motivate several applications aimed at improving future research on CNNs. Our work illustrates the power of using one model to explore another, offers new insights into the function of CNN models, and provides a framework for others to perform similar analyses when developing new architectures. We show that one neural network model can provide a window into understanding another.
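To make the comparison concrete, below is a minimal sketch (not the authors' published code) of one standard way to compare two representational spaces: build a pairwise similarity matrix over the same set of concepts in each space, then correlate the two matrices, in the style of representational similarity analysis. All names, dimensions, and the random stand-in data are illustrative assumptions, not details taken from the paper.

import numpy as np
from scipy.stats import spearmanr

def similarity_matrix(X):
    # Pairwise cosine similarities between the rows of X
    # (one row per concept, e.g. per class label).
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    return X @ X.T

def space_alignment(A, B):
    # RSA-style score: Spearman correlation between the unique
    # off-diagonal entries of the two spaces' similarity matrices.
    sa, sb = similarity_matrix(A), similarity_matrix(B)
    iu = np.triu_indices_from(sa, k=1)  # indices of unique concept pairs
    rho, _ = spearmanr(sa[iu], sb[iu])
    return rho

# Hypothetical usage: ten class labels shared by both models, with
# random placeholders standing in for real vectors.
rng = np.random.default_rng(0)
word_vecs = rng.normal(size=(10, 300))   # e.g. 300-d word embeddings
cnn_vecs = rng.normal(size=(10, 2048))   # e.g. penultimate-layer CNN features
print(f"alignment (Spearman rho): {space_alignment(word_vecs, cnn_vecs):.3f}")

On random data this score hovers near zero; the resemblance reported in the abstract corresponds to such an alignment, computed on real embeddings and CNN features over shared labels, landing well above chance.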

DOI: http://dx.doi.org/10.1016/j.neunet.2020.12.009

