Understanding the role of individual units in a deep neural network.

Proc Natl Acad Sci U S A

Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, MA 02139.

Published: December 2020

Deep neural networks excel at finding hierarchical representations that solve complex tasks over large datasets. How can we humans understand these learned representations? In this work, we present network dissection, an analytic framework to systematically identify the semantics of individual hidden units within image classification and image generation networks. First, we analyze a convolutional neural network (CNN) trained on scene classification and discover units that match a diverse set of object concepts. We find evidence that the network has learned many object classes that play crucial roles in classifying scene classes. Second, we use a similar analytic method to analyze a generative adversarial network (GAN) model trained to generate scenes. By analyzing changes made when small sets of units are activated or deactivated, we find that objects can be added and removed from the output scenes while adapting to the context. Finally, we apply our analytic framework to understanding adversarial attacks and to semantic image editing.
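
To make the two analyses concrete, the sketches below illustrate their core operations. Both assume PyTorch; the function names, tensor shapes, and thresholding convention are illustrative assumptions for exposition, not the paper's released code. The first sketch scores one CNN unit against one concept by intersection-over-union (IoU) between the unit's thresholded activation maps and the concept's segmentation masks:

    import torch

    def unit_concept_iou(activations, concept_masks, threshold):
        """IoU between one unit's thresholded activation maps and a
        concept's segmentation masks over a dataset of N images.

        activations:   (N, H, W) float tensor, one unit's activation
                       maps upsampled to the segmentation resolution.
        concept_masks: (N, H, W) binary tensor marking where the
                       concept (e.g., "tree") appears in each image.
        threshold:     scalar cutoff; network dissection derives it
                       from a top quantile of the unit's activations.
        """
        unit_mask = activations > threshold          # binarize the unit
        masks = concept_masks.bool()
        intersection = (unit_mask & masks).sum().float()
        union = (unit_mask | masks).sum().float()
        return (intersection / union).item() if union > 0 else 0.0

The second sketch shows the GAN intervention in the same spirit: clamp a small set of channels in an intermediate feature map, then let the rest of the generator decode the edited features. Setting the channels to zero removes the corresponding object; setting them to a high value tends to insert it where the scene context permits. Again, gen_head, gen_tail, and the calling convention are assumptions:

    import torch

    @torch.no_grad()
    def clamp_units(gen_head, gen_tail, z, unit_ids, value=0.0):
        """Regenerate scenes with selected generator units clamped.

        gen_head: generator layers up to the dissected layer.
        gen_tail: the remaining generator layers.
        z:        (N, latent_dim) batch of latent codes.
        unit_ids: channel indices of the units to intervene on.
        value:    0.0 deactivates the units (object removal); a large
                  positive value activates them (object insertion).
        """
        features = gen_head(z)          # (N, C, H, W) feature maps
        features[:, unit_ids] = value   # overwrite the chosen channels
        return gen_tail(features)       # decode the edited features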

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7720226
DOI: http://dx.doi.org/10.1073/pnas.1907375117
