AI Article Synopsis

  • Power Normalizations (PN) are non-linear operators that help manage feature imbalances in classification tasks, and this study focuses on a new PN layer that pools feature maps from CNNs to improve performance.
  • The research examines two popular PN functions, MaxExp and Gamma, offering probabilistic interpretations and deriving surrogates with well-behaved derivatives for efficient end-to-end training.
  • Additionally, the work connects Spectral Power Normalizations (SPN) to the Heat Diffusion Process in graph analysis, leading to the development of a faster version of MaxExp tailored for covariance matrices, which is evaluated across various recognition and classification scenarios.

Article Abstract

Power Normalizations (PN) are useful non-linear operators which tackle feature imbalances in classification problems. We study PNs in the deep learning setup via a novel PN layer pooling feature maps. Our layer combines the feature vectors and their respective spatial locations in the feature maps produced by the last convolutional layer of a CNN into a positive definite matrix with second-order statistics to which PN operators are applied, forming so-called Second-order Pooling (SOP). As the main goal of this paper is to study Power Normalizations, we investigate the role and meaning of MaxExp and Gamma, two popular PN functions. To this end, we provide probabilistic interpretations of these element-wise operators and discover surrogates with well-behaved derivatives for end-to-end training. Furthermore, we look at the spectral applicability of MaxExp and Gamma by studying Spectral Power Normalizations (SPN). We show that SPN on the autocorrelation/covariance matrix and the Heat Diffusion Process (HDP) on a graph Laplacian matrix are closely related, thus sharing their properties. This finding leads to the culmination of our work: a fast spectral MaxExp, which is a variant of HDP for covariance/autocorrelation matrices. We evaluate our ideas on fine-grained recognition, scene recognition, and material classification, as well as on few-shot learning and graph classification.
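To make the abstract concrete, the two element-wise PN functions and the spectral variant can be sketched in NumPy. This is a minimal illustration of the general formulas commonly used in the PN literature (Gamma: x ↦ x^γ; MaxExp: x ↦ 1 − (1 − x)^η), not the authors' implementation; the parameter values, the spectrum normalization, and the pooling helper are placeholder assumptions.

```python
import numpy as np

def second_order_pool(Phi):
    # Hypothetical SOP helper: Phi is an (N, d) matrix of feature vectors
    # gathered from the N spatial locations of the last conv layer.
    # Returns the d x d autocorrelation (second-order) matrix.
    return Phi.T @ Phi / Phi.shape[0]

def gamma_pn(x, gamma=0.5):
    # Gamma power normalization: element-wise signed power.
    return np.sign(x) * np.abs(x) ** gamma

def maxexp_pn(x, eta=10.0):
    # MaxExp: for x in [0, 1] read as the probability of a feature
    # co-occurrence, this is the probability that at least one of
    # eta independent trials fires.
    return 1.0 - (1.0 - x) ** eta

def spectral_maxexp(M, eta=10.0):
    # Spectral PN: apply MaxExp to the eigenvalues of a symmetric
    # positive (semi)definite matrix M, e.g. one produced by SOP.
    w, V = np.linalg.eigh(M)
    w = np.clip(w, 0.0, None)
    w = w / max(w.max(), 1e-12)  # assumed normalization of the spectrum to [0, 1]
    return (V * maxexp_pn(w, eta)) @ V.T
```

Usage: `spectral_maxexp(second_order_pool(Phi))` yields a power-normalized second-order representation; the element-wise variants apply `gamma_pn` or `maxexp_pn` directly to the pooled matrix entries instead of its spectrum.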


Source
http://dx.doi.org/10.1109/TPAMI.2021.3107164

Publication Analysis

Top Keywords

power normalizations (16)
graph classification (8)
feature maps (8)
maxexp gamma (8)
power (4)
normalizations fine-grained (4)
fine-grained image (4)
image few-shot (4)
few-shot image (4)
image graph (4)

