On joint parameterizations of linear and nonlinear functionals in neural networks.

Neural Networks

Université Savoie Mont Blanc, Laboratory of Computer Science, Systems, Information and Knowledge Processing, BP 80439 - F-74944 Annecy-le-Vieux Cedex, France.

Published: March 2023

The paper proposes a new class of nonlinear operators and a dual learning paradigm in which optimization jointly concerns both the linear convolutional weights and the parameters of these nonlinear operators. The proposed nonlinear class, designed to yield a rich functional representation, is composed of functions called rectified parametric sigmoid units. This class is constructed to combine the advantages of both the sigmoid and the rectified linear unit (ReLU), while avoiding their respective drawbacks. Moreover, the analytic form of this neural class involves scale, shift, and shape parameters that produce a wide range of activation shapes, including the standard ReLU as a limit case. The parameters of this neural transfer class are treated as learnable so that complex activation shapes useful for solving machine learning problems can be discovered. The performance achieved by jointly learning the convolutional and rectified parametric sigmoid parameters is shown to be outstanding in both shallow and deep learning frameworks. This class opens new prospects for machine learning in that the main learnable parameters are attached not only to linear transformations but also to a wide range of nonlinear operators.
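The abstract does not give the exact analytic form of the rectified parametric sigmoid unit. A minimal sketch of the idea, assuming a swish-like form f(x) = alpha * x * sigmoid(beta * (x - gamma)) with hypothetical scale (alpha), shape (beta), and shift (gamma) parameters, illustrates the stated properties: all three parameters could be made learnable alongside the linear weights, and the function approaches the standard ReLU as the shape parameter grows:

```python
import math

def rpsu(x, alpha=1.0, beta=1.0, gamma=0.0):
    """Hypothetical rectified parametric sigmoid unit (not the paper's
    exact formula): f(x) = alpha * x * sigmoid(beta * (x - gamma)).

    alpha : scale parameter (output amplitude)
    beta  : shape parameter; as beta -> infinity (alpha=1, gamma=0)
            the function tends to the standard ReLU
    gamma : shift parameter (horizontal offset of the transition)
    """
    return alpha * x / (1.0 + math.exp(-beta * (x - gamma)))

# Small beta gives a smooth, sigmoid-weighted response;
# large beta approximates ReLU: rpsu(x) -> max(0, x).
smooth = rpsu(1.0, beta=1.0)    # below 1.0, since sigmoid(1) < 1
relu_like_pos = rpsu(2.0, beta=50.0)   # close to 2.0
relu_like_neg = rpsu(-2.0, beta=50.0)  # close to 0.0
```

In a joint learning setup such as the one the paper describes, alpha, beta, and gamma would be optimized by gradient descent together with the convolutional weights, rather than kept fixed as in a standard ReLU layer.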

DOI: http://dx.doi.org/10.1016/j.neunet.2022.12.019

Publication Analysis

Top Keywords (keyword: frequency)

nonlinear operators: 12
rectified parametric: 8
parametric sigmoid: 8
rectified linear: 8
linear unit: 8
wide range: 8
machine learning: 8
learnable parameters: 8
class: 6
linear: 5

