Deep neural networks can be trained in reciprocal space by acting on the eigenvalues and eigenvectors of suitable transfer operators defined in direct space. Adjusting the eigenvalues while freezing the eigenvectors yields a substantial compression of the parameter space, which then scales, by definition, with the number of computing neurons rather than with the number of inter-neuron weights.
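The idea can be made concrete with a short sketch. The snippet below is a minimal illustration under stated assumptions, not the authors' exact construction: it assumes a square layer whose direct-space weight matrix is parameterized as W = Φ diag(λ) Φ⁻¹, with a fixed random eigenvector basis Φ and trainable eigenvalues λ, so that only O(n) spectral parameters are learned instead of O(n²) weights. The class name SpectralLinear and the choice of a random Gaussian basis are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SpectralLinear(nn.Module):
    """Illustrative spectral layer: W = Phi @ diag(lam) @ Phi^{-1}.

    The eigenvector basis Phi is drawn once and frozen (stored as a
    buffer, so it receives no gradients); only the eigenvalues lam are
    trained. The trainable parameter count therefore scales with the
    number of neurons n, not with the n^2 entries of W.
    """

    def __init__(self, n: int):
        super().__init__()
        # Fixed random eigenvector basis; invertible with probability 1.
        phi = torch.randn(n, n)
        self.register_buffer("phi", phi)
        self.register_buffer("phi_inv", torch.linalg.inv(phi))
        # Trainable eigenvalues: O(n) parameters.
        self.lam = nn.Parameter(torch.randn(n))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Reassemble the direct-space weight matrix from its spectrum.
        w = self.phi @ torch.diag(self.lam) @ self.phi_inv
        return x @ w.T

# Usage: only layer.lam appears among the trainable parameters.
layer = SpectralLinear(64)
y = layer(torch.randn(8, 64))
```

In this parameterization, unfreezing the eigenvectors as well would recover a fully trainable dense layer, while the eigenvalues-only regime sketched here realizes the compression described above.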