In this paper, we derive diffusion equation models in the spectral domain to study the evolution of the training error of two-layer multiscale deep neural networks (MscaleDNN) (Cai and Xu, 2019; Liu et al., 2020), which are designed to reduce the spectral bias of fully connected deep neural networks when approximating oscillatory functions. The diffusion models are obtained from the spectral form of the MscaleDNN error equation, derived via a neural tangent kernel approach under gradient descent training with a sine activation function, in the limit of a vanishing learning rate and infinite network width and domain size.