Deep learning has been immensely successful at a variety of tasks, ranging from classification to artificial intelligence. Learning corresponds to fitting training data, which is implemented by descending a very high-dimensional loss function. Understanding under which conditions neural networks do not get stuck in poor minima of the loss, and how the landscape of that loss evolves as depth is increased, remains a challenge. Here we predict, and test empirically, an analogy between this landscape and the energy landscape of repulsive ellipses. We argue that in fully connected deep networks a phase transition delimits the over- and underparametrized regimes where fitting can or cannot be achieved. In the vicinity of this transition, properties of the curvature of the minima of the loss (the spectrum of the Hessian) are critical. This transition shares direct similarities with the jamming transition, by which particles form a disordered solid as the density is increased, and which also occurs in certain classes of computational optimization and learning problems such as the perceptron. Our analysis gives a simple explanation as to why poor minima of the loss cannot be encountered in the overparametrized regime. Interestingly, we observe that the ability of fully connected networks to fit random data is independent of their depth, an independence that appears to also hold for real data. We also study a quantity Δ which characterizes how well (Δ < 0) or badly (Δ > 0) a datum is learned. At the critical point it is power-law distributed over several decades, P_+(Δ) ∼ Δ^θ for Δ > 0 and P_−(Δ) ∼ (−Δ)^{−γ} for Δ < 0, with exponents that depend on the choice of activation function. This observation suggests that near the transition the loss landscape has a hierarchical structure and that the learning dynamics is prone to avalanches, with abrupt changes in the set of patterns that are learned.
DOI: http://dx.doi.org/10.1103/PhysRevE.100.012115
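To make the per-datum quantity Δ and the over-/underparametrized regimes concrete, here is a minimal sketch. It assumes a hinge-type margin Δ_μ = ε_m − y_μ f(x_μ) with threshold ε_m and a one-sided quadratic (hinge) loss; the network sizes, threshold, and training schedule are illustrative choices, not the paper's exact protocol. A datum is "learned" when its Δ_μ becomes negative, so counting the data with Δ_μ > 0 at the end of training indicates whether the run sits on the fitting or non-fitting side of the transition.

```python
# Minimal sketch (illustrative, not the paper's exact protocol):
# fit random +/-1 labels with a small fully connected network and count
# how many data remain unlearned (Delta > 0) at the end of training.
#
# Assumed definitions:
#   Delta_mu = eps_m - y_mu * f(x_mu)          (hinge-type per-datum margin)
#   loss     = mean( max(0, Delta_mu)^2 ) / 2  (one-sided quadratic hinge)
import torch
import torch.nn as nn

torch.manual_seed(0)

P, d, h = 512, 32, 64                              # data count, input dim, hidden width
X = torch.randn(P, d)                              # random inputs
y = 2.0 * torch.randint(0, 2, (P,)).float() - 1.0  # random +/-1 labels
eps_m = 1.0                                        # hinge threshold (assumed)

model = nn.Sequential(
    nn.Linear(d, h), nn.ReLU(),
    nn.Linear(h, h), nn.ReLU(),
    nn.Linear(h, 1),
)
opt = torch.optim.SGD(model.parameters(), lr=0.05)

for step in range(5000):
    delta = eps_m - y * model(X).squeeze(-1)        # per-datum Delta
    loss = 0.5 * torch.relu(delta).pow(2).mean()    # quadratic hinge loss
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    delta = eps_m - y * model(X).squeeze(-1)
    n_unsat = int((delta > 0).sum())                # unlearned data (Delta > 0)
print(f"unsatisfied data: {n_unsat}/{P}, final loss: {loss.item():.4g}")
# Far in the overparametrized regime n_unsat drops to zero; shrinking the
# width h (or growing P) moves the run toward the underparametrized side,
# where some Delta_mu stay positive and the loss cannot reach zero.
```

The one-sided quadratic form is what makes the analogy with repulsive particles explicit: each unsatisfied datum contributes like an overlapping soft particle, and the loss vanishes exactly when every "overlap" (Δ_μ > 0) has been removed.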