Neural architecture search (NAS) is a popular method for automatically designing deep neural network structures. However, designing a neural network with NAS is computationally expensive. This article proposes a gradient-guided evolutionary NAS (GENAS) to design convolutional neural networks (CNNs) for image classification. GENAS is a hybrid algorithm that combines evolutionary global and local search operators to evolve a population of subnets sampled from a supernet. Each candidate architecture is encoded as a table describing which operations are associated with the edges between nodes, where the nodes represent feature maps. The evolutionary optimization uses novel crossover and mutation operators to manipulate the subnets through this tabular encoding. Every n generations, the candidate architectures undergo a local search inspired by differentiable NAS. GENAS is designed to overcome the limitations of both evolutionary and gradient-descent NAS. This algorithmic structure enables the performance of candidate architectures to be assessed without retraining, thus limiting the NAS search time. Furthermore, subnet individuals are decoupled during evaluation to prevent strong coupling of operations in the supernet. The experimental results indicate that the searched structures achieve test errors of 2.45%, 16.86%, and 23.9% on the CIFAR-10, CIFAR-100, and ImageNet datasets, respectively, and the search costs only 0.26 GPU days on a single graphics card. GENAS effectively expedites the training and evaluation processes and yields high-performance network structures.
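The search procedure outlined in the abstract lends itself to a compact illustration. The following is a minimal sketch, not the authors' implementation: it shows a tabular edge-to-operation encoding, edge-wise crossover and mutation, and a periodic local-refinement step standing in for the gradient-guided search. The operation list, population size, fitness function, and the hill-climbing surrogate in `local_search` are illustrative placeholders; in GENAS, evaluation would come from shared supernet weights and the refinement from differentiable-NAS-style gradients.

```python
# Sketch of a GENAS-style hybrid loop (assumptions: placeholder fitness and
# local search; constants are illustrative, not taken from the paper).
import random

OPS = ["sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3", "skip_connect", "none"]
NUM_NODES = 4            # intermediate nodes (feature maps) in a cell
POP_SIZE = 10
LOCAL_SEARCH_EVERY = 5   # "every n generations"

def random_subnet():
    """Tabular encoding: {(src_node, dst_node): operation} for every forward edge."""
    return {(i, j): random.choice(OPS)
            for j in range(1, NUM_NODES) for i in range(j)}

def fitness(subnet):
    """Placeholder for supernet-based validation accuracy (no retraining)."""
    return sum(op != "none" for op in subnet.values()) + random.random()

def crossover(parent_a, parent_b):
    """Edge-wise recombination: each edge inherits its operation from either parent."""
    return {edge: random.choice([parent_a[edge], parent_b[edge]])
            for edge in parent_a}

def mutate(subnet, rate=0.2):
    """Re-sample the operation on a randomly chosen subset of edges."""
    return {edge: (random.choice(OPS) if random.random() < rate else op)
            for edge, op in subnet.items()}

def local_search(subnet):
    """Hill-climbing surrogate for the gradient-guided refinement: try every
    operation on one edge and keep the best-scoring choice."""
    edge = random.choice(list(subnet))
    best_op = max(OPS, key=lambda op: fitness({**subnet, edge: op}))
    return {**subnet, edge: best_op}

population = [random_subnet() for _ in range(POP_SIZE)]
for gen in range(20):
    parents = sorted(population, key=fitness, reverse=True)[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children
    if (gen + 1) % LOCAL_SEARCH_EVERY == 0:        # periodic local refinement
        population = [local_search(s) for s in population]

print("best subnet:", max(population, key=fitness))
```

The point of the sketch is the alternation: cheap evolutionary exploration of the tabular encoding most of the time, with a targeted refinement pass every few generations, which mirrors the global/local split the abstract describes.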


Source
DOI: http://dx.doi.org/10.1109/TNNLS.2024.3371432

Publication Analysis

Top Keywords

Keyword                        Count
gradient-guided evolutionary   8
neural architecture            8
architecture search            8
neural network                 8
network structures             8
nas genas                      8
local search                   8
candidate architecture         8
nas                            6
neural                         5

Similar Publications


It has been widely recognized that the efficient training of neural networks (NNs) is crucial to classification performance. Although a range of gradient-based approaches has been extensively developed, they are criticized for easily becoming trapped in local optima and for their sensitivity to hyperparameters. Owing to their high robustness and wide applicability, evolutionary algorithms (EAs) have been regarded as a promising alternative for training NNs in recent years.

