AMPLIFY: attention-based mixup for performance improvement and label smoothing in transformer.

PeerJ Comput Sci

School of Information Science and Technology, Yunnan Normal University, Kunming, Yunnan, China.

Published: April 2024

Mixup is an effective data augmentation method that generates new augmented samples as linear combinations of different original samples. However, if the original samples contain noise or aberrant features, mixup may propagate them into the augmented samples, making the model over-sensitive to these outliers. To address this problem, this paper proposes a new mixup method called AMPLIFY. AMPLIFY uses the Transformer's own attention mechanism to reduce the influence of noise and aberrant values in the original samples on the prediction results, without adding trainable parameters and at very low computational cost, thereby avoiding the high resource consumption of common mixup methods such as Sentence Mixup. Experimental results show that, at a lower computational cost, AMPLIFY outperforms other mixup methods on text classification tasks across seven benchmark datasets, offering new ways to further improve the performance of attention-based pre-trained models such as BERT, ALBERT, RoBERTa, and GPT. Our code can be obtained at https://github.com/kiwi-lilo/AMPLIFY.
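The mixup baseline the abstract refers to forms a convex combination of two training samples and their labels, with the mixing coefficient drawn from a Beta distribution. A minimal sketch in NumPy, assuming one-hot labels (this illustrates generic mixup only, not the paper's attention-based AMPLIFY variant; the function name and `alpha` default are illustrative):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Classic mixup: convex combination of two samples and their one-hot labels.

    lam ~ Beta(alpha, alpha) controls how strongly the two samples are blended.
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    x = lam * x1 + (1 - lam) * x2
    y = lam * y1 + (1 - lam) * y2
    return x, y, lam

# Example: mix two feature vectors belonging to different classes.
x1, y1 = np.array([1.0, 0.0]), np.array([1.0, 0.0])
x2, y2 = np.array([0.0, 1.0]), np.array([0.0, 1.0])
x, y, lam = mixup(x1, y1, x2, y2)
```

Note that any noise present in `x1` or `x2` is carried into `x` with weight `lam` or `1 - lam`; this is the propagation problem AMPLIFY targets by weighting with attention instead.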


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11157605
DOI: http://dx.doi.org/10.7717/peerj-cs.2011

Publication Analysis

Top Keywords
original samples (12)
augmented samples (8)
noises aberrant (8)
attention mechanism (8)
mixup methods (8)
mixup (7)
samples (5)
amplify attention-based (4)
attention-based mixup (4)
mixup performance (4)

