AI Article Synopsis

  • Training a generative model with limited data, like just 10 samples, poses significant challenges, often leading to overfitting and a loss of content while adapting style.
  • Recent methods attempt to maintain a content-style correspondence, but they still struggle with diversity and effective style adaptation.
  • The proposed method introduces a paired image reconstruction approach with a translation module that helps separate style and content, demonstrating superior performance compared to existing methods in few-shot scenarios.

Article Abstract

Training a generative model with limited data (e.g., only 10 samples) is a very challenging task. Many works propose to fine-tune a pretrained GAN model; however, this can easily result in overfitting. In other words, these methods manage to adapt the style but fail to preserve the content, where style denotes the specific properties that define a domain, while content denotes the domain-irrelevant information that represents diversity. Recent works try to maintain a predefined correspondence to preserve the content; however, the resulting diversity is still insufficient, and the constraint may hinder style adaptation. In this work, we propose a paired image reconstruction approach for content preservation. We introduce an image translation module into GAN transfer, where the module teaches the generator to separate style and content, and the generator provides training data to the translation module in return. Qualitative and quantitative experiments show that our method consistently surpasses state-of-the-art methods in the few-shot setting.
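To make the abstract's mutual-training idea concrete, below is a minimal, self-contained PyTorch sketch of one plausible reading: a frozen source generator and an adapted copy produce paired images from the same latent code, and a translation module is trained to reconstruct the source image from the stylized one, with the same reconstruction loss steering the adapted generator away from destroying content. Every class, architecture, and loss weight here is a hypothetical stand-in, not the authors' actual implementation.

```python
# Hedged sketch of paired image reconstruction for few-shot GAN adaptation.
# All modules are toy stand-ins; the paper's generator would be a large
# pretrained GAN and the translation module a dedicated image-to-image net.
import copy
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    # Hypothetical stand-in for a pretrained GAN generator.
    def __init__(self, z_dim=64, img_ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(z_dim, 8 * 8 * 16), nn.Unflatten(1, (16, 8, 8)),
            nn.ConvTranspose2d(16, 8, 4, 2, 1), nn.ReLU(),
            nn.ConvTranspose2d(8, img_ch, 4, 2, 1), nn.Tanh(),
        )
    def forward(self, z):
        return self.net(z)

class TinyTranslator(nn.Module):
    # Hypothetical translation module: maps stylized images back toward
    # their source-domain counterparts so content preservation can be checked.
    def __init__(self, img_ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(img_ch, 16, 3, 1, 1), nn.ReLU(),
            nn.Conv2d(16, img_ch, 3, 1, 1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

z_dim = 64
G_src = TinyGenerator(z_dim)       # frozen pretrained generator (content reference)
G_tgt = copy.deepcopy(G_src)       # copy being adapted to the ~10 target images
T = TinyTranslator()
for p in G_src.parameters():
    p.requires_grad_(False)

opt = torch.optim.Adam(list(G_tgt.parameters()) + list(T.parameters()), lr=1e-4)
recon = nn.L1Loss()
lam = 1.0  # assumed weight for the content-preservation term

def adv_loss(fake):
    # Placeholder for the few-shot adversarial loss against the handful of
    # real target-domain images; omitted to keep the sketch short.
    return fake.mean() * 0.0

for step in range(3):  # toy loop; a real run would train far longer
    z = torch.randn(4, z_dim)
    with torch.no_grad():
        x_src = G_src(z)           # content image in the source style
    x_tgt = G_tgt(z)               # same content, adapted style
    # Paired reconstruction: T must recover the source image from the
    # stylized one, and the same loss keeps G_tgt from destroying content.
    loss = adv_loss(x_tgt) + lam * recon(T(x_tgt), x_src)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Note that the pairing comes for free: because both generators share the latent code, each stylized image has a source-style counterpart with identical content, which is what lets a simple reconstruction loss stand in for an explicit content constraint.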


Source
http://dx.doi.org/10.1109/TNNLS.2024.3477467

Publication Analysis

Top Keywords

style adaptation (8)
content preservation (8)
preserve content (8)
translation module (8)
content (6)
style (5)
few-shot image (4)
image generation (4)
generation style (4)
adaptation content (4)
