Photorealistic style transfer is a challenging task that demands the stylized image remain photorealistic. Existing methods still suffer from unrealistic artifacts and heavy computational cost. In this paper, we propose a novel Style-Corpus Constrained Learning (SCCL) scheme to address these issues. A style corpus, capturing both style-specific and style-agnostic characteristics, is proposed to constrain the stylized image with style consistency among different samples, which improves the photorealism of the stylization output. By using an adversarial distillation learning strategy, a simple, fast-to-execute network is trained to substitute for previous complex feature-transform models, which reduces the computational cost significantly. Experiments demonstrate that our method produces richly detailed photorealistic images and runs 13 to 50 times faster than the state-of-the-art method (WCT).
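The adversarial distillation strategy mentioned in the abstract can be sketched in code. The snippet below is an illustrative outline only, not the authors' SCCL implementation: the network names (StudentStylizer, PatchDiscriminator), architectures, and loss weights are assumptions, and the teacher output is stubbed with a dummy tensor standing in for a slow feature-transform stylizer such as WCT.

```python
# Illustrative sketch of adversarial distillation (NOT the paper's SCCL code):
# a lightweight student network is trained to mimic a slow teacher stylizer,
# while a discriminator pushes its outputs toward photorealism.
import torch
import torch.nn as nn
import torch.nn.functional as F

class StudentStylizer(nn.Module):
    """Small fast-to-execute stylization network (assumed architecture)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 32, 3, padding=1), nn.InstanceNorm2d(32), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.InstanceNorm2d(32), nn.ReLU(inplace=True),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )
    def forward(self, content, style):
        # Concatenate content and style images along the channel axis.
        return self.net(torch.cat([content, style], dim=1))

class PatchDiscriminator(nn.Module):
    """Discriminator scoring per-patch realism of stylized outputs (assumed)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 1, 3, padding=1),
        )
    def forward(self, x):
        return self.net(x)  # real/fake logits per patch

student, disc = StudentStylizer(), PatchDiscriminator()
opt_s = torch.optim.Adam(student.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

content = torch.rand(1, 3, 128, 128)      # toy content photo
style = torch.rand(1, 3, 128, 128)        # toy style photo
teacher_out = torch.rand(1, 3, 128, 128)  # placeholder for the slow teacher's result

# Discriminator step: real photos vs. detached student outputs.
fake = student(content, style).detach()
d_real, d_fake = disc(content), disc(fake)
loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# Student step: distillation toward the teacher plus adversarial realism.
out = student(content, style)
logits = disc(out)
loss_distill = F.mse_loss(out, teacher_out)           # mimic the teacher
loss_adv = bce(logits, torch.ones_like(logits))       # fool the discriminator
loss = loss_distill + 0.1 * loss_adv                  # weighting is illustrative
opt_s.zero_grad(); loss.backward(); opt_s.step()
```

Once trained this way, only the small student network is needed at inference time, which is where the reported speedup over complex feature-transform models comes from.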
DOI: http://dx.doi.org/10.1109/TIP.2021.3058566