Few-Shot Class-Incremental Learning (FSCIL) techniques are essential for developing Deep Learning (DL) models that can continuously learn new classes with limited samples while retaining existing knowledge. This capability is particularly crucial for DL-based retinal disease diagnosis systems, where acquiring large annotated datasets is challenging and disease phenotypes evolve over time. This paper introduces Re-FSCIL, a novel framework for Few-Shot Class-Incremental Retinal Disease Recognition (FSCIRDR). Re-FSCIL integrates the RETFound model with a fine-grained module, employing a forward-compatible training strategy to improve adaptability, supervised contrastive learning to enhance feature discrimination, and feature fusion to improve representation quality. We convert existing datasets into the FSCIL format and reproduce numerous representative FSCIL methods to create two new benchmarks, RFMiD38 and JSIEC39, specifically for FSCIRDR. Our experimental results demonstrate that Re-FSCIL achieves state-of-the-art (SOTA) performance, significantly surpassing existing FSCIL methods on these benchmarks.
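The supervised contrastive learning the abstract mentions can be illustrated with a generic SupCon loss: each anchor embedding is pulled toward same-class embeddings and pushed from all others. The sketch below is a minimal NumPy implementation of that standard loss, not Re-FSCIL's actual code; the function name, batch layout, and default temperature are assumptions for illustration.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.07):
    """Supervised contrastive (SupCon) loss for a batch of embeddings.

    features: (N, D) embeddings; labels: (N,) integer class labels.
    """
    z = features / np.linalg.norm(features, axis=1, keepdims=True)  # L2-normalize
    sim = z @ z.T / temperature
    np.fill_diagonal(sim, -np.inf)                   # never contrast an anchor with itself
    # log-softmax over each row, computed stably
    row_max = sim.max(axis=1, keepdims=True)
    log_prob = sim - (row_max + np.log(np.exp(sim - row_max).sum(axis=1, keepdims=True)))
    positive = labels[:, None] == labels[None, :]
    np.fill_diagonal(positive, False)                # positives exclude the anchor
    pos_counts = positive.sum(axis=1)
    valid = pos_counts > 0                           # skip anchors with no positives
    mean_pos = np.where(positive, log_prob, 0.0).sum(axis=1)[valid] / pos_counts[valid]
    return float(-mean_pos.mean())
```

With well-separated classes (same-class embeddings aligned, different-class embeddings orthogonal) the loss approaches zero, which is the discriminative-feature effect the paper relies on.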
DOI: http://dx.doi.org/10.1109/JBHI.2024.3457915
IEEE Trans Pattern Anal Mach Intell, November 2024
Depicting novel classes with language descriptions after observing only a few samples is inherent in human learning. This lifelong learning capability helps distinguish new knowledge from old as open-world learning expands, a setting known as Few-Shot Class-Incremental Learning (FSCIL). Existing works addressing this problem rely mainly on careful tuning of visual encoders, which exhibits an evident trade-off between base knowledge and incremental knowledge.
The proliferation of Few-Shot Class-Incremental Learning (FSCIL) methodologies has highlighted the critical challenge of maintaining robust anti-amnesia capabilities in FSCIL learners. In this paper, we present a novel conceptualization of anti-amnesia in terms of mathematical generalization, leveraging the Neural Tangent Kernel (NTK) perspective. Our method focuses on two key aspects: ensuring optimal NTK convergence and minimizing NTK-related generalization loss, which together serve as the theoretical foundation for cross-task generalization.
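The NTK view in this abstract rests on the Gram matrix of per-sample parameter gradients, K[i, k] = ⟨∇θ f(xi; θ), ∇θ f(xk; θ)⟩. The toy sketch below computes an empirical NTK for a tiny MLP via finite-difference gradients; the network architecture, sizes, and step size are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(flat, shapes, x):
    # Rebuild parameters from a flat vector, then run a tiny scalar-output MLP.
    params, i = [], 0
    for s in shapes:
        n = int(np.prod(s))
        params.append(flat[i:i + n].reshape(s))
        i += n
    W1, b1, W2, b2 = params
    return (np.tanh(x @ W1 + b1) @ W2 + b2).item()

def empirical_ntk(flat, shapes, X, eps=1e-5):
    # K[i, k] = <grad_theta f(x_i), grad_theta f(x_k)>; gradients by central differences.
    J = np.zeros((len(X), len(flat)))
    for j in range(len(flat)):
        e = np.zeros_like(flat)
        e[j] = eps
        for i, x in enumerate(X):
            J[i, j] = (forward(flat + e, shapes, x) - forward(flat - e, shapes, x)) / (2 * eps)
    return J @ J.T

params = [rng.normal(size=(3, 8)) / np.sqrt(3), np.zeros(8),
          rng.normal(size=(8, 1)) / np.sqrt(8), np.zeros(1)]
shapes = [p.shape for p in params]
flat = np.concatenate([p.ravel() for p in params])
X = rng.normal(size=(4, 3))
K = empirical_ntk(flat, shapes, X)   # 4 x 4 symmetric PSD Gram matrix
```

Because K is a Gram matrix it is symmetric positive semidefinite by construction; properties of this kernel (its convergence and conditioning) are what the paper's generalization argument builds on.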
Sci Rep, October 2024
College of Computer and Information Sciences, Prince Sultan University, Riyadh, Saudi Arabia.
IEEE J Biomed Health Inform, September 2024
IEEE Trans Neural Netw Learn Syst, September 2024
Few-shot class-incremental learning (FSCIL) aims to learn new classes incrementally with a limited number of samples per class. Foundation models combined with prompt tuning showcase robust generalization and zero-shot learning (ZSL) capabilities, giving them potential advantages in transfer capability for FSCIL. However, existing prompt tuning methods excel at optimizing for stationary datasets, diverging from the inherently sequential nature of the FSCIL paradigm.
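Prompt tuning as described here typically prepends a few learnable tokens to a frozen backbone's input sequence and updates only those tokens per task. The sketch below is a deliberately tiny stand-in: a linear "frozen encoder", zero-initialized prompts, and a finite-difference gradient step, none of which reflect any specific paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_prompt, n_patch = 8, 2, 4

W_frozen = rng.normal(size=(d, 1))           # stand-in for the frozen encoder head
patches = rng.normal(size=(n_patch, d))      # token embeddings from the frozen backbone
prompts = np.zeros((n_prompt, d))            # the only learnable parameters

def score(p):
    tokens = np.concatenate([p, patches])    # prepend prompt tokens to the sequence
    return (tokens.mean(axis=0) @ W_frozen).item()

target, lr, eps = 1.0, 0.1, 1e-6
loss_before = (score(prompts) - target) ** 2

# One gradient step on the prompts alone (finite-difference gradient for brevity);
# the backbone weights W_frozen and patch embeddings are never touched.
grad = np.zeros_like(prompts)
for idx in np.ndindex(*prompts.shape):
    p = prompts.copy()
    p[idx] += eps
    grad[idx] = ((score(p) - target) ** 2 - loss_before) / eps
prompts -= lr * grad
loss_after = (score(prompts) - target) ** 2
```

The key design point this illustrates is the parameter split: only the prompt tokens receive updates, so the backbone's pretrained knowledge is preserved across incremental sessions.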