In the few-shot class incremental learning (FSCIL) setting, new classes with few training examples become available incrementally, and deep learning models suffer from catastrophic forgetting of the previous classes when trained on the new ones. Data augmentation techniques are generally used to increase the amount of training data and improve model performance. In this work, we demonstrate that differently augmented views of the same image may not necessarily activate the same set of neurons in the model.
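As a rough illustration of this observation (not the authors' experimental setup), the following Python/PyTorch sketch applies two random augmentations to the same image and compares which channels of an intermediate layer are activated; the choice of resnet18, the layer4 hook, the activation threshold, and the overlap measure are all illustrative assumptions.

import torch
import torchvision.transforms as T
from torchvision.models import resnet18

model = resnet18(weights="IMAGENET1K_V1").eval()

activations = {}
def hook(_module, _inputs, output):
    activations["feat"] = output.detach()

# Hook an intermediate layer; the layer choice here is arbitrary.
model.layer4.register_forward_hook(hook)

augment = T.Compose([
    T.RandomResizedCrop(224, scale=(0.5, 1.0)),
    T.RandomHorizontalFlip(p=0.5),
    T.ColorJitter(0.4, 0.4, 0.4),
    T.ToTensor(),
])

def active_channels(img, threshold=0.0):
    # Indices of channels whose mean (post-ReLU) activation exceeds the threshold.
    with torch.no_grad():
        model(augment(img).unsqueeze(0))
    channel_means = activations["feat"].mean(dim=(0, 2, 3))
    return set(torch.nonzero(channel_means > threshold).flatten().tolist())

# img = PIL.Image.open("example.jpg")          # any RGB image
# view_a, view_b = active_channels(img), active_channels(img)   # two random views
# overlap = len(view_a & view_b) / max(len(view_a | view_b), 1)
# print(f"Active-channel overlap across two augmented views: {overlap:.2f}")

A low overlap between the two views would indicate that the augmentations are exercising different parts of the network, which is the kind of effect the abstract describes.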
A class-incremental learning problem is characterized by training data becoming available phase by phase. Deep learning models suffer from catastrophic forgetting of the classes from older phases as they are trained on the classes introduced in the new phase. In this work, we show that changing the orientation of an image has a considerable effect on the model's prediction accuracy, and that different orientations of the same image are forgotten at different rates, which is a novel finding.
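A minimal sketch of the underlying measurement, using an illustrative pretrained model and preprocessing rather than the paper's protocol: rotate the same image to a few orientations and compare the classifier's predictions; tracked after each incremental phase, the same comparison would expose orientation-dependent forgetting.

import torch
import torchvision.transforms as T
import torchvision.transforms.functional as TF
from torchvision.models import resnet18

model = resnet18(weights="IMAGENET1K_V1").eval()
preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])

def predict(img):
    # Top-1 prediction for a single PIL image.
    with torch.no_grad():
        logits = model(preprocess(img).unsqueeze(0))
    return logits.argmax(dim=1).item()

# img = PIL.Image.open("example.jpg")
# for angle in (0, 90, 180, 270):               # example orientations
#     print(angle, predict(TF.rotate(img, angle)))
# Disagreement across angles indicates orientation sensitivity; recording
# per-orientation accuracy after each training phase would reveal whether
# some orientations are forgotten faster than others.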
IEEE Trans Pattern Anal Mach Intell, March 2024
In the task incremental learning problem, deep learning models suffer from catastrophic forgetting of previously seen classes/tasks as they are trained on new classes/tasks. This problem becomes even harder when some of the test classes do not belong to the training class set, i.e.