Funding: Supported by the Strategic Priority Research Program of the Chinese Academy of Sciences under Grant No. XDA27000000.
Abstract: Deep Convolutional Neural Networks (DCNNs) can capture discriminative features from large datasets. However, how to incrementally learn new samples without forgetting old ones, and how to recognize novel classes that arise in a dynamically changing world, e.g., classifying newly discovered fish species, remains an open problem. We address an even more challenging and realistic setting of this problem in which new-class samples are scarce, i.e., Few-Shot Class-Incremental Learning (FSCIL). Current FSCIL methods augment the training data to alleviate overfitting on novel classes. By contrast, we propose Filter Bank Networks (FBNs), which augment the learnable filters to capture fine-detailed features for adapting to future new classes. In the forward pass, FBNs expand each convolutional filter into a virtual filter bank containing the canonical filter, i.e., itself, and multiple transformed versions. During back-propagation, FBNs explicitly stimulate fine-detailed features to emerge and collectively align all gradients of each filter bank to learn the canonical filter. FBNs capture pattern variants that do not yet exist in the pretraining session, making it easy to incorporate new classes in the incremental learning phase. Moreover, FBNs introduce model-level prior knowledge that makes efficient use of the limited few-shot data. Extensive experiments on the MNIST, CIFAR100, CUB200, and Mini-ImageNet datasets show that FBNs consistently outperform the baseline by a significant margin, setting new state-of-the-art FSCIL results. In addition, we contribute a challenging FSCIL benchmark, Fishshot1K, which contains 8261 underwater images covering 1000 ocean fish species. The code is included in the supplementary materials.
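To make the filter-bank mechanism concrete, below is a minimal PyTorch sketch of the idea described in the abstract. It is an illustration under stated assumptions, not the authors' implementation: the transforms are assumed to be 90-degree rotations, and the bank responses are aggregated with a channel-wise max, whereas the paper's actual transforms and gradient-alignment scheme may differ. The class name FilterBankConv2d is hypothetical.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FilterBankConv2d(nn.Module):
        # Minimal sketch: each learnable (canonical) filter is expanded at
        # forward time into a virtual bank of itself plus three 90-degree
        # rotated copies (an assumed choice of transform). Because the
        # rotations are applied inside forward(), the gradients from all
        # bank members flow back to, and accumulate on, the single
        # canonical filter.
        def __init__(self, in_ch, out_ch, kernel_size=3, padding=1):
            super().__init__()
            # Only the canonical filters are learnable parameters.
            self.weight = nn.Parameter(
                0.02 * torch.randn(out_ch, in_ch, kernel_size, kernel_size))
            self.padding = padding

        def forward(self, x):
            # Virtual bank: canonical filters plus three rotated copies.
            bank = torch.cat(
                [torch.rot90(self.weight, k, dims=(2, 3)) for k in range(4)],
                dim=0)  # shape: (4 * out_ch, in_ch, k, k)
            out = F.conv2d(x, bank, padding=self.padding)
            # Aggregate the four responses per canonical filter with a max,
            # a simple stand-in for the paper's alignment/aggregation scheme.
            n, _, h, w = out.shape
            return out.view(n, 4, -1, h, w).max(dim=1).values

    # Usage: behaves like a standard conv layer with out_ch output channels.
    layer = FilterBankConv2d(in_ch=3, out_ch=16)
    y = layer(torch.randn(2, 3, 32, 32))
    print(y.shape)  # torch.Size([2, 16, 32, 32])

Since the bank is rebuilt from the canonical weights on every forward pass, each parameter receives the sum of gradients from all of its transformed copies, which is one simple way to realize the "collectively align all gradients of each filter bank" behavior the abstract describes.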