Journal Articles
3 articles found
1. Deep learning algorithm featuring continuous learning for modulation classifications in wireless networks
Authors: WU Nan, SUN Yu, WANG Xudong. Journal of Terahertz Science and Electronic Information Technology, 2024, No. 2, pp. 209-218 (10 pages).
Although modulation classification based on deep neural networks can achieve high Modulation Classification (MC) accuracy, catastrophic forgetting occurs when the neural network model continues to learn new tasks. In this paper, we simulate a dynamic wireless communication environment and focus on breaking the learning paradigm of isolated automatic MC, proposing an algorithm for continuous automatic MC. First, a memory storing representative old-task modulation signals is built and used to constrain the gradient update direction on new tasks during the continuous learning stage, ensuring that the loss on old tasks also trends downward. Second, to better simulate the dynamic wireless communication environment, we employ a mini-batch gradient algorithm, which is better suited to continuous learning. Finally, the signals in the memory can be replayed to further strengthen the old tasks' signal characteristics in the model. Simulation results verify the effectiveness of the method.
Keywords: Deep Learning (DL); modulation classification; continuous learning; catastrophic forgetting; cognitive radio communications
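The gradient-constraint step this abstract describes (replaying stored old-task signals and restricting the new-task update so the old-task loss keeps decreasing) matches the projection used in GEM/A-GEM-style methods. Below is a minimal PyTorch sketch of that idea, assuming a standard classifier and a sampled memory batch; the function name, shapes, and single-constraint projection are illustrative assumptions, not the authors' released code.

```python
# Hypothetical sketch of A-GEM-style gradient projection over a signal memory.
import torch
import torch.nn.functional as F

def constrained_update(model, optimizer, new_batch, memory_batch):
    """One step: project the new-task gradient so it does not increase
    the loss on replayed old-task signals (assumes all model parameters
    are trainable)."""
    x_new, y_new = new_batch
    x_old, y_old = memory_batch

    # Gradient of the loss on the old-task memory batch.
    optimizer.zero_grad()
    F.cross_entropy(model(x_old), y_old).backward()
    g_old = torch.cat([p.grad.flatten() for p in model.parameters()])

    # Gradient of the loss on the new-task batch.
    optimizer.zero_grad()
    F.cross_entropy(model(x_new), y_new).backward()
    g_new = torch.cat([p.grad.flatten() for p in model.parameters()])

    # If the gradients conflict, drop the component that would raise
    # the old-task loss, keeping that loss on a downward trend.
    dot = torch.dot(g_new, g_old)
    if dot < 0:
        g_new = g_new - (dot / torch.dot(g_old, g_old)) * g_old

    # Write the (possibly projected) gradient back and step.
    offset = 0
    for p in model.parameters():
        n = p.numel()
        p.grad.copy_(g_new[offset:offset + n].view_as(p))
        offset += n
    optimizer.step()
```

In a mini-batch training loop, such a function would replace the ordinary backward/step pair; the same memory batch can also be replayed directly, as the abstract's final step suggests.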
2. Squeezing More Past Knowledge for Online Class-Incremental Continual Learning (Cited by: 1)
Authors: Da Yu, Mingyi Zhang, Mantian Li, Fusheng Zha, Junge Zhang, Lining Sun, Kaiqi Huang. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2023, No. 3, pp. 722-736 (15 pages).
Continual Learning (CL) studies the problem of accumulating knowledge over time from a stream of data. A crucial challenge is that neural networks suffer performance degradation on previously seen data, known as catastrophic forgetting, due to parameter sharing. In this work, we consider a more practical online class-incremental CL setting, where the model learns new samples in an online manner and may continually encounter new classes; moreover, prior knowledge is unavailable during training and evaluation. Existing works usually exploit samples along a single dimension, which ignores much valuable supervisory information. To better tackle this setting, we propose a novel replay-based CL method that leverages the multi-level representations produced while training samples for replay, strengthening supervision to consolidate previous knowledge. Specifically, besides the raw samples, we store the corresponding logits and features in the memory. Furthermore, to imitate the predictions of the past model, we construct extra constraints from the multi-level information stored in the memory. With the same number of replayed samples, our method can use more past knowledge to prevent interference. We conduct extensive evaluations on several popular CL datasets, and experiments show that our method consistently outperforms state-of-the-art methods across various episodic memory sizes. We further provide a detailed analysis of these results and demonstrate that our method is more viable in practical scenarios.
Keywords: catastrophic forgetting; class-incremental learning; continual learning (CL); experience replay
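The multi-level replay the abstract describes (storing raw samples plus the logits and features the past model produced for them) can be expressed as a combined loss. A minimal sketch under stated assumptions: the `model.backbone`/`model.head` split, MSE for both the logit and feature constraints, and the `alpha`/`beta` weights are all illustrative, not the paper's exact formulation.

```python
# Hypothetical sketch of a multi-level replay loss: label-, logit-, and
# feature-level supervision from a single replayed memory batch.
import torch.nn.functional as F

def multi_level_replay_loss(model, mem, alpha=1.0, beta=1.0):
    """mem holds replayed inputs 'x', labels 'y', and the 'logits' and
    'features' recorded from the past model when 'x' was stored."""
    feats = model.backbone(mem["x"])    # current feature-level output
    logits = model.head(feats)          # current logit-level output

    ce = F.cross_entropy(logits, mem["y"])   # supervision from labels
    kd = F.mse_loss(logits, mem["logits"])   # imitate past predictions
    fm = F.mse_loss(feats, mem["features"])  # match past features
    return ce + alpha * kd + beta * fm
```

The design point is that one stored sample yields three supervisory signals, which is how the method squeezes more past knowledge out of the same memory budget.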
3. Continual learning fault diagnosis: A dual-branch adaptive aggregation residual network for fault diagnosis with machine increments (Cited by: 1)
Authors: Bojian CHEN, Changqing SHEN, Juanjuan SHI, Lin KONG, Luyang TAN, Dong WANG, Zhongkui ZHU. Chinese Journal of Aeronautics (SCIE, EI, CAS, CSCD), 2023, No. 6, pp. 361-377 (17 pages).
As a data-driven approach, Deep Learning (DL)-based fault diagnosis methods must collect relatively comprehensive data on machine fault types to achieve satisfactory performance. In the real world, a mechanical system may comprise multiple submachines. During condition monitoring of a mechanical system, fault data arrive in a continuous flow of constantly generated information, and new faults inevitably occur in previously unconsidered submachines, which are also called machine increments; adequately collecting fault data in advance is therefore difficult. Limited by the characteristics of DL, training existing models directly on the new fault data of new submachines leads to catastrophic forgetting of old tasks, while the cost of collecting all known data to retrain the models is excessively high. In short, DL-based fault diagnosis methods cannot learn continually and adaptively in dynamic environments. This paper proposes a new Continual Learning Fault Diagnosis method (CLFD) to solve a series of fault diagnosis tasks with machine increments. The stability-plasticity dilemma is an intrinsic issue in continual learning. The core of CLFD is the proposed Dual-branch Adaptive Aggregation Residual Network (DAARN). Two types of residual blocks are created in each block layer of DAARN: steady and dynamic blocks. The stability-plasticity dilemma is resolved by assigning them adaptive aggregation weights that balance stability and plasticity, and a bi-level optimization program optimizes the aggregation weights and model parameters. In addition, a feature-level knowledge distillation loss function is proposed to further mitigate catastrophic forgetting. CLFD is then applied to a fault diagnosis case with machine increments. Results demonstrate that CLFD outperforms other continual learning methods and exhibits satisfactory robustness.
Keywords: catastrophic forgetting; continual learning; fault diagnosis; knowledge distillation; machine increments; stability-plasticity dilemma
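The dual-branch block the abstract outlines (a steady branch preserved for stability, a dynamic branch trained for plasticity, mixed by adaptive aggregation weights) could look like the sketch below; the channel sizes, softmax parameterization of the weights, and class name are assumptions, not the paper's DAARN definition. The bi-level optimization the abstract mentions would alternate between updating the aggregation logits and the dynamic-branch parameters, which is omitted here.

```python
# Hypothetical sketch of a dual-branch residual block with adaptive aggregation.
import torch
import torch.nn as nn

class DualBranchBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.steady = nn.Conv2d(channels, channels, 3, padding=1)
        self.dynamic = nn.Conv2d(channels, channels, 3, padding=1)
        # Freeze the steady branch to preserve old-task knowledge.
        for p in self.steady.parameters():
            p.requires_grad = False
        # Learnable aggregation logits, one per branch.
        self.agg = nn.Parameter(torch.zeros(2))

    def forward(self, x):
        w = torch.softmax(self.agg, dim=0)   # adaptive aggregation weights
        out = w[0] * self.steady(x) + w[1] * self.dynamic(x)
        return torch.relu(out + x)           # residual connection
```

Stacking such blocks gives the network a per-layer dial between reusing old-task features and fitting the new machine increment, which is the stated resolution of the stability-plasticity dilemma.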