Institute of Computing Technology, Chinese Academy of Sciences IR
| Title | Low-redundancy distillation for continual learning |
| Authors | Liu, Ruiqi [1,2]; Diao, Boyu [1,2]; Huang, Libo [1]; An, Zijia [1,2]; Liu, Hangda [1,2]; An, Zhulin [1,2]; Xu, Yongjun [1,2] |
| Date Issued | 2025-11-01 |
| Journal | PATTERN RECOGNITION |
| ISSN | 0031-3203 |
| Volume | 167 |
| Pages | 12 |
| Abstract | Continual learning (CL) aims to learn new tasks without erasing previous knowledge. However, current CL methods primarily emphasize improving accuracy while often neglecting training efficiency, which consequently restricts their practical application. Drawing inspiration from the brain's contextual gating mechanism, which selectively filters neural information and continuously updates past memories, we propose Low-redundancy Distillation (LoRD), a novel CL method that enhances model performance while maintaining training efficiency. This is achieved by eliminating redundancy in three aspects of CL: student model redundancy, teacher model redundancy, and rehearsal sample redundancy. By compressing the learnable parameters of the student model and pruning the teacher model, LoRD facilitates the retention and optimization of prior knowledge, effectively decoupling task-specific knowledge without manually assigning isolated parameters for each task. Furthermore, we optimize the selection of rehearsal samples and refine rehearsal frequency to improve training efficiency. Through a meticulous design of distillation and rehearsal strategies, LoRD effectively balances training efficiency and model precision. Extensive experimentation across various benchmark datasets and environments demonstrates LoRD's superiority, achieving the highest accuracy with the lowest training FLOPs. |
| Keywords | Continual learning; Lifelong learning; Catastrophic forgetting; Knowledge distillation; Experience replay |
| DOI | 10.1016/j.patcog.2025.111712 |
| Indexed By | SCI |
| Language | English |
| WOS Research Areas | Computer Science; Engineering |
| WOS Categories | Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic |
| WOS Accession Number | WOS:001497128300001 |
| Publisher | ELSEVIER SCI LTD |
| Document Type | Journal article |
| Identifier | http://119.78.100.204/handle/2XEOYT63/42408 |
| Collection | Journal Papers of the Institute of Computing Technology, Chinese Academy of Sciences (English) |
| Corresponding Author | Diao, Boyu |
| Affiliations | 1. Chinese Acad Sci, Inst Comp Technol, Beijing 100190, Peoples R China; 2. Univ Chinese Acad Sci, Beijing 100049, Peoples R China |
| Recommended Citation (GB/T 7714) | Liu, Ruiqi, Diao, Boyu, Huang, Libo, et al. Low-redundancy distillation for continual learning[J]. PATTERN RECOGNITION, 2025, 167: 12. |
| APA | Liu, Ruiqi, Diao, Boyu, Huang, Libo, An, Zijia, Liu, Hangda, ... & Xu, Yongjun. (2025). Low-redundancy distillation for continual learning. PATTERN RECOGNITION, 167, 12. |
| MLA | Liu, Ruiqi, et al. "Low-redundancy distillation for continual learning." PATTERN RECOGNITION 167 (2025): 12. |
| Files in This Item | No files associated with this item. |
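Illustrative note: the abstract above describes distillation from a pruned teacher, a student with compressed learnable parameters, and rehearsal with optimized sample selection and reduced rehearsal frequency. The sketch below is a minimal, hypothetical PyTorch illustration of that general recipe, not the paper's actual LoRD implementation; the names (`prune_teacher`, `RehearsalBuffer`, `rehearsal_every`) and the specific magnitude-pruning and reservoir-sampling policies are assumptions made here for illustration only.

```python
# Hypothetical sketch of a distillation + rehearsal training step.
# Not the LoRD implementation; all policies below are illustrative assumptions.
import copy
import random
import torch
import torch.nn.functional as F


def prune_teacher(teacher: torch.nn.Module, keep_ratio: float = 0.7) -> torch.nn.Module:
    """Magnitude-prune a frozen copy of the previous-task model (stand-in for
    reducing teacher redundancy; the paper's actual criterion may differ)."""
    pruned = copy.deepcopy(teacher).eval()
    for p in pruned.parameters():
        p.requires_grad_(False)
        k = max(1, int(keep_ratio * p.numel()))
        threshold = p.abs().flatten().kthvalue(p.numel() - k + 1).values
        p.mul_((p.abs() >= threshold).float())  # zero out small-magnitude weights
    return pruned


class RehearsalBuffer:
    """Tiny reservoir-style buffer; the sample-selection policy is a placeholder."""
    def __init__(self, capacity: int = 200):
        self.capacity, self.data, self.seen = capacity, [], 0

    def add(self, x, y):
        for xi, yi in zip(x, y):
            self.seen += 1
            if len(self.data) < self.capacity:
                self.data.append((xi, yi))
            else:
                j = random.randrange(self.seen)
                if j < self.capacity:
                    self.data[j] = (xi, yi)

    def sample(self, n):
        batch = random.sample(self.data, min(n, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)


def train_step(student, teacher, buffer, x, y, optimizer, step,
               kd_weight=1.0, temperature=2.0, rehearsal_every=4):
    """One step: task loss + KL distillation from the pruned teacher, with a
    rehearsal loss applied only every few steps to limit training FLOPs."""
    optimizer.zero_grad()
    logits = student(x)
    loss = F.cross_entropy(logits, y)
    with torch.no_grad():
        teacher_logits = teacher(x)
    kd = F.kl_div(F.log_softmax(logits / temperature, dim=1),
                  F.softmax(teacher_logits / temperature, dim=1),
                  reduction="batchmean") * temperature ** 2
    loss = loss + kd_weight * kd
    if step % rehearsal_every == 0 and len(buffer.data) > 0:
        xr, yr = buffer.sample(x.size(0))
        loss = loss + F.cross_entropy(student(xr), yr)
    loss.backward()
    optimizer.step()
    buffer.add(x.detach(), y.detach())
    return loss.item()
```

In this sketch, lowering `keep_ratio` or raising `rehearsal_every` trades a little accuracy for fewer training FLOPs, which mirrors the efficiency/precision balance the abstract emphasizes; the actual schedules used in the paper are not given in this record.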