Low-redundancy distillation for continual learning
Liu, Ruiqi1,2; Diao, Boyu1,2; Huang, Libo1; An, Zijia1,2; Liu, Hangda1,2; An, Zhulin1,2; Xu, Yongjun1,2
2025-11-01
Journal: PATTERN RECOGNITION
ISSN: 0031-3203
Volume: 167  Pages: 12
Abstract: Continual learning (CL) aims to learn new tasks without erasing previous knowledge. However, current CL methods primarily emphasize improving accuracy while often neglecting training efficiency, which consequently restricts their practical application. Drawing inspiration from the brain's contextual gating mechanism, which selectively filters neural information and continuously updates past memories, we propose Low-redundancy Distillation (LoRD), a novel CL method that enhances model performance while maintaining training efficiency. This is achieved by eliminating redundancy in three aspects of CL: student model redundancy, teacher model redundancy, and rehearsal sample redundancy. By compressing the learnable parameters of the student model and pruning the teacher model, LoRD facilitates the retention and optimization of prior knowledge, effectively decoupling task-specific knowledge without manually assigning isolated parameters for each task. Furthermore, we optimize the selection of rehearsal samples and refine rehearsal frequency to improve training efficiency. Through a meticulous design of distillation and rehearsal strategies, LoRD effectively balances training efficiency and model precision. Extensive experimentation across various benchmark datasets and environments demonstrates LoRD's superiority, achieving the highest accuracy with the lowest training FLOPs.
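The abstract couples knowledge distillation with experience replay. As a purely illustrative aid, the following is a minimal sketch of one generic distillation-plus-rehearsal training step in PyTorch. The function distill_replay_step, its parameters (temperature, alpha), and the toy models are assumptions made for illustration; the sketch does not reproduce the paper's LoRD method (student compression, teacher pruning, and rehearsal-frequency scheduling are not shown).

# Hypothetical sketch of a distillation + experience-replay step; not the LoRD implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distill_replay_step(student, teacher, new_x, new_y, buf_x, buf_y,
                        optimizer, temperature=2.0, alpha=0.5):
    """Cross-entropy on new-task data plus a distillation term that keeps the
    student's logits close to a frozen teacher's logits on rehearsed samples."""
    student.train()
    teacher.eval()

    # Supervised loss on the current task's mini-batch.
    ce_new = F.cross_entropy(student(new_x), new_y)

    # Distillation on rehearsal samples: KL divergence between softened logits.
    with torch.no_grad():
        t_logits = teacher(buf_x)
    s_logits = student(buf_x)
    kd = F.kl_div(
        F.log_softmax(s_logits / temperature, dim=1),
        F.softmax(t_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

    # Plain replay term on the stored labels of the rehearsal buffer.
    ce_old = F.cross_entropy(s_logits, buf_y)

    loss = ce_new + alpha * kd + (1 - alpha) * ce_old
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    # Toy usage with random data and a small MLP.
    torch.manual_seed(0)
    make_net = lambda: nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
    student, teacher = make_net(), make_net()
    opt = torch.optim.SGD(student.parameters(), lr=0.01)
    new_x, new_y = torch.randn(8, 32), torch.randint(0, 10, (8,))
    buf_x, buf_y = torch.randn(8, 32), torch.randint(0, 10, (8,))
    print(distill_replay_step(student, teacher, new_x, new_y, buf_x, buf_y, opt))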
Keywords: Continual learning; Lifelong learning; Catastrophic forgetting; Knowledge distillation; Experience replay
DOI: 10.1016/j.patcog.2025.111712
Indexed by: SCI
Language: English
WOS Research Areas: Computer Science; Engineering
WOS Categories: Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic
WOS Record No.: WOS:001497128300001
Publisher: ELSEVIER SCI LTD
Document Type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/42408
Collection: Journal Papers of the Institute of Computing Technology, Chinese Academy of Sciences (English)
Corresponding Author: Diao, Boyu
Affiliations:
1. Chinese Acad Sci, Inst Comp Technol, Beijing 100190, Peoples R China
2. Univ Chinese Acad Sci, Beijing 100049, Peoples R China
Recommended Citation:
GB/T 7714: Liu, Ruiqi, Diao, Boyu, Huang, Libo, et al. Low-redundancy distillation for continual learning[J]. PATTERN RECOGNITION, 2025, 167: 12.
APA: Liu, Ruiqi, Diao, Boyu, Huang, Libo, An, Zijia, Liu, Hangda, ... & Xu, Yongjun. (2025). Low-redundancy distillation for continual learning. PATTERN RECOGNITION, 167, 12.
MLA: Liu, Ruiqi, et al. "Low-redundancy distillation for continual learning". PATTERN RECOGNITION 167 (2025): 12.
Files in This Item:
There are no files associated with this item.