Institute of Computing Technology, Chinese Academy of Sciences, Institutional Repository (IR)
Title | A Comprehensive Framework for Long-Tailed Learning via Pretraining and Normalization
Authors | Kang, Nan1,2; Chang, Hong1,2,3; Ma, Bingpeng4; Shan, Shiguang1,2,3
Date | 2022-07-27
Journal | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS |
ISSN | 2162-237X |
Pages | 13 |
Abstract | Data in the visual world often follow long-tailed distributions, yet learning high-quality representations and classifiers from imbalanced data remains challenging for data-driven deep learning models. In this work, we improve the feature extractor and the classifier for long-tailed recognition via contrastive pretraining and feature normalization, respectively. First, we carefully study the influence of contrastive pretraining under different conditions, showing that current self-supervised pretraining for long-tailed learning remains suboptimal in both performance and speed. We therefore propose a new balanced contrastive loss and a fast contrastive initialization scheme to improve previous long-tailed pretraining. Second, based on a motivating analysis of classifier normalization, we propose a novel generalized normalization classifier consisting of generalized normalization and grouped learnable scaling. It outperforms both the traditional inner-product classifier and the cosine classifier. Both proposed components improve recognition on tail classes without sacrificing accuracy on head classes. Finally, we build a unified framework that achieves competitive performance against state-of-the-art methods on several long-tailed recognition benchmarks while maintaining high efficiency. |
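The abstract contrasts the proposed generalized normalization classifier with the standard cosine classifier. The paper's exact formulation is not given in this record, but the cosine-classifier baseline it compares against can be sketched as follows: both features and class weight vectors are L2-normalized, and the resulting cosine similarities are multiplied by a scaling factor (here a fixed `scale` stands in for the paper's learnable, per-group scaling; the function name and shapes are illustrative assumptions).

```python
import numpy as np

def cosine_classifier_logits(features, weights, scale=16.0, eps=1e-12):
    """Cosine classifier baseline (illustrative sketch, not the paper's code).

    L2-normalize feature vectors and class weight vectors, then scale the
    cosine similarities to form logits. `scale` is fixed here; the paper's
    classifier instead uses grouped *learnable* scaling.
    """
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + eps)
    w = weights / (np.linalg.norm(weights, axis=1, keepdims=True) + eps)
    return scale * f @ w.T  # logits bounded in [-scale, scale]

rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 8))   # 4 samples, 8-dim features (hypothetical)
W = rng.normal(size=(10, 8))      # 10 classes
logits = cosine_classifier_logits(feats, W)
```

Because normalization removes the weight-norm bias that tends to favor head classes, scaled-cosine logits are a common starting point for long-tailed classifier design; the paper's generalized normalization is presented as an improvement over this baseline.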
Keywords | Training; Tail; Task analysis; Head; Visualization; Feature extraction; Data models; Classifier design; contrastive learning; long-tailed recognition; normalization |
DOI | 10.1109/TNNLS.2022.3192475 |
Indexed By | SCI |
Language | English |
Funding | Natural Science Foundation of China (NSFC) [U19B2036; 61976203; 61876171]; Fundamental Research Funds for the Central Universities |
WOS Research Areas | Computer Science; Engineering |
WOS Categories | Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic |
WOS Accession Number | WOS:000833050600001 |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
Document Type | Journal Article |
Identifier | http://119.78.100.204/handle/2XEOYT63/19487 |
Collection | Institute of Computing Technology, CAS: Journal Papers (English) |
Corresponding Author | Chang, Hong |
Affiliations | 1. Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing 100190, Peoples R China; 2. Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing 100049, Peoples R China; 3. Peng Cheng Lab, Shenzhen 518055, Peoples R China; 4. Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing 100049, Peoples R China |
Recommended Citation (GB/T 7714) | Kang, Nan, Chang, Hong, Ma, Bingpeng, et al. A Comprehensive Framework for Long-Tailed Learning via Pretraining and Normalization[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022: 13. |
APA | Kang, Nan, Chang, Hong, Ma, Bingpeng, & Shan, Shiguang. (2022). A Comprehensive Framework for Long-Tailed Learning via Pretraining and Normalization. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 13. |
MLA | Kang, Nan, et al. "A Comprehensive Framework for Long-Tailed Learning via Pretraining and Normalization." IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2022): 13. |
Files in This Item | No files are associated with this item. |
Except where otherwise noted, all content in this system is protected by copyright, with all rights reserved.