A Comprehensive Framework for Long-Tailed Learning via Pretraining and Normalization
Kang, Nan1,2; Chang, Hong1,2,3; Ma, Bingpeng4; Shan, Shiguang1,2,3
2022-07-27
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Pages: 13
Abstract: Data in the visual world often follow long-tailed distributions. However, learning high-quality representations and classifiers for imbalanced data remains challenging for data-driven deep learning models. In this work, we aim to improve the feature extractor and the classifier for long-tailed recognition via contrastive pretraining and feature normalization, respectively. First, we carefully study the influence of contrastive pretraining under different conditions, showing that current self-supervised pretraining for long-tailed learning is still suboptimal in both performance and speed. We thus propose a new balanced contrastive loss and a fast contrastive initialization scheme to improve previous long-tailed pretraining. Second, motivated by our analysis of classifier normalization, we propose a novel generalized normalization classifier that consists of generalized normalization and grouped learnable scaling. It outperforms both the traditional inner-product classifier and the cosine classifier. Both proposed components improve recognition on tail classes without sacrificing accuracy on head classes. We finally build a unified framework that achieves performance competitive with state-of-the-art methods on several long-tailed recognition benchmarks while maintaining high efficiency.
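The abstract contrasts the traditional inner-product classifier with normalized (cosine-style) classifiers. As a minimal illustration of that distinction only (not the paper's generalized normalization or grouped learnable scaling, whose exact forms are not given in this record), a NumPy sketch with an assumed scale value `s`:

```python
import numpy as np

def inner_product_logits(x, W, b):
    """Standard linear classifier: logits = x W^T + b.
    Class-weight norms are unconstrained, so head classes with large
    weight norms can dominate the logits."""
    return x @ W.T + b

def cosine_logits(x, W, s=16.0):
    """Cosine-style classifier: L2-normalize both features and class
    weights, then scale the cosine similarities by a factor s.
    Logits are bounded by |s|, removing weight-norm imbalance."""
    x_n = x / np.linalg.norm(x, axis=1, keepdims=True)
    W_n = W / np.linalg.norm(W, axis=1, keepdims=True)
    return s * (x_n @ W_n.T)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))   # batch of 4 feature vectors, dim 8
W = rng.normal(size=(3, 8))   # weight vectors for 3 classes
b = rng.normal(size=3)

lin = inner_product_logits(x, W, b)   # shape (4, 3), unbounded
cos = cosine_logits(x, W)             # shape (4, 3), bounded by s
assert np.all(np.abs(cos) <= 16.0 + 1e-6)
```

The bound on the cosine logits is what makes the learnable scale (here a fixed assumed `s`) matter in practice; the paper's contribution replaces this plain L2 normalization with a generalized form and per-group scales.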
Keywords: Training; Tail; Task analysis; Head; Visualization; Feature extraction; Data models; Classifier design; contrastive learning; long-tailed recognition; normalization
DOI: 10.1109/TNNLS.2022.3192475
Indexed in: SCI
Language: English
Funding: Natural Science Foundation of China (NSFC) [U19B2036]; Natural Science Foundation of China (NSFC) [61976203]; Natural Science Foundation of China (NSFC) [61876171]; Fundamental Research Funds for the Central Universities
WOS research areas: Computer Science; Engineering
WOS categories: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS accession number: WOS:000833050600001
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Citation statistics
Times cited: 3 [WOS]
Document type: Journal article
Item identifier: http://119.78.100.204/handle/2XEOYT63/19487
Collection: Institute of Computing Technology, Chinese Academy of Sciences — Journal Papers (English)
Corresponding author: Chang, Hong
Author affiliations:
1. Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing 100190, Peoples R China
2. Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing 100049, Peoples R China
3. Peng Cheng Lab, Shenzhen 518055, Peoples R China
4. Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing 100049, Peoples R China
Recommended citation:
GB/T 7714: Kang, Nan, Chang, Hong, Ma, Bingpeng, et al. A Comprehensive Framework for Long-Tailed Learning via Pretraining and Normalization[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022: 13.
APA: Kang, Nan, Chang, Hong, Ma, Bingpeng, & Shan, Shiguang. (2022). A Comprehensive Framework for Long-Tailed Learning via Pretraining and Normalization. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 13.
MLA: Kang, Nan, et al. "A Comprehensive Framework for Long-Tailed Learning via Pretraining and Normalization". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2022): 13.
Files in this item:
No files are associated with this item.
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.