A Hierarchical CNN-RNN Approach for Visual Emotion Classification
Li, Liang (1); Zhu, Xinge (2); Hao, Yiming (3); Wang, Shuhui (1); Gao, Xingyu (4,6); Huang, Qingming (2,4,5)
2019
Journal: ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS
ISSN: 1551-6857
Volume: 15; Issue: 3; Pages: 17
Abstract: Visual emotion classification predicts people's emotional reactions to given visual content. Psychological studies show that human emotions are affected by various visual stimuli ranging from low level to high level, including contrast, color, texture, scene, object, and association, among others. Traditional approaches treated different levels of stimuli as independent components and failed to fuse them effectively. This article proposes a hierarchical convolutional neural network (CNN)-recurrent neural network (RNN) approach that predicts emotion from the fused stimuli by exploiting the dependency among different-level features. First, we introduce a dual CNN to extract different levels of visual stimuli, where two related loss functions are designed to learn the stimulus representations under a multi-task learning structure. Further, to model the dependency between the low- and high-level stimuli, a stacked bi-directional RNN is proposed to fuse the features learned by the dual CNN. Comparison experiments on one large-scale and three small-scale datasets show that the proposed approach brings significant improvement. Ablation experiments demonstrate the effectiveness of the different modules of our model.
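The abstract describes a dual CNN whose branches are trained with two related losses under a multi-task learning structure, followed by a stacked bi-directional RNN that fuses the low- and high-level features for the final prediction. The PyTorch sketch below is only an illustration of that overall structure, not the authors' implementation: the backbone split, feature dimensions, loss weights, and class count are all assumptions.

```python
# Minimal sketch of a hierarchical CNN-RNN for visual emotion classification.
# All module choices below (ResNet-18 split, 256-d features, GRU fusion,
# 0.3 auxiliary loss weight, 8 emotion classes) are illustrative assumptions.
import torch
import torch.nn as nn
import torchvision.models as models


class HierarchicalCnnRnn(nn.Module):
    def __init__(self, num_classes=8, feat_dim=256):
        super().__init__()
        backbone = models.resnet18()
        # Low-level branch: early convolutional stages (contrast/color/texture cues).
        self.low_cnn = nn.Sequential(backbone.conv1, backbone.bn1, backbone.relu,
                                     backbone.maxpool, backbone.layer1, backbone.layer2)
        # High-level branch: later stages (scene/object-level cues).
        self.high_cnn = nn.Sequential(backbone.layer3, backbone.layer4)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.low_proj = nn.Linear(128, feat_dim)
        self.high_proj = nn.Linear(512, feat_dim)
        # Auxiliary heads give each branch its own loss (multi-task learning).
        self.low_head = nn.Linear(feat_dim, num_classes)
        self.high_head = nn.Linear(feat_dim, num_classes)
        # Stacked bi-directional RNN fuses the ordered [low, high] feature sequence.
        self.rnn = nn.GRU(feat_dim, feat_dim, num_layers=2,
                          bidirectional=True, batch_first=True)
        self.classifier = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, x):
        low_map = self.low_cnn(x)                 # (B, 128, H, W)
        high_map = self.high_cnn(low_map)         # (B, 512, h, w)
        low = self.low_proj(self.pool(low_map).flatten(1))
        high = self.high_proj(self.pool(high_map).flatten(1))
        seq = torch.stack([low, high], dim=1)     # (B, 2, feat_dim)
        fused, _ = self.rnn(seq)                  # (B, 2, 2 * feat_dim)
        logits = self.classifier(fused[:, -1])    # prediction from fused features
        return logits, self.low_head(low), self.high_head(high)


# Usage: joint loss over the fused prediction plus the two auxiliary branch losses.
model = HierarchicalCnnRnn()
images = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, 8, (4,))
logits, low_logits, high_logits = model(images)
criterion = nn.CrossEntropyLoss()
loss = criterion(logits, labels) + 0.3 * (criterion(low_logits, labels)
                                          + criterion(high_logits, labels))
```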
Keywords: Visual emotion recognition; multi-task learning; feature fusing; hierarchical CNN-RNN; stacked bi-directional RNN
DOI: 10.1145/3359753
Indexed by: SCI
Language: English
Funding Projects: National MCF Energy R&D Program [2018YFE0303100] ; National Natural Science Foundation of China [61771457] ; National Natural Science Foundation of China [61732007] ; National Natural Science Foundation of China [61772494] ; National Natural Science Foundation of China [61672497] ; National Natural Science Foundation of China [61836002] ; National Natural Science Foundation of China [61472389] ; National Natural Science Foundation of China [61702491] ; National Natural Science Foundation of China [61620106009] ; National Natural Science Foundation of China [U1636214] ; Key Research Program of Frontier Sciences, Chinese Academy of Sciences [QYZDJ-SSWSYS013]
WoS Research Area: Computer Science
WoS Categories: Computer Science, Information Systems ; Computer Science, Software Engineering ; Computer Science, Theory & Methods
WoS Accession Number: WOS:000535718800013
Publisher: ASSOC COMPUTING MACHINERY
Times Cited (WoS): 21
Document Type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/15315
Collection: Journal Articles (English), Institute of Computing Technology, Chinese Academy of Sciences
Corresponding Author: Wang, Shuhui
Affiliations:
1.Chinese Acad Sci, Key Lab Intelligent Informat Proc, Inst Comp Technol, CAS, 6 Kexueyuan South Rd, Beijing 100190, Peoples R China
2.Univ Chinese Acad Sci, Sch Comp Sci & Technol, 19 A Yuquan Rd, Beijing 100049, Peoples R China
3.Shandong Univ, Shanda South Rd 27, Jinan 250100, Peoples R China
4.Chinese Acad Sci, Beijing, Peoples R China
5.Peng Cheng Lab, Shenzhen, Peoples R China
6.Chinese Acad Sci, Inst Microelect, Beijing 100029, Peoples R China
Recommended Citation
GB/T 7714
Li, Liang, Zhu, Xinge, Hao, Yiming, et al. A Hierarchical CNN-RNN Approach for Visual Emotion Classification[J]. ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2019, 15(3): 17.
APA Li, Liang, Zhu, Xinge, Hao, Yiming, Wang, Shuhui, Gao, Xingyu, & Huang, Qingming. (2019). A Hierarchical CNN-RNN Approach for Visual Emotion Classification. ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 15(3), 17.
MLA Li, Liang, et al. "A Hierarchical CNN-RNN Approach for Visual Emotion Classification". ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS 15.3 (2019): 17.
Files in This Item:
No files are associated with this item.