Bi-STAN: bilinear spatial-temporal attention network for wearable human activity recognition
Gao, Chenlong1,2,3; Chen, Yiqiang1,2,3; Jiang, Xinlong1,2,3; Hu, Lisha; Zhao, Zhicheng4,5,6; Zhang, Yuxin7
2023-02-02
Journal: INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS
ISSN: 1868-8071
Pages: 17
Abstract: With the progressive development of ubiquitous computing, wearable human activity recognition plays an increasingly important role in many fields, such as health monitoring, assisted disease diagnosis and rehabilitation, and exercise assessment. The inertial measurement units in wearable devices provide a rich representation of motion, and human activity recognition from sensor sequences has proven to be an important problem in machine learning research. The key challenge is to extract powerful representational features from multi-sensor data that capture subtle differences between human activities. Beyond this, critical information is often lost during feature extraction because the temporal and spatial dependencies of the data are overlooked. Few previous papers address these two challenges jointly. In this paper, we propose an efficient Bilinear Spatial-Temporal Attention Network (Bi-STAN). First, a multi-scale ResNet backbone network extracts multimodal signal features and jointly optimizes the feature extraction process. Then, to adaptively focus on what is important in the original data and where, and to mine the discriminative parts of the features, we design a spatial-temporal attention network. Finally, a bilinear pooling with low redundancy is introduced to efficiently obtain second-order information. Experiments on three public datasets and our real-world dataset demonstrate that the proposed Bi-STAN is superior to existing methods in terms of both accuracy and efficiency.
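The record does not include the paper's implementation, but the abstract outlines a three-stage pipeline: a multi-scale ResNet backbone over multi-sensor windows, spatial-temporal attention, and low-redundancy bilinear pooling. The following is a minimal, hypothetical PyTorch sketch of such a pipeline for orientation only; the layer sizes, the simplified convolutional backbone standing in for the multi-scale ResNet, the exact attention formulation, and the plain (non-reduced) bilinear pooling are all assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the three-stage pipeline described in the abstract:
# (1) a small 1D-convolutional backbone over multi-sensor windows,
# (2) channel ("spatial") and temporal attention, and
# (3) bilinear pooling of the attended features.
# All sizes and layer choices are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BiSTANSketch(nn.Module):
    def __init__(self, in_channels: int, num_classes: int, feat_dim: int = 64):
        super().__init__()
        # Backbone: two conv blocks stand in for the multi-scale ResNet.
        self.backbone = nn.Sequential(
            nn.Conv1d(in_channels, feat_dim, kernel_size=5, padding=2),
            nn.BatchNorm1d(feat_dim),
            nn.ReLU(),
            nn.Conv1d(feat_dim, feat_dim, kernel_size=3, padding=1),
            nn.BatchNorm1d(feat_dim),
            nn.ReLU(),
        )
        # Channel attention: squeeze over time, score each feature channel.
        self.channel_att = nn.Sequential(
            nn.Linear(feat_dim, feat_dim // 4),
            nn.ReLU(),
            nn.Linear(feat_dim // 4, feat_dim),
            nn.Sigmoid(),
        )
        # Temporal attention: one score per time step.
        self.temporal_att = nn.Conv1d(feat_dim, 1, kernel_size=1)
        self.classifier = nn.Linear(feat_dim * feat_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, sensor_channels, time)
        h = self.backbone(x)                               # (B, C, T)
        ch_w = self.channel_att(h.mean(dim=2))             # (B, C)
        h = h * ch_w.unsqueeze(-1)                         # re-weight channels
        t_w = torch.softmax(self.temporal_att(h), dim=2)   # (B, 1, T)
        h = h * t_w                                        # re-weight time steps
        # Bilinear pooling: time-averaged outer product of the attended
        # features, then signed square root and L2 normalization.
        b = torch.einsum("bct,bdt->bcd", h, h) / h.size(2)  # (B, C, C)
        b = b.flatten(1)
        b = torch.sign(b) * torch.sqrt(torch.abs(b) + 1e-8)
        b = F.normalize(b, dim=1)
        return self.classifier(b)


if __name__ == "__main__":
    model = BiSTANSketch(in_channels=6, num_classes=12)   # e.g. 3-axis acc + gyro
    window = torch.randn(8, 6, 128)                       # batch of sensor windows
    print(model(window).shape)                            # torch.Size([8, 12])
```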
Keywords: Human activity recognition; Spatial-temporal attention; Bilinear pooling; Low redundancy
DOI: 10.1007/s13042-023-01781-1
Indexed by: SCI
Language: English
Funding: National Key Research and Development Plan of China [2021YFC2501202]; Natural Science Foundation of China [61972383]; Beijing Municipal Science & Technology Commission [Z221100002722009]; Youth Innovation Promotion Association CAS; Science and Technology Research Project of Higher Education of Hebei Province [QN2023184]
WOS Research Area: Computer Science
WOS Category: Computer Science, Artificial Intelligence
WOS Accession Number: WOS:000923910600001
Publisher: SPRINGER HEIDELBERG
Citation statistics
Times cited (WOS): 4
Document type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/19953
Collection: Journal Articles, Institute of Computing Technology, Chinese Academy of Sciences
Corresponding author: Chen, Yiqiang
Affiliations:
1.Chinese Acad Sci, Inst Comp Technol, Beijing 100190, Peoples R China
2.Univ Chinese Acad Sci, Beijing 100190, Peoples R China
3.Beijing Key Lab Mobile Comp & Pervas Device, Beijing 100190, Peoples R China
4.Hebei Univ Econ & Business, Inst Informat Technol, Shijiazhuang 050061, Peoples R China
5.Anhui Univ, Sch Artificial Intelligence, Hefei 230601, Peoples R China
6.China Elect Technol Grp Corp, Res Inst 38, Hefei 230088, Peoples R China
7.Global Energy Interconnect Dev & Cooperat Org, Beijing 100031, Peoples R China
Recommended citation:
GB/T 7714
Gao, Chenlong, Chen, Yiqiang, Jiang, Xinlong, et al. Bi-STAN: bilinear spatial-temporal attention network for wearable human activity recognition[J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023: 17.
APA Gao, Chenlong, Chen, Yiqiang, Jiang, Xinlong, Hu, Lisha, Zhao, Zhicheng, & Zhang, Yuxin. (2023). Bi-STAN: bilinear spatial-temporal attention network for wearable human activity recognition. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 17.
MLA Gao, Chenlong, et al. "Bi-STAN: bilinear spatial-temporal attention network for wearable human activity recognition". INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS (2023): 17.
Files in this item:
No files are associated with this item.