CSpace

Browse/Search Results: 6 records in total, showing 1-6

Amorphous Region Context Modeling for Scene Recognition  (Journal Article)
IEEE TRANSACTIONS ON MULTIMEDIA, 2022, Volume: 24, Pages: 141-151
Authors:  Zeng, Haitao;  Song, Xinhang;  Chen, Gongwei;  Jiang, Shuqiang
Views/Downloads: 16/0  |  Submitted: 2022/12/07
Semantics  Feature extraction  Image segmentation  Convolution  Context modeling  Saliency detection  Layout  Graph neural network  scene recognition  semantic segmentation  
Food Recommendation: Framework, Existing Solutions, and Challenges  (Journal Article)
IEEE TRANSACTIONS ON MULTIMEDIA, 2020, Volume: 22, Issue: 10, Pages: 2659-2671
Authors:  Min, Weiqing;  Jiang, Shuqiang;  Jain, Ramesh
Views/Downloads: 43/0  |  Submitted: 2020/12/10
Artificial intelligence  knowledge based systems  image recognition  data mining  health information management  
Learning Scene Attribute for Scene Recognition  (Journal Article)
IEEE TRANSACTIONS ON MULTIMEDIA, 2020, Volume: 22, Issue: 6, Pages: 1519-1530
Authors:  Zeng, Haitao;  Song, Xinhang;  Chen, Gongwei;  Jiang, Shuqiang
Views/Downloads: 40/0  |  Submitted: 2020/12/10
Feature extraction  Visualization  Semantics  Context modeling  Image recognition  Computer vision  Aggregates  Scene recognition  scene attribute  
Bundled Object Context for Referring Expressions  (Journal Article)
IEEE TRANSACTIONS ON MULTIMEDIA, 2018, Volume: 20, Issue: 10, Pages: 2749-2760
Authors:  Li, Xiangyang;  Jiang, Shuqiang
Views/Downloads: 52/0  |  Submitted: 2019/12/10
Bundled object context  referring expression  LSTM  vision-language  
A survey on context-aware mobile visual recognition  (Journal Article)
MULTIMEDIA SYSTEMS, 2017, Volume: 23, Issue: 6, Pages: 647-665
Authors:  Min, Weiqing;  Jiang, Shuqiang;  Wang, Shuhui;  Xu, Ruihan;  Cao, Yushan;  Herranz, Luis;  He, Zhiqiang
Views/Downloads: 49/0  |  Submitted: 2019/12/12
Mobile visual recognition  Context  Survey  
Modeling Restaurant Context for Food Recognition  (Journal Article)
IEEE TRANSACTIONS ON MULTIMEDIA, 2017, Volume: 19, Issue: 2, Pages: 430-440
Authors:  Herranz, Luis;  Jiang, Shuqiang;  Xu, Ruihan
Views/Downloads: 44/0  |  Submitted: 2019/12/12
Food recognition  image recognition  location  mobile applications  probabilistic modeling