MGCoT: Multi-Grained Contextual Transformer for table-based text generation
Mo, Xianjie (1,2,3); Xiang, Yang (2); Pan, Youcheng (2); Hou, Yongshuai (2); Luo, Ping (1,2,3)
2024-09-15
Journal: EXPERT SYSTEMS WITH APPLICATIONS
ISSN: 0957-4174
Volume: 250, Pages: 10
Abstract: Recent advances in Transformers have led to a revolution in table-based text generation. However, most existing Transformer-based architectures ignore the rich contexts among input tokens distributed across multi-level units (e.g., cell, row, or column), sometimes leading to unfaithful text generation that fails to establish accurate association relationships and misses vital information. In this paper, we propose the Multi-Grained Contextual Transformer (MGCoT), a novel architecture that fully capitalizes on the multi-grained contexts among input tokens and thus strengthens the capacity of table-based text generation. Its key primitive, the Multi-Grained Contexts (MGCo) module, involves two components: a local context sub-module that adaptively gathers neighboring tokens to form token-wise local context features, and a global context sub-module that consistently aggregates tokens from a broader range to form a shared global context feature. The former models short-range dependencies that reflect the salience of tokens within similar fine-grained units (e.g., a cell or row) attending to the query token, while the latter captures long-range dependencies that reflect the significance of each token within similar coarse-grained units (e.g., multiple rows or columns). Based on the fused multi-grained contexts, MGCoT can flexibly and holistically model the content of a table across multi-level structures. On three benchmark datasets, ToTTo, FeTaQA, and Tablesum, MGCoT outperforms strong baselines by a large margin on the quality of the generated texts, demonstrating the effectiveness of multi-grained context modeling. Our source code is available at https://github.com/Cedric-Mo/MGCoT.
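The MGCo module described in the abstract pairs a local branch (context gathered from neighboring tokens within fine-grained units) with a global branch (a shared context aggregated over a broader range). The sketch below is a minimal, hypothetical illustration of that idea in PyTorch; the class name MultiGrainedContext, the depthwise-convolution local branch, the attention-pooled global branch, the window size, and the concatenation-based fusion are all illustrative assumptions, not the authors' released implementation (see the repository linked above for that).

# Minimal sketch (hypothetical): local + global context aggregation as
# described in the abstract. All names and design choices here are
# illustrative assumptions, not the MGCoT reference implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiGrainedContext(nn.Module):
    def __init__(self, d_model: int, window: int = 5):
        super().__init__()
        # Local branch: depthwise 1-D convolution gathers a small
        # neighborhood around each token (short-range, fine-grained units).
        self.local = nn.Conv1d(d_model, d_model, kernel_size=window,
                               padding=window // 2, groups=d_model)
        # Global branch: attention-style pooling into one shared vector
        # (long-range, coarse-grained units).
        self.score = nn.Linear(d_model, 1)
        self.proj = nn.Linear(d_model, d_model)
        # Fusion of the two context granularities.
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) embeddings of the linearized table.
        local = self.local(x.transpose(1, 2)).transpose(1, 2)
        weights = F.softmax(self.score(x), dim=1)               # (batch, seq_len, 1)
        global_ctx = self.proj((weights * x).sum(dim=1, keepdim=True))
        global_ctx = global_ctx.expand_as(x)                    # broadcast shared context
        return self.fuse(torch.cat([local, global_ctx], dim=-1))

if __name__ == "__main__":
    tokens = torch.randn(2, 64, 256)               # toy batch of table tokens
    print(MultiGrainedContext(256)(tokens).shape)  # torch.Size([2, 64, 256])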
Keywords: Multi-grained contexts; Transformer; Abstractive table question answering; Table-to-text generation
DOI: 10.1016/j.eswa.2024.123742
Indexed In: SCI
Language: English
Funding: Major National Science and Technology Project [2022ZD0115305]; Major Key Project of PCL [PCL2022D01]; Major Key Project of PCL [PCL2023A09]; National Natural Science Foundation of China [62106115]; China Postdoctoral Science Foundation [2023M741843]
WOS Research Areas: Computer Science; Engineering; Operations Research & Management Science
WOS Categories: Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic; Operations Research & Management Science
WOS Accession Number: WOS:001224645700001
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
Citation Statistics: Times Cited (WOS): 2
Document Type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/38973
Collection: Journal Papers of the Institute of Computing Technology, Chinese Academy of Sciences (English)
Corresponding Author: Xiang, Yang
Author Affiliations:
1. Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc Chinese Acad Sci, Beijing, Peoples R China
2. Peng Cheng Lab, Shenzhen, Peoples R China
3. Univ Chinese Acad Sci, Beijing, Peoples R China
Recommended Citation:
GB/T 7714
Mo, Xianjie, Xiang, Yang, Pan, Youcheng, et al. MGCoT: Multi-Grained Contextual Transformer for table-based text generation[J]. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 250: 10.
APA Mo, Xianjie, Xiang, Yang, Pan, Youcheng, Hou, Yongshuai, & Luo, Ping. (2024). MGCoT: Multi-Grained Contextual Transformer for table-based text generation. EXPERT SYSTEMS WITH APPLICATIONS, 250, 10.
MLA Mo, Xianjie, et al. "MGCoT: Multi-Grained Contextual Transformer for table-based text generation". EXPERT SYSTEMS WITH APPLICATIONS 250 (2024): 10.
Files in This Item:
No files associated with this item.