Efficient knowledge graph to text powered by LLGM: linear latent graph model
Zhao, Xiaokang1; Zheng, Yao2; Shan, Yubo2; Li, Jingyuan1; Zhang, Kun3; Wang, Yuanzhuo4
2025-08-01
Journal: COMPLEX & INTELLIGENT SYSTEMS
ISSN: 2199-4536
Volume: 11, Issue: 8, Pages: 22
Abstract: Knowledge graph to text generation is crucial for interpreting complex structured data, yet state-of-the-art transformer models face significant computational burdens, limiting their practical deployment. This paper introduces the Linear Latent Graph Model (LLGM), a novel architecture that significantly enhances efficiency in KG-to-text generation without compromising performance. LLGM's core innovations are three-fold: (1) a Multi-head Statistical Attention (MSA) mechanism that achieves linear O(N) complexity by replacing pairwise token interactions with efficient statistical approximations, drastically reducing the primary computational bottleneck; (2) a Graph Latent Self-Attention (GLSA) module that efficiently encodes explicit graph structures using dimension-reduced intermediate representations, preserving relational fidelity with fewer parameters; and (3) a Graph Periodicity Projector (GPP) that optimizes feed-forward networks by decomposing representations into periodic and non-periodic components, adeptly capturing both regular and unique graph patterns. Experiments on the WebNLG and EventNarrative datasets demonstrate LLGM's significant contributions: it achieves competitive text generation quality, evidenced by a mere 0.8% BLEU-4 gap to the top model and the highest CIDEr score (4.63) on WebNLG, while requiring 20-37% fewer parameters than leading models. LLGM offers a robust and scalable solution, effectively bridging the efficiency-effectiveness gap in KG-to-text generation and enabling broader application in resource-constrained environments.
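The O(N) attention the abstract describes belongs to the family of kernelized linear attention mechanisms. The sketch below illustrates that general family using the standard elu(x)+1 feature map; it is a minimal illustration of the linearization trick, not the paper's actual MSA formulation, whose statistical approximations are not specified in this record.

```python
import numpy as np

def linear_attention(Q, K, V):
    """Kernelized linear attention: O(N * d^2) instead of O(N^2 * d).

    Uses the elu(x)+1 feature map as a stand-in kernel; LLGM's
    Multi-head Statistical Attention replaces pairwise scores with
    statistical approximations, which this generic sketch does not
    reproduce.
    """
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # positive feature map
    Qf, Kf = phi(Q), phi(K)                     # (N, d) mapped queries/keys
    KV = Kf.T @ V                               # (d, d) summary, linear in N
    Z = Qf @ Kf.sum(axis=0, keepdims=True).T    # (N, 1) normalizer
    return (Qf @ KV) / Z                        # (N, d) attention output

rng = np.random.default_rng(0)
N, d = 512, 64
Q, K, V = (rng.standard_normal((N, d)) * 0.1 for _ in range(3))
out = linear_attention(Q, K, V)
print(out.shape)  # (512, 64)
```

Because the (d, d) key-value summary is computed once, cost grows linearly with sequence length N, which is the efficiency property the abstract attributes to MSA.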
Keywords: Knowledge graph to text generation; Linear attention; Low rank compression; Fourier network
DOI: 10.1007/s40747-025-01985-8
Indexed by: SCI
Language: English
Funding Project: National Natural Science Foundation of China [62172393]; Henan Province Key Research and Development Project [241111211900]
WOS Research Area: Computer Science
WOS Category: Computer Science, Artificial Intelligence
WOS Accession Number: WOS:001510573200009
Publisher: SPRINGER HEIDELBERG
Document Type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/42366
Collection: Institute of Computing Technology, Chinese Academy of Sciences, Journal Articles (English)
Corresponding Author: Li, Jingyuan
Affiliations:
1. Beijing Technol & Business Univ, Sch Comp & Artificial Intelligence, Beijing 100048, Peoples R China
2. Zhengzhou Univ, Henan Inst Adv Technol, Zhengzhou, Peoples R China
3. Tencent Inc, Pattern Recognit Ctr, WeChat AI, Beijing, Peoples R China
4. Chinese Acad Sci, Inst Comp Technol, Beijing, Peoples R China
Recommended Citation:
GB/T 7714: Zhao, Xiaokang, Zheng, Yao, Shan, Yubo, et al. Efficient knowledge graph to text powered by LLGM: linear latent graph model[J]. COMPLEX & INTELLIGENT SYSTEMS, 2025, 11(8): 22.
APA: Zhao, Xiaokang, Zheng, Yao, Shan, Yubo, Li, Jingyuan, Zhang, Kun, & Wang, Yuanzhuo. (2025). Efficient knowledge graph to text powered by LLGM: linear latent graph model. COMPLEX & INTELLIGENT SYSTEMS, 11(8), 22.
MLA: Zhao, Xiaokang, et al. "Efficient knowledge graph to text powered by LLGM: linear latent graph model". COMPLEX & INTELLIGENT SYSTEMS 11.8 (2025): 22.
Files in This Item:
There are no files associated with this item.
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.