Institute of Computing Technology, Chinese Academy of Sciences IR
| Title | Efficient knowledge graph to text powered by LLGM: linear latent graph model |
| Authors | Zhao, Xiaokang [1]; Zheng, Yao [2]; Shan, Yubo [2]; Li, Jingyuan [1]; Zhang, Kun [3]; Wang, Yuanzhuo [4] |
| Publication Date | 2025-08-01 |
| Journal | COMPLEX & INTELLIGENT SYSTEMS |
| ISSN | 2199-4536 |
| Volume/Issue/Pages | 11(8): 22 |
| Abstract | Knowledge graph to text generation is crucial for interpreting complex structured data, yet state-of-the-art transformer models face significant computational burdens, limiting their practical deployment. This paper introduces the Linear Latent Graph Model (LLGM), a novel architecture that significantly enhances efficiency in KG-to-text generation without compromising performance. LLGM's core innovations are three-fold: (1) a Multi-head Statistical Attention (MSA) mechanism that achieves linear O(N) complexity by replacing pairwise token interactions with efficient statistical approximations, drastically reducing the primary computational bottleneck; (2) a Graph Latent Self-Attention (GLSA) module that efficiently encodes explicit graph structures using dimension-reduced intermediate representations, preserving relational fidelity with fewer parameters; and (3) a Graph Periodicity Projector (GPP) that optimizes feed-forward networks by decomposing representations into periodic and non-periodic components, adeptly capturing both regular and unique graph patterns. Experiments on the WebNLG and EventNarrative datasets demonstrate LLGM's significant contributions: it achieves competitive text generation quality, evidenced by a mere 0.8% BLEU-4 gap to the top model and the highest CIDEr score (4.63) on WebNLG, while requiring 20-37% fewer parameters than leading models. LLGM offers a robust and scalable solution, effectively bridging the efficiency-effectiveness gap in KG-to-text generation and enabling broader application in resource-constrained environments. |
| Keywords | Knowledge graph to text generation; Linear attention; Low rank compression; Fourier network |
| DOI | 10.1007/s40747-025-01985-8 |
| Indexed By | SCI |
| Language | English |
| Funding Project | National Natural Science Foundation of China [62172393]; Henan Province Key Research and Development Project [241111211900] |
| WOS Research Area | Computer Science |
| WOS Subject | Computer Science, Artificial Intelligence |
| WOS ID | WOS:001510573200009 |
| Publisher | SPRINGER HEIDELBERG |
| Document Type | Journal article |
| Identifier | http://119.78.100.204/handle/2XEOYT63/42366 |
| Collection | Institute of Computing Technology, Chinese Academy of Sciences: Journal Papers (English) |
| Corresponding Author | Li, Jingyuan |
| Affiliations | 1. Beijing Technol & Business Univ, Sch Comp & Artificial Intelligence, Beijing 100048, Peoples R China; 2. Zhengzhou Univ, Henan Inst Adv Technol, Zhengzhou, Peoples R China; 3. Tencent Inc, Pattern Recognit Ctr, WeChat AI, Beijing, Peoples R China; 4. Chinese Acad Sci, Inst Comp Technol, Beijing, Peoples R China |
| Recommended Citation (GB/T 7714) | Zhao, Xiaokang, Zheng, Yao, Shan, Yubo, et al. Efficient knowledge graph to text powered by LLGM: linear latent graph model[J]. COMPLEX & INTELLIGENT SYSTEMS, 2025, 11(8): 22. |
| APA | Zhao, Xiaokang, Zheng, Yao, Shan, Yubo, Li, Jingyuan, Zhang, Kun, & Wang, Yuanzhuo. (2025). Efficient knowledge graph to text powered by LLGM: linear latent graph model. COMPLEX & INTELLIGENT SYSTEMS, 11(8), 22. |
| MLA | Zhao, Xiaokang, et al. "Efficient knowledge graph to text powered by LLGM: linear latent graph model." COMPLEX & INTELLIGENT SYSTEMS 11.8 (2025): 22. |
| Files in This Item | There are no files associated with this item. |
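The linear-attention idea summarized in the abstract — replacing pairwise token interactions with compact statistical summaries to bring the cost down from O(N²) to O(N) in sequence length — can be sketched as below. This is a hypothetical illustration using a generic kernel-feature linear attention (ELU+1 feature map), not the paper's actual Multi-head Statistical Attention; the function name and shapes are assumptions for the sketch.

```python
import numpy as np

def linear_attention(Q, K, V, eps=1e-6):
    """Kernel-feature linear attention sketch: O(N * d^2) instead of O(N^2 * d).

    Approximates softmax(Q K^T) V by phi(Q) (phi(K)^T V) / normalizer, where
    phi(K)^T V is a fixed-size (d x d) statistical summary of all keys/values,
    so each query only touches that summary rather than every other token.
    Illustrative only -- not LLGM's MSA mechanism.
    """
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # ELU(x) + 1, strictly positive
    Qf, Kf = phi(Q), phi(K)                  # (N, d) feature-mapped queries and keys
    KV = Kf.T @ V                            # (d, d) key-value summary, computed once
    Z = Qf @ Kf.sum(axis=0)                  # (N,) per-query normalizer
    return (Qf @ KV) / (Z[:, None] + eps)    # (N, d) attention output

rng = np.random.default_rng(0)
N, d = 8, 4
Q, K, V = rng.normal(size=(3, N, d))
out = linear_attention(Q, K, V)
print(out.shape)  # (8, 4)
```

Because `KV` and the key sum are independent of the query index, doubling the number of tokens doubles the work instead of quadrupling it, which is the efficiency property the abstract attributes to MSA.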