Integrating contrastive learning and dual-stream networks for knowledge graph summarization models


Authors: Zhao Xia[1]; Wang Zhao (School of Management Sciences & Information Engineering, Hebei University of Economics & Business, Shijiazhuang 050061, China)

Affiliation: [1] School of Management Sciences & Information Engineering, Hebei University of Economics & Business, Shijiazhuang 050061, China

Source: Application Research of Computers, 2025, No. 3, pp. 720-727 (8 pages)

Fund: Supported by the Natural Science Foundation of Hebei Province (F2021207005).

Abstract: This study presents a novel knowledge graph-based summarization model (KGDR-CLSUM), which integrates contrastive learning with a dual-stream network to address the factual errors and insufficient information extraction of existing summarization models. The model uses a dual-stream network to process textual features and knowledge graph features simultaneously, while contrastive learning strengthens the fusion of the two feature types. In addition, a momentum distillation strategy is introduced to reduce data noise in the knowledge graph, improving the quality and accuracy of the generated summaries. On the CNN/Daily Mail dataset, KGDR-CLSUM outperforms the baseline model PEGASUS_BASE, improving ROUGE-1, ROUGE-2, and ROUGE-L scores by 3.03%, 3.42%, and 2.56%, respectively; on the XSum dataset, it achieves even larger gains of 7.54%, 8.78%, and 8.51%. Its human evaluation scores are also significantly higher than ChatGPT's, further demonstrating the model's superior performance. These results show that KGDR-CLSUM effectively reduces factual errors and significantly improves summary quality, especially on short-text generation tasks.
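The two auxiliary components described in the abstract — contrastive alignment of the dual-stream (text and knowledge graph) features, and momentum distillation — can be sketched as below. This is a minimal illustration only: the abstract does not specify the loss form, temperature, or momentum coefficient, so the symmetric InfoNCE formulation and the EMA update (in the style of ALBEF/MoCo) are assumptions.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(text_feat, kg_feat, temperature=0.07):
    """Symmetric InfoNCE loss aligning text-stream and KG-stream embeddings.

    Each row of text_feat is treated as a positive pair with the same-index
    row of kg_feat; all other rows in the batch act as negatives.
    """
    # Project both streams onto the unit sphere so similarity is cosine.
    text_feat = F.normalize(text_feat, dim=-1)
    kg_feat = F.normalize(kg_feat, dim=-1)
    # Pairwise similarity matrix, scaled by the temperature.
    logits = text_feat @ kg_feat.t() / temperature
    targets = torch.arange(text_feat.size(0), device=logits.device)
    # Average the text->KG and KG->text directions.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

@torch.no_grad()
def momentum_update(student, teacher, m=0.995):
    """EMA update of a momentum (teacher) encoder used for distillation.

    The teacher's parameters track a slow-moving average of the student's,
    yielding smoother pseudo-targets that damp noise in the KG features.
    """
    for p_s, p_t in zip(student.parameters(), teacher.parameters()):
        p_t.data.mul_(m).add_(p_s.data, alpha=1.0 - m)
```

In a training loop of this style, `contrastive_loss` would be added to the summarization objective each step, followed by a call to `momentum_update` so the teacher lags the student encoder.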

Keywords: text summarization; knowledge graph; momentum distillation; contrastive learning; dual-stream network

CLC Number: TP391 [Automation and Computer Technology - Computer Application Technology]

 
