Knowledge Collaborative Fine-tuning for Low-resource Knowledge Graph Completion

Cited by: 11


Authors: ZHANG Ning-Yu; XIE Xin; CHEN Xiang; DENG Shu-Min; YE Hong-Bin; CHEN Hua-Jun [1,2]

Affiliations: [1] AZFT Joint Laboratory for Knowledge Engine, Zhejiang University, Hangzhou 310028, China; [2] ZJU-Hangzhou Global Scientific and Technological Innovation Center, Hangzhou 310028, China

Source: Journal of Software (《软件学报》), 2022, No. 10, pp. 3531-3545 (15 pages)

Funding: National Natural Science Foundation of China (91846204, U19B2027).

Abstract: Knowledge graph completion makes a knowledge graph more complete. Most existing knowledge graph completion methods assume that the entities or relations in the knowledge graph have sufficient triple instances. However, general domains contain a great number of long-tail triples, and in vertical domains it is difficult to obtain large amounts of high-quality annotated data. To address these issues, this paper proposes a knowledge collaborative fine-tuning approach for low-resource knowledge graph completion. Existing structured knowledge is leveraged to construct initial knowledge graph completion prompts, and a collaborative fine-tuning algorithm is proposed to learn the optimal templates, labels, and model parameters. The proposed method exploits both the explicit structured knowledge in the knowledge graph and the implicit factual knowledge in the language model, and can be applied to both link prediction and relation extraction. Experimental results show that the approach achieves state-of-the-art performance on three knowledge graph reasoning datasets and five relation extraction datasets.
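The prompt-construction step described in the abstract, where structured knowledge seeds a cloze-style template that a masked language model then completes, can be sketched as follows. The relation names and template wording below are illustrative assumptions, not the authors' actual templates:

```python
# Minimal sketch of cloze-style prompt construction for link prediction.
# A relation-specific template turns a triple query (head, relation, ?)
# into a sentence with a [MASK] slot for a masked language model to fill.
# Relation names and template phrasings here are hypothetical examples.

RELATION_TEMPLATES = {
    "born_in": "{head} was born in {tail}.",
    "capital_of": "{head} is the capital of {tail}.",
}

def build_prompt(head: str, relation: str, tail: str = "[MASK]") -> str:
    """Fill the relation's template; with tail left as [MASK], a masked
    language model would score candidate entities for the empty slot."""
    return RELATION_TEMPLATES[relation].format(head=head, tail=tail)

# Query form (tail unknown) and fully grounded form (tail known):
print(build_prompt("Alan Turing", "born_in"))
print(build_prompt("Paris", "capital_of", "France"))
```

In the paper's collaborative fine-tuning setting, such templates (and the label words mapped to entities) would themselves be learnable rather than fixed strings as in this sketch.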

Keywords: low-resource; knowledge graph completion; link prediction; relation extraction; pre-trained language model

Classification: TP18 [Automation and Computer Technology: Control Theory and Control Engineering]

 
