Distillation Method on Temporal Knowledge Graph Reasoning Model Based on Large Language Models


Authors: SI Yuehang (司悦航); CHENG Qing (成清)[1,2]; HUANG Jincai (黄金才); HU Xingchen (胡星辰)[1]

Affiliations: [1] Laboratory for Big Data and Decision, National University of Defense Technology, Changsha 410073, China; [2] Hunan Advanced Technology Research Institute, Changsha 410006, China

Source: Journal of Command and Control (《指挥与控制学报》), 2024, No. 6, pp. 712-719 (8 pages)

Funding: Supported by the National Natural Science Foundation of China (62376279).

Abstract: Reasoning over temporal knowledge graphs is a technical foundation for improving the efficiency of intelligent decision-making and for inferring future situations. Traditional reasoning models suffer from large parameter scales and heavy computing-hardware requirements, which makes it difficult to meet the real-time reasoning and decision-making demands of low-performance, low-power distributed devices, and conventional model compression methods ignore temporal characteristics. A distillation method for temporal knowledge graph reasoning models is proposed: a distillation framework built on large language models fuses massive public knowledge with task-specific temporal knowledge to assist the training of a lightweight model. Experiments on public datasets show that the method outperforms comparable international methods.
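The abstract outlines a teacher-student setup in which an LLM-based teacher guides a lightweight temporal knowledge graph reasoning model. As a rough illustration of the general technique only (not the authors' published framework), the sketch below shows a standard knowledge-distillation loss for a batch of temporal link-prediction queries; the temperature T, blending weight alpha, and all tensor shapes are assumed for demonstration.

```python
# Illustrative sketch of generic knowledge distillation for temporal KG
# link prediction; hyperparameters and shapes are hypothetical.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, gold_entities,
                      T=2.0, alpha=0.5):
    """Blend soft teacher targets with hard gold labels.

    student_logits, teacher_logits: [batch, num_entities] scores over
        candidate object entities for (subject, relation, ?, timestamp) queries.
    gold_entities: [batch] indices of the ground-truth object entities.
    """
    # Soft-target term: KL divergence between temperature-scaled distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the gold entities.
    hard = F.cross_entropy(student_logits, gold_entities)
    return alpha * soft + (1 - alpha) * hard

# Toy usage: batch of 4 queries, 7 candidate entities.
student = torch.randn(4, 7, requires_grad=True)   # lightweight student scores
teacher = torch.randn(4, 7)                        # teacher (e.g. LLM-derived) scores
gold = torch.tensor([3, 0, 5, 2])                  # ground-truth object indices
loss = distillation_loss(student, teacher, gold)
loss.backward()
print(float(loss))
```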

Keywords: temporal knowledge graph; knowledge graph reasoning; knowledge distillation; large language model

CLC Number: TP18 [Automation and Computer Technology - Control Theory and Control Engineering]; TP391.1 [Automation and Computer Technology - Control Science and Engineering]
