Authors: SI Yuehang; CHENG Qing [1,2]; HUANG Jincai; HU Xingchen [1]
Affiliations: [1] Laboratory for Big Data and Decision, National University of Defense Technology, Changsha 410073, China; [2] Hunan Advanced Technology Research Institute, Changsha 410006, China
Source: Journal of Command and Control, 2024, No. 6, pp. 712-719 (8 pages)
Funding: Supported by the National Natural Science Foundation of China (62376279).
Abstract: Reasoning over temporal knowledge graphs is a technical foundation for inferring future situations and improving the efficiency of intelligent decision-making. Traditional reasoning models suffer from large parameter scales and high computing-hardware requirements, making it difficult to meet the real-time reasoning and decision-making needs of low-performance, low-power distributed devices, and traditional model compression methods ignore temporal features. A distillation method for temporal knowledge graph reasoning models is proposed: a distillation framework built on large language models fuses massive public knowledge with specific temporal knowledge to assist the training of a lightweight model. Experiments on public datasets show that the method outperforms comparable international methods.
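The abstract does not detail the paper's LLM-based distillation framework, but the core mechanism it builds on, transferring a large teacher model's soft predictions to a lightweight student, can be illustrated with a standard temperature-scaled distillation loss. This is a minimal, generic sketch in pure Python; the function names and the choice of KL divergence with a T^2 scaling factor are conventional knowledge-distillation practice, not details taken from the paper.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher temperature yields a softer
    # (higher-entropy) distribution, exposing the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # multiplied by T^2 so gradients keep a comparable magnitude as T varies.
    p = softmax(teacher_logits, temperature)   # soft targets from the teacher
    q = softmax(student_logits, temperature)   # student's softened predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In a full training loop this term would typically be mixed with the student's ordinary task loss (here, a temporal link-prediction loss over candidate entities), so the lightweight model learns both from ground-truth facts and from the teacher's softened score distribution.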