Authors: ZENG Bi-Qing (曾碧卿), DING Ming-Hao (丁明浩), SONG Yi-Yun (宋逸云) (School of Software, South China Normal University, Foshan 528225, China)
Source: Computer Systems & Applications (《计算机系统应用》), 2023, No. 7, pp. 47-56 (10 pages)
Funding: National Natural Science Foundation of China (12001208); Guangdong Basic and Applied Basic Research Foundation (2021A1515011171); Special Project in Key Fields of Artificial Intelligence for Regular Universities of Guangdong Province (2019KZDZX1033); Guangzhou Basic Research Program, Basic and Applied Basic Research Project (202102080282).
Abstract: Dialogue systems that incorporate structured knowledge have attracted wide attention because they can generate more fluent and more diverse replies. However, previous studies focus only on the entities in the structured knowledge, ignoring the relations between entities and the integrity of the knowledge. This paper proposes a knowledge-aware conversation generation model (KCG) based on a graph convolutional network. A knowledge encoder captures the semantic information of entities and relations, and a graph convolutional network enhances the entity representations. A knowledge selection module then produces a knowledge selection probability distribution over the entities and relations relevant to the dialogue context. Finally, this distribution is fused with the vocabulary probability distribution, allowing the decoder to choose between knowledge entries and ordinary vocabulary words. Experiments on DuConv, a public Chinese dataset, show that KCG outperforms current baseline models on automatic evaluation metrics and generates more fluent and informative replies.
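The fusion step described in the abstract, combining a knowledge selection distribution with the decoder's vocabulary distribution, is commonly realized with a learned soft gate in the style of pointer-generator networks. The sketch below illustrates one plausible form of that step; the class name `KnowledgeVocabFusion`, the sigmoid gate, and all tensor names are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KnowledgeVocabFusion(nn.Module):
    """Pointer-generator-style fusion (a hypothetical sketch, not the
    paper's code): a scalar gate p_gen, computed from the decoder state,
    splits probability mass between ordinary vocabulary words and
    knowledge entries mapped onto the vocabulary axis."""

    def __init__(self, hidden_size: int):
        super().__init__()
        # Scalar gate computed from the decoder hidden state at each step.
        self.gate = nn.Linear(hidden_size, 1)

    def forward(self, dec_hidden, vocab_logits, knowledge_scores, knowledge_to_vocab):
        # dec_hidden:         (batch, hidden_size)   decoder state at step t
        # vocab_logits:       (batch, vocab_size)    unnormalized word scores
        # knowledge_scores:   (batch, num_knowledge) relevance of each knowledge entry
        # knowledge_to_vocab: (batch, num_knowledge) vocab index of each entry's token
        p_gen = torch.sigmoid(self.gate(dec_hidden))        # (batch, 1)
        p_vocab = F.softmax(vocab_logits, dim=-1)           # vocabulary distribution
        p_know = F.softmax(knowledge_scores, dim=-1)        # knowledge selection distribution
        # Scatter knowledge probability mass onto the shared vocabulary axis,
        # weighted by (1 - p_gen); the result still sums to 1 per example.
        fused = p_gen * p_vocab
        fused = fused.scatter_add(1, knowledge_to_vocab, (1 - p_gen) * p_know)
        return fused  # (batch, vocab_size)
```

Because `p_gen * p_vocab` and `(1 - p_gen) * p_know` each sum to `p_gen` and `1 - p_gen` respectively, the fused output remains a proper probability distribution, so the decoder can sample or take an argmax over it directly.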