A Slot-Gated Model Incorporating an Intent List Query Mechanism (Cited by: 1)

Slot-Gated Modeling with Intent List Selection


Authors: HU Guang-min (胡光敏); JIANG Li (姜黎) (School of Physics and Optoelectronic Engineering, Xiangtan University, Xiangtan 411100, China)

Affiliation: [1] School of Physics and Optoelectronic Engineering, Xiangtan University, Xiangtan 411100, Hunan, China

Source: Software Guide (《软件导刊》), 2021, No. 9, pp. 51–55

Abstract: Recurrent neural networks (RNNs) applied to intent detection and slot filling have achieved good recognition results. The traditional Slot-Gated model fuses intent features into slot labeling, but it does not feed the text's intent label information into the model as prior knowledge during training. Building on the Slot-Gated model, this work uses intent label information to construct an attention-based intent list query module and applies a global optimization method to improve both intent recognition accuracy and the joint accuracy of intent detection and slot filling. In comparative experiments against the Slot-Gated model, the proposed method improves intent and joint accuracy by 1.1% and 1.5% on the ATIS dataset, and by 0.3% and 0.4% on the Snips dataset. The results show that adding intent label information to training as prior knowledge improves model performance.
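The abstract describes two pieces: an attention-based query over the list of intent labels, and the slot gate of the underlying Slot-Gated model (Goo et al., 2018), which injects the intent context into slot labeling. The sketch below illustrates both with NumPy; all shapes, the mean-pooled utterance summary, and the label-embedding lookup are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

hidden = 64      # encoder hidden size (illustrative)
seq_len = 8      # tokens in the utterance
n_intents = 5    # size of the intent label list

# Hypothetical BiLSTM encoder outputs.
h = rng.standard_normal((seq_len, hidden))        # per-token hidden states
c_slot = rng.standard_normal((seq_len, hidden))   # per-token slot attention context
q = h.mean(axis=0)                                # utterance summary vector (assumed)

# Intent list query (one plausible reading of the paper's module):
# attend over learned embeddings of the intent labels to build an
# intent context vector informed by the label list.
E_labels = rng.standard_normal((n_intents, hidden))   # intent label embeddings
scores = E_labels @ q                                 # (n_intents,)
attn = np.exp(scores - scores.max())
attn /= attn.sum()                                    # softmax over intent labels
c_intent = attn @ E_labels                            # (hidden,)

# Slot gate from the base Slot-Gated model:
#   g_i = v . tanh(c_slot_i + W c_intent)
# a scalar per token weighing the intent context's influence on slot labeling.
W = rng.standard_normal((hidden, hidden))
v = rng.standard_normal(hidden)
g = np.tanh(c_slot + c_intent @ W.T) @ v              # (seq_len,)

# Gated features fed to the slot classifier: h_i + g_i * c_slot_i
gated = h + g[:, None] * c_slot
print(gated.shape)   # (8, 64)
```

In a trained model, `E_labels`, `W`, and `v` would be learned parameters, and the attention weights over `E_labels` would tie the intent labels into training as prior knowledge, as the abstract describes.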

Keywords: natural language understanding; neural network; pre-trained model; slot filling; attention mechanism

CLC number: TP301 [Automation and Computer Technology — Computer System Architecture]
