Chinese Fine-grained Named Entity Recognition Based on Associated Memory Networks (Cited by: 13)


Authors: JU Sheng-Gen [1]; LI Tian-Ning; SUN Jie-Ping [1] (College of Computer Science, Sichuan University, Chengdu 610065, China)

Affiliation: [1] College of Computer Science, Sichuan University, Chengdu 610065, China

Source: Journal of Software (《软件学报》), 2021, No. 8, pp. 2545-2556 (12 pages)

Funding: National Natural Science Foundation of China (61972270); Sichuan Province New Generation Artificial Intelligence Major Project (2018GZDZX0039); Sichuan Province Key R&D Project (2019YFG0521).

Abstract: Fine-grained named entity recognition locates entities in text and classifies them into predefined fine-grained categories. Current Chinese fine-grained NER approaches use only a pre-trained language model to encode the characters of a sentence in context, and do not exploit the fact that category label information can help distinguish entity types. Since sentences at prediction time carry no entity labels, this paper uses an associated memory network to capture the entity label information of training-set sentences and incorporate it into the character representations of the sentence being predicted. The method treats labeled training sentences as memory units: a pre-trained language model produces contextual representations of both the original sentence and the memory-unit sentences, and an attention mechanism then combines the label information of the memory-unit sentences with the representation of the original sentence to improve recognition. On the CLUENER2020 Chinese fine-grained named entity recognition task, the method improves performance over the baseline methods.
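The fusion step described in the abstract can be sketched as attention from the characters of the query sentence over the characters of a memory-unit sentence, whose attended label embeddings are then added to the query representation. The sketch below is a minimal NumPy illustration under stated assumptions, not the authors' actual architecture: the paper uses a pre-trained language model (whose contextual encodings are stood in for by random vectors here) and multi-head attention, while `fuse_memory_labels` is a hypothetical single-head simplification.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_memory_labels(query_repr, memory_repr, memory_label_emb):
    """Attend from each query character over the memory-unit characters,
    then mix the attended label embeddings into the query representation.

    query_repr:       (Lq, d) contextual encodings of the sentence to predict
    memory_repr:      (Lm, d) contextual encodings of a labeled training sentence
    memory_label_emb: (Lm, d) embeddings of the entity labels of that sentence
    """
    d = query_repr.shape[-1]
    scores = query_repr @ memory_repr.T / np.sqrt(d)   # (Lq, Lm) similarity
    attn = softmax(scores, axis=-1)                    # attention weights
    label_context = attn @ memory_label_emb            # (Lq, d) label info
    return query_repr + label_context                  # label-enriched representation

# Toy shapes: 4 query characters, 6 memory characters, hidden size 8.
# In the paper these would come from the pre-trained language model.
rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))
m = rng.normal(size=(6, 8))
labels = rng.normal(size=(6, 8))
fused = fuse_memory_labels(q, m, labels)
print(fused.shape)
```

The fused representations would then feed a standard tagging layer (e.g. a CRF) to produce the fine-grained entity labels.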

Keywords: Chinese fine-grained named entity recognition; associated memory networks; multi-head self-attention; pre-trained language models

CLC number: TP18 [Automation and Computer Technology — Control Theory and Control Engineering]
