Authors: JU Sheng-Gen [1]; LI Tian-Ning; SUN Jie-Ping [1] (College of Computer Science, Sichuan University, Chengdu 610065, China)
Source: Journal of Software (《软件学报》), 2021, No. 8, pp. 2545-2556 (12 pages)
Funding: National Natural Science Foundation of China (61972270); Sichuan Province New Generation Artificial Intelligence Major Project (2018GZDZX0039); Sichuan Province Key R&D Project (2019YFG0521).
Abstract: Fine-grained named entity recognition locates entities in text and classifies them into predefined fine-grained categories. At present, Chinese fine-grained named entity recognition only uses pre-trained language models to contextually encode the characters of a sentence, without taking into account that category label information can help distinguish entity categories. Since a sentence to be predicted carries no entity labels, this paper uses an associated memory network to capture the entity label information of sentences in the training set and incorporates that label information into the character representations of the predicted sentence. In this method, labeled sentences from the training set serve as memory units; a pre-trained language model obtains the contextual representations of the original sentence and of the memory-unit sentences, and an attention mechanism then combines the label information of the memory-unit sentences with the representation of the original sentence to improve recognition. On the CLUENER2020 Chinese fine-grained named entity recognition task, the method improves performance over the baseline methods.
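The abstract describes fusing label information from labeled training sentences (the memory units) into the character representations of an unlabeled input sentence via attention. Below is a minimal sketch of that fusion step, assuming a PyTorch-style implementation; the module name, tensor layout, and label-embedding design are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn


class MemoryLabelFusion(nn.Module):
    """Hypothetical sketch: attend from the input sentence's characters to
    memory-unit characters, and pull in their entity-label information."""

    def __init__(self, hidden_dim: int, num_labels: int):
        super().__init__()
        # Label embeddings stand in for the entity-label information
        # attached to each character of a memory-unit sentence.
        self.label_emb = nn.Embedding(num_labels, hidden_dim)
        self.out = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, query_repr, memory_repr, memory_labels):
        # query_repr:    (B, Lq, H) PLM representation of the input sentence
        # memory_repr:   (B, Lm, H) PLM representation of the memory sentences
        # memory_labels: (B, Lm)    entity-label ids of the memory characters
        scores = torch.matmul(query_repr, memory_repr.transpose(1, 2))   # (B, Lq, Lm)
        attn = torch.softmax(scores / query_repr.size(-1) ** 0.5, dim=-1)
        label_info = torch.matmul(attn, self.label_emb(memory_labels))   # (B, Lq, H)
        # Concatenate the attended label information with the original
        # character representation and project back to the hidden size,
        # giving label-aware character representations for tagging.
        return self.out(torch.cat([query_repr, label_info], dim=-1))
```

In use, `query_repr` and `memory_repr` would come from the same pre-trained language model encoding the input sentence and the retrieved training-set sentences, and the fused output would feed the downstream entity tagger.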
Keywords: Chinese fine-grained named entity recognition; associated memory network; multi-head self-attention; pre-trained language model
Classification: TP18 [Automation and Computer Technology - Control Theory and Control Engineering]