Authors: SHEN Hui; ZHANG Ying-Jun [1]; XIE Bin-Hong [1]; ZHAO Hong-Yan [1]
Affiliation: [1] School of Computer Science and Technology, Taiyuan University of Science and Technology, Taiyuan 030024, China
Source: Computer Systems & Applications, 2021, No. 6, pp. 262-270 (9 pages)
Funding: Key Project of the Shanxi Provincial Key R&D Program (201703D111027); Shanxi Provincial Key R&D Program projects (201803D121048, 201803D121055).
Abstract: In most Chinese named entity recognition models, language preprocessing focuses only on the vector representations of individual words and characters and ignores the semantic relationships between them, so it cannot resolve polysemy. The Transformer feature extractor improves many natural language understanding tasks through parallel computation and long-distance modeling, but its fully connected structure makes the computational complexity quadratic in the input length, which degrades its performance on Chinese named entity recognition. To address these problems, this paper proposes a Chinese named entity recognition method based on the BSTTC (BERT-Star-Transformer-TextCNN-CRF) model. First, a BERT model pre-trained on a large-scale corpus dynamically generates a character vector sequence according to the input context. Then, a joint Star-Transformer and TextCNN model further extracts sentence features. Finally, the feature vector sequence is fed into a CRF model to obtain the final predictions. Experimental results on the MSRA Chinese corpus show that the model's precision, recall, and F1 score all improve over previous models, and its training time is about 65% shorter than that of the BERT-Transformer-CRF model.
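The final step of the pipeline above feeds the feature sequence into a CRF, whose inference is standardly done with Viterbi decoding. Below is a minimal pure-Python sketch of linear-chain Viterbi decoding; the BIO tag set, the toy emission/transition scores, and the function name `viterbi_decode` are illustrative assumptions, not details from the paper.

```python
def viterbi_decode(emissions, transitions):
    """Find the highest-scoring tag sequence for a linear-chain CRF.

    emissions:   T x K list, emissions[t][k] = score of tag k at position t
    transitions: K x K list, transitions[i][j] = score of moving tag i -> j
    Returns the best tag-index sequence of length T.
    """
    T, K = len(emissions), len(emissions[0])
    score = list(emissions[0])          # best score of any path ending in each tag
    backpointers = []
    for t in range(1, T):
        new_score, ptrs = [0.0] * K, [0] * K
        for j in range(K):
            # best previous tag i to transition into tag j
            best_i = max(range(K), key=lambda i: score[i] + transitions[i][j])
            ptrs[j] = best_i
            new_score[j] = score[best_i] + transitions[best_i][j] + emissions[t][j]
        score = new_score
        backpointers.append(ptrs)
    # backtrack from the best final tag
    best = max(range(K), key=lambda j: score[j])
    path = [best]
    for ptrs in reversed(backpointers):
        best = ptrs[best]
        path.append(best)
    return path[::-1]


# Toy example with tags O=0, B=1, I=2; the O -> I transition is heavily penalized,
# which is the kind of labeling constraint a CRF layer learns for NER.
TRANS = [[0.0, 0.0, -10.0],
         [0.0, 0.0, 0.0],
         [0.0, 0.0, 0.0]]
EMIT = [[0.2, 1.0, 0.9],   # position 0: B slightly beats I
        [0.1, 0.2, 1.0],   # position 1: I
        [1.0, 0.0, 0.0]]   # position 2: O
print(viterbi_decode(EMIT, TRANS))  # -> [1, 2, 0], i.e. B I O
```

Even though I scores highest at position 0 in isolation, the transition scores steer the decoder to the consistent sequence B-I-O, which is why a CRF output layer outperforms independent per-position classification.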
Keywords: BERT; Star-Transformer; named entity recognition; TextCNN; conditional random field
CLC number: TP391.1 (Automation and Computer Technology: Computer Application Technology)