Author: 刘云腾 (LIU Yunteng), School of Internet of Things Engineering, Jiangnan University, Wuxi 214000
Source: Computer & Digital Engineering (《计算机与数字工程》), 2024, No. 2, pp. 487-491, 520 (6 pages)
Abstract: A knowledge graph maps the real world onto the data world through a semantic network and underpins concrete applications across many industries; entity relation extraction is the core step in knowledge graph construction. Addressing three problems in relation extraction tasks, namely low utilization of entity-related features, insufficient text-feature extraction, and the weakness of some pre-trained models at capturing sequence features, this paper proposes a new model built on a pre-trained BERT encoder: downstream, a long short-term memory network (LSTM) exploits its ability to handle long-range dependencies, combined with an entity-position self-aware attention mechanism. The model is tested on two public datasets; the experimental results show it achieves F1 scores of 67.1% on the TACRED dataset and 87.8% on the SemEval 2020 Task 8 dataset, outperforming several previous models.
Keywords: pre-trained model; semantic relation extraction; attention mechanism; long short-term memory network; natural language processing
Classification: TP389.1 [Automation & Computer Technology — Computer System Architecture]
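The entity-position self-aware attention described in the abstract can be illustrated with a minimal numpy sketch. This is an assumption-laden toy, not the paper's implementation: the function name, the additive tanh scoring, and the clipped relative-distance embedding table are all illustrative choices standing in for details the abstract does not give. The idea shown is only that each token's attention score mixes its hidden state (e.g. BERT+LSTM output) with an embedding of its distance to the target entity.

```python
import numpy as np

rng = np.random.default_rng(0)
SEQ_LEN, DIM, MAX_DIST = 6, 8, 4  # toy sizes, not the paper's settings

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def entity_position_attention(hidden, entity_pos, pos_emb, w):
    """Attention over tokens, aware of each token's position relative to an entity.

    hidden:  (seq_len, dim) token states, e.g. from a BERT+LSTM stack
    pos_emb: (2*MAX_DIST+1, dim) embeddings of clipped relative distances
    w:       (dim,) scoring vector (hypothetical parameterization)
    """
    # signed distance of each token to the entity, clipped and shifted to an index
    rel = np.clip(np.arange(len(hidden)) - entity_pos, -MAX_DIST, MAX_DIST) + MAX_DIST
    # additive score combining the token state with its relative-position embedding
    scores = np.tanh(hidden + pos_emb[rel]) @ w
    attn = softmax(scores)
    # attention-weighted sentence representation for the downstream classifier
    return attn @ hidden, attn

hidden = rng.standard_normal((SEQ_LEN, DIM))
pos_emb = rng.standard_normal((2 * MAX_DIST + 1, DIM)) * 0.1
w = rng.standard_normal(DIM)

vec, attn = entity_position_attention(hidden, entity_pos=2, pos_emb=pos_emb, w=w)
print(vec.shape, round(float(attn.sum()), 6))
```

In a full model the weights would be learned jointly with the encoder, and a separate distance embedding would typically be kept per entity (head and tail); the sketch collapses that to one entity for brevity.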