Relation Extraction Model Incorporating Entity Attention and Semantic Information


Author: LIU Yunteng (School of Internet of Things Engineering, Jiangnan University, Wuxi 214000)

Affiliation: [1] School of Internet of Things Engineering, Jiangnan University, Wuxi 214000

Source: Computer & Digital Engineering, 2024, No. 2, pp. 487-491, 520 (6 pages)

Abstract: A knowledge graph builds a mapping between the real world and the data world through a semantic network, supporting concrete applications across many industries; entity relation extraction is a core step in knowledge graph construction. To address the low utilization of entity-related features, insufficient text feature extraction, and the inability of some pre-trained models to capture sequence features well in relation extraction tasks, this paper proposes a new model built on a pre-trained BERT model: downstream, it exploits the long short-term memory network's (LSTM) ability to handle long-range dependencies, and combines this with an entity-position self-aware attention mechanism. The model is evaluated on two public datasets; experimental results show that it reaches F1 scores of 67.1% on the TACRED dataset and 87.8% on the SemEval-2010 Task 8 dataset, outperforming several previous models.
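The abstract describes a three-stage pipeline: a contextual encoder, a BiLSTM over its outputs, and an attention layer that scores each token against its distance to the two entities before classification. The sketch below illustrates that wiring in PyTorch under stated assumptions: a random token embedding stands in for BERT, all dimensions and the attention form are illustrative, and the paper's exact architecture and hyperparameters are not reproduced.

```python
import torch
import torch.nn as nn

class EntityAwareRE(nn.Module):
    """Sketch of the described pipeline: encoder -> BiLSTM ->
    entity-position self-aware attention -> relation classifier.
    A plain embedding layer stands in for BERT; sizes are illustrative."""

    def __init__(self, vocab=1000, dim=64, max_len=50, n_rel=10):
        super().__init__()
        self.tok = nn.Embedding(vocab, dim)        # placeholder for a BERT encoder
        self.pos = nn.Embedding(2 * max_len, dim)  # relative distance to head/tail entity
        self.lstm = nn.LSTM(dim, dim // 2, batch_first=True, bidirectional=True)
        self.att = nn.Linear(2 * dim, 1)           # scores token state + position features
        self.cls = nn.Linear(dim, n_rel)
        self.max_len = max_len

    def forward(self, ids, head_dist, tail_dist):
        h, _ = self.lstm(self.tok(ids))            # (B, T, dim) contextual states
        # Entity-position features: sum of embeddings of the signed distances
        # from each token to the head and tail entities (shifted to be >= 0).
        p = self.pos(head_dist + self.max_len) + self.pos(tail_dist + self.max_len)
        score = self.att(torch.cat([h, p], dim=-1)).squeeze(-1)  # (B, T)
        a = torch.softmax(score, dim=-1)           # attention weights over tokens
        sent = (a.unsqueeze(-1) * h).sum(dim=1)    # entity-aware sentence vector
        return self.cls(sent)                      # (B, n_rel) relation logits

# Toy forward pass: batch of 2 sentences, 12 tokens each.
ids = torch.randint(0, 1000, (2, 12))
dist = torch.randint(-11, 12, (2, 12))
logits = EntityAwareRE()(ids, dist, dist)
print(logits.shape)  # torch.Size([2, 10])
```

The key design point is that attention weights depend on both the BiLSTM state and the token's position relative to the entity pair, so tokens near the entities can receive higher weight when forming the sentence representation.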

Keywords: pre-trained model; semantic relation extraction; attention mechanism; long short-term memory network; natural language processing

CLC Number: TP389.1 [Automation and Computer Technology - Computer System Architecture]

 
