Relation Extraction Based on Optimized Information Fusion Strategy (cited by: 1)

Authors: ZHOU Yu-kun; CHEN Yu; ZHAO Rong-mei; JU Sheng-gen

Affiliations: [1] College of Computer Science, Sichuan University, Chengdu 610065, China; [2] College of Science and Technology, Sichuan Minzu College, Kangding 626001, China

Source: Journal of Chinese Computer Systems (《小型微型计算机系统》), 2022, No. 11, pp. 2241-2250 (10 pages)

Funding: Supported by the Key Program of the National Natural Science Foundation of China (62137001).

Abstract: Existing relation extraction methods extract global and local features and concatenate them as relation representations for classification. However, simple concatenation treats all features as equally important, ignoring their different degrees of contribution to relation extraction and limiting the model's effectiveness. In fact, in complex contexts the importance of different pieces of information varies widely. To address this problem, a relation extraction method based on an optimized information fusion strategy is proposed. First, the sentence vector and entity representations are obtained through BERT, and the sentence representation is integrated into each of the two entity representations to obtain two composite features. Then, an adaptive information learning strategy fuses the two features into a relation representation for classification. The method fuses the global and local information of the input sequence and automatically focuses on the parts that contribute more. Experiments on the TACRED, TACREV, and SemEval-2010 Task 8 datasets show that the method's F1 score outperforms the current state-of-the-art models.
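The fusion pipeline sketched in the abstract (sentence vector injected into each entity representation, then an adaptive, learned combination instead of plain concatenation) can be illustrated roughly as follows. This is a minimal sketch under stated assumptions, not the authors' implementation: all weight matrices, shapes, and the gating form are illustrative guesses at what an "adaptive information learning strategy" might look like.

```python
import numpy as np

# Hypothetical sketch of the paper's idea: two composite features
# (sentence vector fused with each entity representation) are combined
# by a learned per-dimension gate rather than simple concatenation.

rng = np.random.default_rng(0)
d = 8  # hidden size (a real BERT encoder would use 768)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# h_s: sentence ([CLS]) vector; h_e1, h_e2: entity span representations
h_s, h_e1, h_e2 = rng.standard_normal((3, d))

# Composite features: sentence information injected into each entity
W1, W2 = rng.standard_normal((2, d, 2 * d)) * 0.1
f1 = np.tanh(W1 @ np.concatenate([h_s, h_e1]))
f2 = np.tanh(W2 @ np.concatenate([h_s, h_e2]))

# Adaptive fusion: a gate decides, per dimension, how much each
# composite feature contributes to the final relation representation
W_g = rng.standard_normal((d, 2 * d)) * 0.1
g = sigmoid(W_g @ np.concatenate([f1, f2]))
relation_repr = g * f1 + (1.0 - g) * f2  # fed to a softmax classifier
```

The key contrast with concatenation is that the gate `g` lets the model weight each feature's contribution dimension by dimension, which matches the abstract's claim of "automatically focusing on the parts that contribute more".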

Keywords: relation extraction; information fusion; BERT; deep learning

Classification code: TP391 (Automation and Computer Technology / Computer Application Technology)

 
