Text semantic relation extraction of LSTM based on attention mechanism (cited by: 65)


Authors: Wang Hong, Shi Jinchuan, Zhang Zhiwei

Affiliation: School of Computer Science & Technology, Civil Aviation University of China, Tianjin 300300, China

Published in: Application Research of Computers (《计算机应用研究》), 2018, No. 5, pp. 1417-1420, 1440 (5 pages)

Funding: Supported by the National Natural Science Foundation of China (U1633110; U1533104; U1233113)

Abstract: Among existing relation extraction methods, traditional deep learning models suffer from the long-distance dependence problem and do not consider the correlation between the model's input and output. To address these problems, this paper proposes a relation extraction method that combines an LSTM (long short-term memory) model with an attention mechanism. First, the text is vectorized and local features are extracted; the local features are then fed into a bidirectional LSTM model, and the attention mechanism computes importance weights for the correlation between the LSTM model's input and output, from which the global feature of the text is obtained. Finally, the local and global features are fused, and a classifier outputs the classification result. Experiments on the SemEval-2010 Task 8 corpus show that the accuracy and stability of the method improve on those of traditional deep learning methods, providing methodological support for automatic question answering, information retrieval, and ontology learning.
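The pipeline described in the abstract (BiLSTM hidden states pooled by attention into a global feature, which is then fused with a local feature) can be sketched in plain Python. The dot-product scoring function, the query vector, and the toy dimensions below are illustrative assumptions, not the paper's exact formulation:

```python
import math

def softmax(scores):
    # numerically stable softmax: turns raw attention scores into weights
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, query):
    # score each BiLSTM hidden state against a query vector (dot product,
    # an assumed scoring choice), then return the attention-weighted sum
    # of the hidden states as the "global feature"
    scores = [sum(h_i * q_i for h_i, q_i in zip(h, query))
              for h in hidden_states]
    weights = softmax(scores)
    dim = len(hidden_states[0])
    return [sum(w * h[d] for w, h in zip(weights, hidden_states))
            for d in range(dim)]

# toy example: three 2-dimensional hidden states from a (hypothetical) BiLSTM
H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
q = [1.0, 1.0]
global_feature = attention_pool(H, q)
local_feature = [0.5, 0.5]              # e.g. from a convolutional layer
fused = local_feature + global_feature  # feature fusion by concatenation
```

The fused vector would then be passed to a classifier (e.g. a softmax layer) to produce the relation label; concatenation is one common fusion choice, assumed here for simplicity.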

Keywords: text information; semantic relation; relation extraction; LSTM; attention mechanism

CLC number: TP391 [Automation & Computer Technology — Computer Application Technology]
