
Specific Target Viewpoint Extraction Using Context Focused Mechanism


Authors: CHEN Yupeng; CHEN Jiawei; HUANG Rong; HAN Fang [1,2] (College of Information Science and Technology, Donghua University, Shanghai 201620, China; Engineering Research Center of Digitalized Textile & Fashion Technology, Donghua University, Shanghai 201620, China)

Affiliations: [1] College of Information Science and Technology, Donghua University, Shanghai 201620, China; [2] Engineering Research Center of Digitalized Textile & Fashion Technology of the Ministry of Education, Donghua University, Shanghai 201620, China

Published in: Computer Engineering and Applications, 2022, No. 14, pp. 160-166 (7 pages)

Funding: National Natural Science Foundation of China (11972115, 11572084).

摘  要:针对现有的目标和观点抽取模型未能充分考虑两者的联系的问题,提出一种基于上下文专注机制的特定目标观点抽取模型。将抽取出的目标特征向量与每个位置的上下文词向量拼接构成最终的句子表示,加强目标与句子之间的交互,实现目标融合;采用上下文专注机制把注意力更多地放在目标词的周围,削弱远距离词的语义特征。提出的模型采用双向长短时记忆(bi-directionallongshort-termmemory,BiLSTM)网络将句子编码,并提取特征。与现有模型相比,所提模型的精确率、召回率和F1值都有一定程度的提升,证明了所提算法的有效性。同时,预训练的BERT模型也被应用到当前任务中,使模型效果获得了进一步的提升。Aiming at the problem that the existing target and viewpoint extraction models fail to fully consider the connec-tion between the two,a specific target viewpoint extraction model based on contextual focus mechanism is proposed.First,it splices the extracted target feature vector with the context word vector at each location to form the final sentence representation,strengthens the interaction between the target and the sentence,and achieves target fusion;secondly,it uses the context focus mechanism to focus more attention around the target word,weakens the semantic features of distant words.The proposed model uses BiLSTM(bi-directional long short-term memory)network to encode sentences and extract features.Compared with existing models,the accuracy,recall and F1 value of the proposed model have been improved to a certain extent,which proves the effectiveness of the proposed algorithm.At the same time,the pre trained BERT model is applied in the current task to further improve the effect of the model.

Keywords: target fusion; context-focused mechanism; bi-directional long short-term memory (BiLSTM) network; BERT model

Classification: TP391.1 (Automation and Computer Technology: Computer Application Technology)

 
