Chinese Text Sentiment Analysis Based on Character-Level Joint Network Feature Fusion  (Cited by: 10)

Chinese text sentiment analysis based on joint network and attention model


Authors: WANG Li-ya; LIU Chang-hui[1]; CAI Dun-bo[1]; ZHAO Tong-zhou[1]; WANG Meng (College of Computer Science and Engineering, Wuhan Institute of Technology, Wuhan 430205, China)

Affiliation: [1] College of Computer Science and Engineering, Wuhan Institute of Technology

Source: Microelectronics & Computer (《微电子学与计算机》), 2020, Issue 1, pp. 80-86 (7 pages)

Funding: National Natural Science Foundation of China (61103136); Education Innovation Program of Wuhan Institute of Technology (CX2018196)

Abstract: In a traditional convolutional neural network (CNN), neurons in the same layer cannot pass information to one another, so feature information at the same level is not fully exploited, and long-distance context-dependent features cannot be extracted. To address these problems for Chinese text, this paper proposes a character-level joint network feature-fusion model for sentiment analysis. On top of character-level input, a parallel joint network consisting of a BiGRU branch and a CNN-BiGRU branch is used to extract features: the CNN's strong learning ability is used to extract deep features, which are then passed to a bidirectional gated recurrent unit (BiGRU) for further learning, strengthening the model's ability to learn features; in parallel, a separate BiGRU extracts context-dependent features to enrich the feature information. Finally, an attention mechanism is introduced on one branch to assign feature weights and reduce noise interference. In multiple comparative experiments on the dataset, the method achieves an F1 score of 92.36%, showing that the proposed model can effectively improve the accuracy of text classification.
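The paper itself provides no implementation, but the architecture described in the abstract can be sketched roughly as follows. This is a minimal illustration assuming PyTorch; the class name CharJointNet, all hyperparameters (vocabulary size, embedding dimension, number of filters, GRU hidden size), and the placement of the attention mechanism on the BiGRU branch are hypothetical choices for illustration, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CharJointNet(nn.Module):
    """Sketch of the character-level joint network: a BiGRU branch and a
    CNN-BiGRU branch in parallel, with attention applied on one branch.
    All sizes below are illustrative, not the paper's settings."""

    def __init__(self, vocab_size=5000, emb_dim=128, hidden=64,
                 num_filters=128, kernel_size=3, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)

        # Branch 1: BiGRU over character embeddings (context-dependent features)
        self.bigru1 = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)

        # Branch 2: CNN for deep local features, followed by a BiGRU
        self.conv = nn.Conv1d(emb_dim, num_filters, kernel_size,
                              padding=kernel_size // 2)
        self.bigru2 = nn.GRU(num_filters, hidden, batch_first=True, bidirectional=True)

        # Attention on one branch: score each time step, then take a weighted sum
        self.attn = nn.Linear(2 * hidden, 1)

        # Fused features from both branches feed the classifier
        self.fc = nn.Linear(4 * hidden, num_classes)

    def forward(self, x):                       # x: (batch, seq_len) character ids
        e = self.embed(x)                       # (batch, seq_len, emb_dim)

        # Branch 1: BiGRU + attention-weighted pooling
        h1, _ = self.bigru1(e)                  # (batch, seq_len, 2*hidden)
        scores = F.softmax(self.attn(h1), dim=1)
        v1 = (scores * h1).sum(dim=1)           # (batch, 2*hidden)

        # Branch 2: CNN then BiGRU, keep the final hidden states of both directions
        c = F.relu(self.conv(e.transpose(1, 2))).transpose(1, 2)
        _, h2 = self.bigru2(c)                  # h2: (2, batch, hidden)
        v2 = torch.cat([h2[0], h2[1]], dim=-1)  # (batch, 2*hidden)

        # Feature fusion: concatenate both branch vectors and classify
        return self.fc(torch.cat([v1, v2], dim=-1))
```

A forward pass on a batch of padded character-id sequences, e.g. `CharJointNet()(torch.randint(1, 5000, (8, 100)))`, yields class logits; sentiment polarity follows from an argmax or softmax over the output.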

Keywords: convolutional neural network; BiGRU; attention mechanism; Chinese text sentiment analysis

Classification: TP391 [Automation and Computer Technology: Computer Application Technology]

 
