Sentence-level text sentiment classification based on a convolutional recurrent deep learning model  (Cited by: 3)

Convolutional recurrent deep learning model for sentence sentiment classification


Authors: XIANG Jin-yong; LIU Xiao-long; DING Ming-yang; LI Huan; CAO Wen-ting (State Grid Huocheng County Power Supply Company, Huocheng 835200, China; State Grid Yili Power Supply Company, Yili 835000, China)

Affiliations: [1] State Grid Huocheng County Power Supply Company, Huocheng 835200, Xinjiang, China; [2] State Grid Yili Power Supply Company, Yili 835000, Xinjiang, China

Source: Journal of Northeast Normal University (Natural Science Edition), 2020, No. 2, pp. 73-79

Funding: National "973" Program project (2014CB340500).

Abstract: To address the locality of convolutional and pooling layers, this paper describes a joint CNN and RNN architecture. An unsupervised neural language model is first used to train initial word embeddings, which are further tuned by the deep network; the pre-trained parameters of the network are then used to initialize the model. The framework passes the input through a convolutional layer to learn a set of feature maps, and learns long-term dependencies via long short-term memory (LSTM). With slight hyperparameter tuning and pre-trained word vectors, the approach achieves outstanding results on sentence-level sentiment classification benchmarks. Replacing the pooling layer with a recurrent layer reduces the number of parameters and the complexity of the convolutional neural network. The results show that the method reduces the loss of detailed local information while capturing long-term dependencies in an efficient framework with fewer parameters and a high level of performance.
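The architecture outlined in the abstract can be sketched as follows. This is a minimal illustrative PyTorch model, not the authors' implementation: all layer sizes, the kernel width, and the class name are assumptions, and the embedding layer stands in for weights that the paper would initialize from an unsupervised neural language model. The key structural point from the abstract is reproduced: an LSTM layer follows the convolution in place of a pooling layer.

```python
# Hypothetical sketch of the CNN + recurrent-layer architecture described in
# the abstract: a 1-D convolution extracts local n-gram feature maps from word
# embeddings, and an LSTM replaces pooling to capture long-term dependencies.
# All hyperparameters below are illustrative, not the paper's settings.
import torch
import torch.nn as nn

class ConvRecClassifier(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=100, num_filters=64,
                 kernel_size=3, hidden_dim=128, num_classes=2):
        super().__init__()
        # In the paper, embeddings come from an unsupervised language model;
        # here they are randomly initialized for simplicity.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size, padding=1)
        # Recurrent layer used in place of a pooling layer.
        self.lstm = nn.LSTM(num_filters, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens):                 # tokens: (batch, seq_len)
        x = self.embed(tokens)                 # (batch, seq_len, embed_dim)
        x = x.transpose(1, 2)                  # (batch, embed_dim, seq_len)
        x = torch.relu(self.conv(x))           # (batch, num_filters, seq_len)
        x = x.transpose(1, 2)                  # (batch, seq_len, num_filters)
        _, (h_n, _) = self.lstm(x)             # h_n: (1, batch, hidden_dim)
        return self.fc(h_n.squeeze(0))         # (batch, num_classes)

model = ConvRecClassifier()
# A batch of 4 sentences, each 20 token ids long.
logits = model(torch.randint(0, 10000, (4, 20)))
print(logits.shape)  # torch.Size([4, 2])
```

Because the final LSTM hidden state summarizes the whole sequence of convolutional feature maps, no max- or average-pooling step is needed, which is how the design keeps local detail while still producing a fixed-size sentence representation.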

Keywords: convolutional neural network; recurrent neural network; natural language processing; deep learning; text sentiment classification; long-term dependencies

Classification code: TP391.1 [Automation and Computer Technology - Computer Application Technology]

 
