Sentiment analysis of product reviews based on contrastive divergence-restricted Boltzmann machine deep learning  (Cited by: 12)


Authors: 高琰[1], 陈白帆[1], 晁绪耀, 毛芳[2]

Affiliations: [1] School of Information Science and Engineering, Central South University, Changsha 410083, China; [2] School of Software, Central South University, Changsha 410083, China

Source: Journal of Computer Applications, 2016, Issue 4, pp. 1045-1049 (5 pages)

Funding: National Natural Science Foundation of China (61403423); Hunan Provincial Science and Technology Plan Project (2014FJ3157); Open Fund of the Key Laboratory of Autonomous Systems and Networked Control, Ministry of Education (2013A11)

Abstract: Focusing on the issue that most existing sentiment analysis approaches need a manually annotated sentiment lexicon to extract sentiment features, a sentiment analysis method for product reviews based on Contrastive Divergence-Restricted Boltzmann Machine (CD-RBM) deep learning was proposed. First, product reviews were preprocessed and represented as vectors using the bag-of-words model. Second, CD-RBM was used to extract sentiment features from the product review vectors. Finally, the sentiment features were classified with a Support Vector Machine (SVM) to produce the sentiment analysis result. Without any manually pre-defined sentiment lexicon, CD-RBM automatically obtains sentiment features with higher semantic relevance; combined with SVM, the accuracy of product review sentiment classification is ensured. The optimal RBM training period was experimentally determined to be 10; under this training period, the RBM, SVM, PCA+SVM, and RBM+SVM methods were compared. The experimental results show that the combination of RBM feature extraction and SVM classification achieves the best precision and F-measure, as well as good recall.
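The pipeline described in the abstract (bag-of-words vectors, RBM features learned with one-step contrastive divergence, then an SVM classifier) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the toy binary vectors, labels, layer sizes, and learning rate are all assumed for demonstration, and only the 10-epoch training period comes from the paper.

```python
# Sketch: bag-of-words -> CD-1 RBM features -> SVM (illustrative assumptions throughout).
import numpy as np
from sklearn.svm import SVC

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli RBM trained with one-step contrastive divergence (CD-1)."""

    def __init__(self, n_visible, n_hidden, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)  # visible biases
        self.b_h = np.zeros(n_hidden)   # hidden biases
        self.lr = lr
        self.rng = rng

    def _h_given_v(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def _v_given_h(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def fit(self, X, epochs=10):
        # The paper reports 10 epochs as the optimal training period.
        for _ in range(epochs):
            for v0 in X:
                ph0 = self._h_given_v(v0)
                h0 = (self.rng.random(ph0.shape) < ph0).astype(float)
                pv1 = self._v_given_h(h0)        # one Gibbs step (the "1" in CD-1)
                ph1 = self._h_given_v(pv1)
                # CD-1 update: <v h>_data - <v h>_reconstruction
                self.W += self.lr * (np.outer(v0, ph0) - np.outer(pv1, ph1))
                self.b_v += self.lr * (v0 - pv1)
                self.b_h += self.lr * (ph0 - ph1)
        return self

    def transform(self, X):
        # Hidden-unit activations serve as the learned sentiment features.
        return self._h_given_v(X)

# Toy usage: random binary bag-of-words vectors with synthetic labels.
rng = np.random.default_rng(1)
X = (rng.random((40, 30)) < 0.3).astype(float)
y = (X.sum(axis=1) > X.sum(axis=1).mean()).astype(int)

rbm = RBM(n_visible=30, n_hidden=10).fit(X, epochs=10)
features = rbm.transform(X)
clf = SVC(kernel="rbf").fit(features, y)
```

In the paper's setting the visible layer would have one unit per vocabulary word and real review labels would replace the synthetic `y`; the RBM+SVM split above mirrors the compared "RBM+SVM" configuration.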

Keywords: deep learning; restricted Boltzmann machine; sentiment analysis; contrastive divergence; support vector machine

CLC number: TP391.4 [Automation and Computer Technology / Computer Application Technology]
