Authors: DAI Yang (代杨); LI Yongjie (李永杰)[1] (College of Electronic Engineering, Naval University of Engineering, Wuhan 430000)
Source: Ship Electronic Engineering (《舰船电子工程》), 2023, No. 7, pp. 101-104, 214 (5 pages)
Abstract: To analyze the sentiment tendency of web comment text, and to address the shortcomings of traditional sentiment analysis methods, which cannot fully extract text features, pre-train text accurately, exploit contextual information in word vectors, or resolve polysemy, this paper proposes a BERT-CNN model that fuses the BERT pre-trained model with a convolutional neural network. The model first uses BERT pre-training to extract sentiment feature representations of web text; the convolutional layer then applies kernels of different sizes to extract token features of various lengths; finally, pooling and mapping produce the classification output. Experimental results show that the sentiment analysis model based on the fusion of BERT and the convolutional neural network achieves an F1 score of 87.13%, significantly outperforming other traditional models. Compared with traditional models, its Accuracy improves by up to 10.18%; compared with the BERT model, its Accuracy, Precision, Recall, and F1 score also all improve.
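The pipeline sketched in the abstract (BERT contextual embeddings, parallel convolutions with several kernel sizes, max pooling over the sequence, and a linear classifier) can be illustrated with the following minimal PyTorch sketch. The checkpoint name "bert-base-chinese", the kernel sizes, filter count, dropout rate, and class count are illustrative assumptions, not values reported in the paper.

```python
# Minimal sketch of a BERT-CNN sentiment classifier, assuming PyTorch and the
# Hugging Face `transformers` library. Hyperparameters below are assumptions
# chosen for illustration only.
import torch
import torch.nn as nn
from transformers import BertModel

class BertCNN(nn.Module):
    def __init__(self, num_classes=2, kernel_sizes=(2, 3, 4), num_filters=128):
        super().__init__()
        # BERT provides contextual token embeddings (addresses polysemy and
        # context use that static word vectors lack).
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        hidden = self.bert.config.hidden_size  # 768 for the base model
        # Parallel 1-D convolutions with different kernel sizes capture
        # token features of different lengths.
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, num_filters, k) for k in kernel_sizes
        )
        self.dropout = nn.Dropout(0.1)
        # Pooled features from all kernel sizes are concatenated and mapped
        # to the sentiment classes.
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, input_ids, attention_mask):
        # Contextual embeddings: (batch, seq_len, hidden)
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        x = out.last_hidden_state.transpose(1, 2)  # (batch, hidden, seq_len)
        # Convolve, apply ReLU, then max-pool over the sequence dimension.
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        features = self.dropout(torch.cat(pooled, dim=1))
        return self.fc(features)  # logits over sentiment classes
```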
Classification: TP391 [Automation and Computer Technology - Computer Application Technology]