Authors: SUN Min; LI Yang; ZHUANG Zhengfei; YU Dawei
Affiliation: School of Information and Computer, Anhui Agriculture University, Hefei, Anhui 230036, China
Source: Journal of Computer Applications, 2020, Issue 9, pp. 2543-2548 (6 pages)
Fund: National Natural Science Foundation of China (61402013).
Abstract: The traditional Convolutional Neural Network (CNN) ignores the contextual and semantic information of words and loses a large amount of feature information during max pooling; the traditional Recurrent Neural Network (RNN) suffers from information memory loss and vanishing gradients; and both CNN and RNN ignore how important individual words are to the meaning of a sentence. To address these problems, a model combining a parallel hybrid network with an attention mechanism was proposed. First, the text was vectorized with GloVe. Then, after the embedding layer, a CNN and a bidirectional gated recurrent unit (BiGRU) network were used in parallel to extract text features with different characteristics. Next, the features extracted by the two networks were fused, and an attention mechanism was applied to the fused features to judge the importance of different words to the meaning of the sentence. Multiple comparative experiments were performed on the IMDB English corpus. The results show that the proposed model reaches an accuracy of 91.46% and an F1-measure of 91.36% in text classification.
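The parallel CNN/BiGRU fusion with attention described in the abstract can be sketched roughly as follows. This is a minimal illustrative Keras implementation based only on the abstract, not the authors' code; the vocabulary size, sequence length, embedding dimension, filter count and GRU width are all assumptions.

import tensorflow as tf
from tensorflow.keras import layers, Model

MAX_LEN = 200        # assumed padded review length
VOCAB_SIZE = 20000   # assumed vocabulary size
EMB_DIM = 100        # GloVe dimension is an assumption; the abstract only says GloVe vectors

inputs = layers.Input(shape=(MAX_LEN,), dtype="int32")
# Embedding layer; in practice its weights would be initialized from pretrained GloVe vectors.
emb = layers.Embedding(VOCAB_SIZE, EMB_DIM)(inputs)

# Branch 1: CNN captures local n-gram features (no max pooling, so the time dimension is kept).
cnn = layers.Conv1D(128, kernel_size=3, padding="same", activation="relu")(emb)

# Branch 2: bidirectional GRU captures contextual information in both directions.
gru = layers.Bidirectional(layers.GRU(64, return_sequences=True))(emb)

# Feature fusion: concatenate the two feature sequences along the feature axis.
fused = layers.Concatenate(axis=-1)([cnn, gru])

# Simple additive attention over time steps to weight how much each word contributes.
scores = layers.Dense(1, activation="tanh")(fused)        # shape (batch, MAX_LEN, 1)
weights = layers.Softmax(axis=1)(scores)                   # attention weight per word
context = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([fused, weights])

# Binary sentiment output for the IMDB corpus.
outputs = layers.Dense(1, activation="sigmoid")(context)
model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])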
Keywords: convolutional neural network; bidirectional gated recurrent unit; feature fusion; attention mechanism; text sentiment analysis
Classification: TP391 [Automation and Computer Technology - Computer Application Technology]