Authors: YU Shan-shan (余珊珊) [1]; SU Jin-dian (苏锦钿) [2]; LI Peng-fei (李鹏飞) [2]
Affiliations: [1] College of Medical Information Engineering, Guangdong Pharmaceutical University, Guangzhou 510006, China; [2] College of Computer Science and Engineering, South China University of Technology, Guangzhou 510640, China
Source: Computer Science (计算机科学), 2020, No. 4, pp. 204-210 (7 pages)
Funding: Natural Science Foundation of Guangdong Province (2015A030310318); Special Fund for Applied Science and Technology Research and Development of the Guangdong Provincial Department of Science and Technology (20168010124010); Medical Science and Technology Research Foundation of Guangdong Province (A2015065).
Abstract: Although attention mechanisms have been widely used in many natural language processing tasks in recent years, there is still a lack of related work on their application to sentence-level sentiment classification. Taking advantage of self-attention in learning important local features of sentences, this paper combines it with a long short-term memory network (LSTM) and proposes an attention-based neural network model, named AttLSTM, which is applied to sentence-level sentiment classification. AttLSTM first uses an LSTM to learn the contextual information of the words in a sentence; it then uses a self-attention function to learn the position information of the words and builds the corresponding position weight matrix; next, the final semantic representation of the sentence is obtained by weighted averaging; finally, a multi-layer perceptron performs classification and produces the output. Experimental results show that AttLSTM achieves the highest accuracy on the public binary sentiment classification corpora Movie Reviews (MR), Stanford Sentiment Treebank (SSTb2) and Internet Movie Database (IMDB), at 82.8%, 88.3% and 91.3% respectively, and reaches 50.6% accuracy on the multi-class sentiment classification corpus SSTb5.
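The pipeline described in the abstract (LSTM context encoding, self-attention position weights, weighted averaging, MLP classifier) can be illustrated with a minimal PyTorch-style sketch. The layer sizes, scoring function, and all class and variable names below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the AttLSTM pipeline from the abstract;
# dimensions and the attention scoring layer are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Step 1: LSTM learns contextual information for each word position.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Step 2: self-attention scoring layer producing one weight per position.
        self.att_score = nn.Linear(hidden_dim, 1)
        # Step 4: multi-layer perceptron for classification.
        self.mlp = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, token_ids):                      # (batch, seq_len)
        h, _ = self.lstm(self.embed(token_ids))        # (batch, seq_len, hidden_dim)
        # Position weights over the sequence (softmax-normalized).
        weights = F.softmax(self.att_score(h).squeeze(-1), dim=1)   # (batch, seq_len)
        # Step 3: weighted average of hidden states -> sentence representation.
        sent_repr = torch.bmm(weights.unsqueeze(1), h).squeeze(1)   # (batch, hidden_dim)
        return self.mlp(sent_repr)                     # class logits

# Example usage:
# logits = AttLSTM(vocab_size=20000)(torch.randint(0, 20000, (4, 30)))
```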
Keywords: deep learning; sentiment classification; self-attention; long short-term memory network; natural language processing
Classification Code: TP183 [Automation and Computer Technology / Control Theory and Control Engineering]