Authors: XIANG Jin-yong; LIU Xiao-long; DING Ming-yang; LI Huan; CAO Wen-ting
Affiliations: [1] State Grid Huocheng County Power Supply Company, Huocheng 835200, Xinjiang, China; [2] State Grid Yili Power Supply Company, Yili 835000, Xinjiang, China
Source: Journal of Northeast Normal University (Natural Science Edition), 2020, No. 2, pp. 73-79.
Funding: National "973" Program project (2014CB340500).
Abstract: To address the locality of convolutional and pooling layers, this paper proposes a joint CNN and RNN architecture. An unsupervised neural language model is used to train initial word embeddings, which are further tuned by the deep learning network; the pre-trained parameters of the network then initialize the model. The framework passes the input through a convolutional layer to produce feature maps, and learns long-term dependencies via a long short-term memory (LSTM) layer. A recurrent layer is used in place of the pooling layer, which reduces the number of parameters and the complexity of the convolutional neural network. With only slight hyperparameter tuning and pre-trained word vectors, the approach achieves outstanding results on sentence-level text sentiment classification benchmarks. The results show that the method reduces the loss of detailed local information while capturing long-term dependencies, yielding an efficient framework with fewer parameters and a high level of performance.
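The core architectural idea in the abstract (a convolutional layer whose feature maps are consumed by a recurrent layer instead of a pooling layer) can be sketched with a forward pass in NumPy. This is a minimal illustration, not the paper's implementation: all dimensions are hypothetical, weights are random (untrained), and a vanilla RNN cell stands in for the paper's LSTM to keep the sketch short.

```python
import numpy as np

# Hypothetical dimensions (illustrative only, not from the paper)
VOCAB, EMB, FILTERS, WIDTH, HIDDEN, CLASSES = 100, 8, 6, 3, 10, 2
rng = np.random.default_rng(0)

E = rng.normal(scale=0.1, size=(VOCAB, EMB))            # word embedding table
W_conv = rng.normal(scale=0.1, size=(FILTERS, WIDTH, EMB))  # conv filters
# Vanilla RNN weights: a lightweight stand-in for the paper's LSTM
W_xh = rng.normal(scale=0.1, size=(FILTERS, HIDDEN))
W_hh = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))
W_out = rng.normal(scale=0.1, size=(HIDDEN, CLASSES))

def forward(token_ids):
    x = E[token_ids]                                    # (T, EMB)
    T = len(token_ids)
    # 1-D convolution over time: one feature vector per window,
    # with no pooling applied afterwards
    feats = np.array([
        np.tanh(np.einsum('fwe,we->f', W_conv, x[t:t + WIDTH]))
        for t in range(T - WIDTH + 1)
    ])                                                  # (T-WIDTH+1, FILTERS)
    # The recurrent layer replaces pooling: it reads the whole
    # sequence of feature maps, so local detail is not discarded
    h = np.zeros(HIDDEN)
    for f in feats:
        h = np.tanh(f @ W_xh + h @ W_hh)
    return h @ W_out                                    # sentiment class scores

logits = forward([3, 14, 15, 92, 65, 35])
print(logits.shape)  # (2,)
```

Because the recurrent state summarizes every convolutional window in order, the model keeps positional/local information that max- or average-pooling would collapse, while adding no pooling parameters of its own.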
Keywords: convolutional neural network; recurrent neural network; natural language processing; deep learning; text sentiment classification; long-term dependency
Classification: TP391.1 [Automation and Computer Technology / Computer Application Technology]