Authors: WANG Haoyu; LI Yunhua (Big Data and Network Management Center, Jilin University, Changchun 130012, China)
Affiliation: [1] Big Data and Network Management Center, Jilin University, Changchun 130012, China
Source: Journal of Jilin University (Information Science Edition), 2023, No. 6, pp. 1128-1134 (7 pages)
Abstract: To address the gradient vanishing or exploding problem that makes RNN (Recurrent Neural Network)-based sequential recommenders unstable to train on long sequences, DeepGRU, a sequential recommendation model based on a deep residual recurrent neural network, is proposed. Building on the conventional GRU (Gated Recurrent Unit), it introduces residual connections, layer normalization, and a feed-forward neural network. The model is evaluated on three public datasets, and the experimental results show that DeepGRU outperforms several state-of-the-art sequential recommenders on all compared metrics, improving recommendation accuracy by 8.68% on average. An ablation study verifies the effectiveness of the introduced residual connection, layer normalization, and feed-forward modules within the DeepGRU framework. It is also empirically demonstrated that DeepGRU effectively alleviates unstable training when dealing with long sequences.
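The architecture described in the abstract (a GRU sub-layer wrapped with residual connections, layer normalization, and a position-wise feed-forward network) can be sketched as follows. This is a minimal illustrative sketch, not the authors' published implementation: the dimension names (`d`, `d_ff`), the exact placement of normalization, and the parameter initialization are all assumptions.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize over the feature dimension (last axis).
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def gru_step(x_t, h, Wz, Uz, Wr, Ur, Wh, Uh):
    # Standard GRU cell equations (update gate, reset gate, candidate state).
    z = 1.0 / (1.0 + np.exp(-(x_t @ Wz + h @ Uz)))   # update gate
    r = 1.0 / (1.0 + np.exp(-(x_t @ Wr + h @ Ur)))   # reset gate
    h_tilde = np.tanh(x_t @ Wh + (r * h) @ Uh)       # candidate state
    return (1.0 - z) * h + z * h_tilde

def deepgru_block(x, params):
    # x: (seq_len, d) sequence of item embeddings.
    seq_len, d = x.shape
    h = np.zeros(d)
    states = []
    for t in range(seq_len):
        h = gru_step(x[t], h, *params["gru"])
        states.append(h)
    g = np.stack(states)
    # Residual connection + layer normalization around the GRU sub-layer.
    y = layer_norm(x + g)
    # Position-wise feed-forward sub-layer, again with residual + norm.
    W1, W2 = params["ffn"]
    f = np.maximum(0.0, y @ W1) @ W2                 # ReLU feed-forward
    return layer_norm(y + f)

# Toy usage with random weights (hypothetical sizes).
rng = np.random.default_rng(0)
d, d_ff, seq_len = 8, 16, 5
params = {
    "gru": [rng.normal(scale=0.1, size=(d, d)) for _ in range(6)],
    "ffn": (rng.normal(scale=0.1, size=(d, d_ff)),
            rng.normal(scale=0.1, size=(d_ff, d))),
}
out = deepgru_block(rng.normal(size=(seq_len, d)), params)
print(out.shape)  # (5, 8)
```

The residual paths give gradients a direct route past the recurrent sub-layer, which is the mechanism the paper credits for stabilizing training on long sequences.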
Classification code: TP3 [Automation and Computer Technology: Computer Science and Technology]