Authors: LU Yuecong (陆悦聪); WANG Ruiqin (王瑞琴)[1,2]; JIN Nan (金楠)
Affiliations: [1] School of Information Engineering, Huzhou University, Huzhou, Zhejiang 313000, China; [2] Zhejiang Province Key Laboratory of Smart Management & Application of Modern Agricultural Resources, Huzhou, Zhejiang 313000, China
Source: Computer Engineering and Applications (《计算机工程与应用》), 2022, No. 22, pp. 72-78 (7 pages)
Funding: National Social Science Fund of China (20BTQ093).
Abstract: Deep-learning-based recommendation algorithms originally take user and item ID information as input, but IDs alone cannot represent the characteristics of users and items well. In the raw data, users' ratings of items reflect user and item characteristics to some extent, but they do not account for users' rating preferences or item popularity. This paper uses implicit feedback together with ID information as the user and item features in the rating task, which removes the noise that user subjectivity introduces into the features and, to a certain extent, alleviates the cold-start problem. A single-layer neural network reduces the dimensionality of the original high-dimensional sparse features; feature crossing captures the low-order interactions between users and items, and a neural network then captures the high-order interactions, so that both low-order and high-order feature interactions are effectively extracted. Experiments on four public datasets show that the proposed algorithm effectively improves recommendation accuracy.
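The abstract outlines a three-stage pipeline: ID and implicit-feedback features are compressed by a single-layer network, an element-wise feature cross yields the low-order user-item interaction, and a deep network captures the high-order interaction. The PyTorch sketch below is only an illustration of that pipeline under assumptions; the class name, layer sizes, and the way ID and implicit-feedback embeddings are fused are placeholders, since the paper's exact architecture is not given in this record.

```python
import torch
import torch.nn as nn

class HybridInteractionSketch(nn.Module):
    """Illustrative sketch (not the authors' implementation): IDs plus
    implicit feedback -> dense projection -> low-order cross + high-order MLP."""

    def __init__(self, num_users, num_items, emb_dim=64, mlp_dims=(128, 64, 32)):
        super().__init__()
        # ID embeddings: one-hot IDs compressed to dense vectors.
        self.user_id_emb = nn.Embedding(num_users, emb_dim)
        self.item_id_emb = nn.Embedding(num_items, emb_dim)
        # Single linear layer reducing the sparse implicit-feedback vectors
        # (which items a user interacted with / which users interacted with
        # an item) to the same dense dimension.
        self.user_fb_proj = nn.Linear(num_items, emb_dim, bias=False)
        self.item_fb_proj = nn.Linear(num_users, emb_dim, bias=False)
        # MLP over the concatenated dense features for high-order interactions.
        layers, in_dim = [], 4 * emb_dim
        for h in mlp_dims:
            layers += [nn.Linear(in_dim, h), nn.ReLU()]
            in_dim = h
        self.mlp = nn.Sequential(*layers)
        # Final layer combining the low-order cross and the MLP output.
        self.out = nn.Linear(emb_dim + mlp_dims[-1], 1)

    def forward(self, user_ids, item_ids, user_feedback, item_feedback):
        # Fuse ID and implicit-feedback representations (fusion by sum is an
        # assumption made for this sketch).
        u = self.user_id_emb(user_ids) + self.user_fb_proj(user_feedback)
        v = self.item_id_emb(item_ids) + self.item_fb_proj(item_feedback)
        low_order = u * v  # element-wise feature cross (low-order interaction)
        high_order = self.mlp(torch.cat(
            [self.user_id_emb(user_ids), self.user_fb_proj(user_feedback),
             self.item_id_emb(item_ids), self.item_fb_proj(item_feedback)],
            dim=-1))
        return self.out(torch.cat([low_order, high_order], dim=-1)).squeeze(-1)
```

For a rating task, the scalar output would typically be trained against observed ratings with a squared-error loss; the implicit-feedback inputs are binary interaction vectors, so they remain usable even when explicit rating values are sparse.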
Classification: TP391 [Automation and Computer Technology / Computer Application Technology]