Sequential recommendation model based on dynamic mask and multi-pair contrastive learning


Authors: ZHENG Shun; WANG Shaoqing [1]; LIU Yufang; LI Keke; SUN Fuzhen [1] (School of Computer Science and Technology, Shandong University of Technology, Zibo 255000, Shandong, China)

Affiliation: [1] School of Computer Science and Technology, Shandong University of Technology, Zibo 255000, Shandong, China

Source: Journal of Shandong University (Engineering Science), 2023, No. 6, pp. 47-55 (9 pages)

Funding: Natural Science Foundation of Shandong Province (ZR2020MF147, ZR2021MF017); Youth Innovation Team Science and Technology Program of Higher Education Institutions of Shandong Province (2021KJ031).

Abstract: The masking procedure of the BERT (bidirectional encoder representations from transformers) encoder suffers from three problems: masking artificially introduces noise, a mask ratio that is too small cannot cover the items in short interaction sequences, and a mask ratio that is too large makes the model difficult to train. To solve them, this study proposed a contrastive learning method that changes the BERT encoder's masking scheme and provides the model with three types of learning samples, so that the model imitates the human learning process during training and thereby achieves better results. The proposed algorithm was compared against baseline models on three public datasets and generally outperformed them; on the MovieLens-1M dataset, the HR@5 and NDCG@5 metrics improved by 9.68% and 10.55%, respectively. These results show that the modified masking scheme and the new contrastive learning method effectively improve the encoding accuracy of the BERT encoder and, in turn, the accuracy of the recommendations.
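The abstract names two mechanisms, a length-aware dynamic mask and a contrastive objective over several masked views of the same sequence, but the record gives no implementation details. The sketch below is a minimal PyTorch illustration of how such a scheme could look; the mask token id, the three easy/medium/hard mask ratios, and the names `dynamic_mask`, `three_views`, and `multi_pair_info_nce` are all assumptions made for illustration, not the paper's actual code.

```python
# Minimal sketch (assumptions, not the paper's code): length-aware dynamic
# masking that always hides at least one item of a short sequence, plus a
# multi-pair InfoNCE loss in which every masked view of the same user is a
# positive and the views of other users in the batch are negatives.
import torch
import torch.nn.functional as F

MASK_TOKEN = 0            # hypothetical id reserved for the [mask] token
RATIOS = (0.2, 0.4, 0.6)  # hypothetical easy / medium / hard mask ratios


def dynamic_mask(seq: torch.Tensor, ratio: float) -> torch.Tensor:
    """Mask a 1-D item-id sequence, adapting the mask count to its length.

    At least one position is always masked (so short sequences are still
    covered) and at least one is always kept (so the model stays trainable).
    """
    out = seq.clone()
    n = out.numel()
    k = max(1, min(n - 1, round(n * ratio)))
    out[torch.randperm(n)[:k]] = MASK_TOKEN
    return out


def three_views(seq: torch.Tensor) -> list[torch.Tensor]:
    """Build the three types of learning samples (easy, medium, hard)."""
    return [dynamic_mask(seq, r) for r in RATIOS]


def multi_pair_info_nce(views: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """Multi-pair contrastive loss.

    views: (V, B, D) encoder outputs, V views per user in a batch of B users.
    Every pair of views of the same user is a positive pair; views of the
    other users serve as negatives.
    """
    v, b, d = views.shape
    z = F.normalize(views.reshape(v * b, d), dim=-1)
    sim = z @ z.t() / temperature                        # (V*B, V*B) similarities
    user = torch.arange(b, device=z.device).repeat(v)    # user id of each row
    self_mask = torch.eye(v * b, dtype=torch.bool, device=z.device)
    pos = (user.unsqueeze(0) == user.unsqueeze(1)) & ~self_mask
    sim = sim.masked_fill(self_mask, float("-inf"))      # drop self-similarity
    log_prob = sim - sim.logsumexp(dim=1, keepdim=True)
    # average log-probability over each row's positive pairs
    loss = -log_prob.masked_fill(~pos, 0.0).sum(1) / pos.sum(1)
    return loss.mean()


if __name__ == "__main__":
    encoder = torch.nn.Embedding(100, 16)       # stand-in for the BERT encoder
    batch = torch.randint(1, 100, (8, 20))      # 8 users, 20 interactions each
    views = torch.stack([
        torch.stack([dynamic_mask(s, r) for s in batch]) for r in RATIOS
    ])                                          # (3, 8, 20) masked views
    reps = encoder(views).mean(dim=2)           # (3, 8, 16) sequence embeddings
    print(multi_pair_info_nce(reps).item())
```

In a full training loop one would encode each view with the shared BERT encoder, stack the sequence representations into the (V, B, D) tensor, and add this loss to the usual cloze-style item-prediction loss; the 0.2/0.4/0.6 ratios are placeholders for whatever easy-to-hard schedule the paper actually uses.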

Keywords: self-attention; cloze task; sequential recommendation; contrastive learning; dynamic mask; recommender system

CLC number: TP399 (Automation and Computer Technology: Computer Application Technology)

 
