Lightweight and denoising self-attentive sequential recommendation using preference editing


Authors: YANG Xing-yao[1]; ZHONG Zhi-qiang; YU Jiong[1]; LI Zi-yang; ZHANG Shao-dong; DANG Zi-bo (School of Software, Xinjiang University, Urumqi 830008, China)

Affiliation: [1] School of Software, Xinjiang University, Urumqi 830008, Xinjiang, China

Source: Computer Engineering and Design, 2024, No. 10, pp. 2953-2959 (7 pages)

Funding: National Natural Science Foundation of China (62262064, 61862060); Education Department Foundation of Xinjiang Uygur Autonomous Region (XJEDU2016S035); Doctoral Research Startup Foundation of Xinjiang University (BS150257); Natural Science Foundation of Xinjiang Uygur Autonomous Region (2022D01C56)

Abstract: In self-attentive sequential recommendation, beyond the heavy memory consumption caused by the item embedding matrix and the noise introduced by irrelevant information in the self-attention layers, a key problem is how to accurately extract and represent user preferences when user behavior data are sparse. To address these problems, a lightweight and denoising self-attentive sequential recommendation model using preference editing (LDSR-PE) was proposed. A context-aware dynamic embedding composition scheme was adopted to alleviate the memory consumption problem, and a trainable binary mask was attached to each self-attention layer to adaptively prune irrelevant noisy items. To train the model better, a self-supervised learning strategy based on preference editing was designed, which forces the sequential recommendation model to discriminate between common and unique preferences across different interaction sequences. Experimental results on three public datasets show that LDSR-PE outperforms mainstream state-of-the-art recommendation models.
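The binary-mask denoising described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: all names, shapes, and the toy data are assumptions, the attention is single-head, and the mask is fixed here, whereas in the paper it is trainable and attached to each self-attention layer.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def denoised_self_attention(X, Wq, Wk, Wv, binary_mask):
    """Single-head self-attention over an interaction sequence X of shape
    (L, d). A binary mask of shape (L, L) zeroes out attention entries
    judged irrelevant, and the surviving weights are renormalized."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])    # scaled dot-product scores
    attn = softmax(scores, axis=-1)            # dense attention weights
    attn = attn * binary_mask                  # prune noisy item-to-item links
    attn = attn / (attn.sum(axis=-1, keepdims=True) + 1e-9)  # renormalize
    return attn @ V, attn

# Toy example: a 4-item sequence with 8-dim embeddings; the mask drops
# one hypothetical noisy link (item 0 for the last position).
rng = np.random.default_rng(0)
L, d = 4, 8
X = rng.standard_normal((L, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
mask = np.ones((L, L))
mask[3, 0] = 0.0
out, attn = denoised_self_attention(X, Wq, Wk, Wv, mask)
```

In the actual model the mask would be learned jointly with the network (binary masks are typically trained through a continuous relaxation), so which links count as noise is decided by the data rather than fixed by hand.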

Keywords: sequential recommendation; preference editing; embedding composition; self-attention mechanism; self-supervised learning; data sparsity; deep neural network

Classification: TP391 [Automation and Computer Technology - Computer Application Technology]

 
