Research on Book Label Classification with a GCNN Model Improved by a Self-Attention Mechanism


Author: Zhang Jian [1]

Affiliation: [1] School of Management, University of Shanghai for Science and Technology, Shanghai

Source: Modeling and Simulation (《建模与仿真》), 2024, No. 2, pp. 1322-1332 (11 pages)

Abstract: Convolutional neural networks focus on local features and are not sufficient to capture long-range dependencies in text. To address this problem, this paper proposes a dual-channel book label classification model that combines a CNN with a self-attention mechanism (Gated Convolutional Neural Network with Self-Attention Mechanism, GCNN-SAM). The model uses skip-gram to embed words into dense, low-dimensional vectors, producing a text embedding matrix that is fed in parallel into a gated convolutional neural network channel and a self-attention channel; the features produced by the two feature extraction channels are then fused by a pointwise convolution and used for book label classification. In comparative experiments on the Fudan University Chinese text classification dataset, the model reaches a test-set accuracy of 96.21%, outperforming SCNN, GCNN, and other improved models, which demonstrates the advantage of GCNN-SAM for book label classification. Ablation experiments further verify the effectiveness of GCNN-SAM: it improves classification accuracy by 5.9%, 3.19%, and 3.66% over CNN, GCNN, and CNN-SAM, respectively.
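To make the dual-channel design described in the abstract concrete, the following is a minimal sketch in PyTorch. The layer sizes, the trainable embedding layer standing in for the skip-gram vectors, the max-pooling step, and the class count are illustrative assumptions, not the paper's reported configuration. The sketch shows one gated-convolution channel and one self-attention channel over the same embedding matrix, fused by a pointwise (1x1) convolution before classification.

```python
# Minimal sketch of the dual-channel GCNN-SAM idea described in the abstract.
# All hyperparameters and the fusion details are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedConvBlock(nn.Module):
    """Gated convolution: the conv output is gated by a sigmoid branch."""
    def __init__(self, embed_dim, channels, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        self.conv = nn.Conv1d(embed_dim, channels, kernel_size, padding=pad)
        self.gate = nn.Conv1d(embed_dim, channels, kernel_size, padding=pad)

    def forward(self, x):            # x: (batch, embed_dim, seq_len)
        return self.conv(x) * torch.sigmoid(self.gate(x))


class GCNNSAM(nn.Module):
    """Dual-channel model: gated CNN channel + self-attention channel,
    fused by a pointwise (1x1) convolution before the classifier."""
    def __init__(self, vocab_size, num_classes, embed_dim=128,
                 channels=128, num_heads=4):
        super().__init__()
        # The paper uses skip-gram embeddings; a trainable embedding layer
        # stands in for them here.
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.gcnn = GatedConvBlock(embed_dim, channels)
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # Pointwise convolution fuses the concatenated channel features.
        self.fuse = nn.Conv1d(channels + embed_dim, channels, kernel_size=1)
        self.classifier = nn.Linear(channels, num_classes)

    def forward(self, token_ids):    # token_ids: (batch, seq_len)
        emb = self.embedding(token_ids)             # (batch, seq_len, embed_dim)
        conv_feat = self.gcnn(emb.transpose(1, 2))  # (batch, channels, seq_len)
        attn_feat, _ = self.attn(emb, emb, emb)     # (batch, seq_len, embed_dim)
        fused = self.fuse(torch.cat([conv_feat, attn_feat.transpose(1, 2)], dim=1))
        pooled = F.adaptive_max_pool1d(fused, 1).squeeze(-1)  # (batch, channels)
        return self.classifier(pooled)


# Example usage with an assumed vocabulary size and 20 classes.
model = GCNNSAM(vocab_size=50000, num_classes=20)
logits = model(torch.randint(1, 50000, (8, 256)))  # batch of 8, seq_len 256
print(logits.shape)  # torch.Size([8, 20])
```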

Keywords: book label classification; gated convolutional neural network; self-attention mechanism; dual-channel

CLC Number: TP3 [Automation and Computer Technology / Computer Science and Technology]

 
